DRAFT
This should be easy, right?
time charging (hours) x charge rate (A) = Ah replaced in the bank
We know the time part because we're the ones driving. But the charge rate part can be hard to predict with any certainty; this is especially true for combiner (“split charger”) charging.
In most cases the battery itself will be the limiting factor; it wants what it wants.
These are common maximum charge rates for batteries discharged to their deepest normal state of charge:
Since current is a function of C, bigger banks will pull more current than small ones. At 0.2C a 100Ah bank will pull 20A and a 200Ah bank 40A.
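The C-rate arithmetic above can be sketched in a couple of lines (the function name here is just for illustration):

```python
def charge_current(capacity_ah, c_rate):
    """Current (A) a bank pulls at a given C-rate: current = capacity x C."""
    return capacity_ah * c_rate

# Examples from the text, at 0.2C:
print(charge_current(100, 0.2))  # 100Ah bank pulls 20.0 A
print(charge_current(200, 0.2))  # 200Ah bank pulls 40.0 A
```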
Banks will pull more current at lower states of charge and less current at higher states of charge. This affects combiners more than DC-DC chargers (see below).
Sometimes other factors will intervene:
DC-DC charging rates are more predictable because of how they work. A 20A DC-DC will likely pump 20A into the bank most of the time. So 1/2 hour driving x 20A = 10Ah returned to the battery bank. The price of this predictability (and other features) is… price.
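Because a DC-DC charger holds its rated output, the Ah-returned math is just the formula from the top of this section (names are illustrative):

```python
def ah_returned(hours, charger_amps):
    """Ah pumped into the bank: drive time (h) x charger output (A)."""
    return hours * charger_amps

# The worked example from the text: a half-hour drive with a 20A DC-DC
print(ah_returned(0.5, 20))  # 10.0 Ah returned to the bank
```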
If you need predictable charging then DC-DC is likely the answer. There are other scenarios where DC-DC is effectively required.
This is the tough one. We know roughly where the charge rate will start out for discharged batteries (≤0.33C), but the taper complicates things. All the batteries would eventually charge to full given enough time, so shorter charging periods are what interest us. Let's assume a short drive = 30 minutes.
NOTE: this section contains guesswork from observations and theory. You will learn how your particular setup behaves in actual use.
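In that spirit of guesswork, here is a rough sketch of a combiner estimate. The taper curve below (full acceptance up to about 2/3 state of charge, then a linear falloff) is an assumption made up for illustration, not measured behavior; only the ≤0.33C starting cap comes from the text.

```python
def combiner_estimate(capacity_ah, start_soc_pct, minutes, max_c=0.33):
    """Very rough Ah-returned estimate for combiner charging with taper.

    Assumed model: acceptance current falls linearly as the bank fills,
    capped at max_c. The curve is a guess, not data.
    """
    soc = start_soc_pct
    total_ah = 0.0
    for _ in range(minutes):
        # guessed taper: acceptance tracks distance from full, capped at max_c
        accept_c = min(max_c, (100 - soc) / 100)
        ah = accept_c * capacity_ah / 60.0  # Ah added this minute
        total_ah += ah
        soc += ah / capacity_ah * 100
    return round(total_ah, 1)

# 30-minute drive, 100Ah bank starting at 50% SoC
print(combiner_estimate(100, 50, 30))  # ~16.5 Ah under these guessed assumptions
```

Under this model the bank stays below the taper knee for the whole half hour, so it pulls roughly 0.33C throughout; a bank starting nearer full would return noticeably less.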
In most installs this variability isn't a major issue: