[Note: this article is written from the standpoint of the solar charge controller. It affects all circuits to some degree and is not limited to solar. – secessus]
In typical installs the charge controller doesn't know the actual battery voltage; it only knows what it sees on its own BATTERY terminals where the wires are connected. Wire size, junctions, current, etc., can make this voltage reading inaccurate, which can throw off charging, which in turn can affect battery performance and/or longevity.
Extreme example: during heavy charging the controller might see 14v while the battery is actually at 13v. This 14v → 13v difference is called voltage sag. A similar thing happens in reverse when consuming power: the battery might be at 12.5v but an inverter running a big load might “see” 11.5v at its own terminals and shut off prematurely. Sag also makes it difficult to assess state of charge of lead batteries by voltage.
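The sag in the example above is just Ohm's law (V = I × R) applied to the wiring run. Here is a minimal sketch; the wire gauge, current, and run length are illustrative assumptions, not values from the article:

```python
# Sketch: estimate voltage sag across charge wiring using Ohm's law (V = I * R).
# All numbers below are illustrative assumptions.

AWG10_OHMS_PER_M = 0.00328  # approx. resistance of 10 AWG copper at 20°C


def sag_volts(current_a: float, one_way_length_m: float,
              ohms_per_m: float = AWG10_OHMS_PER_M) -> float:
    """Voltage lost across the round trip (positive + negative runs)."""
    round_trip_resistance = 2 * one_way_length_m * ohms_per_m
    return current_a * round_trip_resistance


# 30A of charge current over a 5m one-way run of 10 AWG:
print(round(sag_volts(30, 5), 2))  # 0.98 -- the controller reads ~1v high
```

Note that sag scales with both current and length, which is why heavy charging makes the mismatch worse and why short, fat wire runs help.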
Sag presents special challenges when charging lithium battery banks, since the majority of their usable capacity falls in a very narrow range only about 0.5v wide. Voltage sag when charging lithium can result in unpredictable or inconsistent charging.
This is the majority position, since most are unaware of the issue or don't judge it to be important. And depending on the install it may not be a problem.
A shorter wiring run will have less sag than a longer one. For this reason, charging sources are mounted as close to the battery bank as possible.
Once the length of the circuit is established, voltage drop along the circuit can be minimized or compensated for:
use a separate voltage sensing circuit (just a pair of small wires to the battery) on controllers that have this feature. Since the sense wires carry no real current their reading isn't thrown off by sag
use a networked shunt: a controller that talks to a shunt mounted at the battery gets its voltage reading from the battery itself rather than from its own terminals
use a voltage calibration setting, on controllers that have one. Sag varies with current, so the user might need to figure out the average sag in their use case.
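The calibration approach in the last item above can be sketched as follows: take paired readings at the controller terminals and at the battery posts under typical charging current, then average the difference. The sample readings are hypothetical:

```python
# Sketch: derive an average calibration offset from paired voltage readings
# taken at the controller terminals and at the battery posts under typical
# charging current. The sample readings below are hypothetical.

readings = [
    # (volts_at_controller, volts_at_battery)
    (14.1, 13.4),
    (14.0, 13.3),
    (13.9, 13.3),
]

offset = sum(c - b for c, b in readings) / len(readings)
print(f"enter a calibration offset of about -{offset:.2f}v")
```

Because sag varies with current, an offset derived this way is only accurate near the current at which the readings were taken; it is a rough correction, not a substitute for voltage sensing at the battery.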