[Note: this article is written from the standpoint of the solar charge controller. It affects all circuits to some degree and is not limited to solar. – secessus]
In typical installs the charge controller doesn't know the actual battery voltage; it only knows what it sees on its own BATTERY terminals where the wires are connected. Wire size, junctions, current, etc., can make this voltage reading inaccurate, which can throw off charging and in turn affect battery performance and/or longevity.
Extreme example: during heavy charging the controller might see 14v at its terminals while the battery is actually at 13v. This 14v → 13v difference is called voltage sag. A similar thing happens in reverse when consuming power: the battery might be at 12.5v but an inverter running a big load might “see” 11.5v at its own terminals and shut off prematurely.1) Sag also makes it difficult to assess the state of charge of lead batteries by voltage.
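The examples above are just Ohm's law at work: the controller (or inverter) sees the battery voltage plus or minus the drop across the wiring. A minimal sketch, with hypothetical function names and made-up numbers matching the examples:

```python
# Illustration of how circuit resistance produces the sag described above.
# terminal_voltage() is a made-up helper; values are examples, not measurements.

def terminal_voltage(battery_v, current_a, circuit_ohms):
    """Voltage a device sees at its own terminals.

    Positive current = charging (device sees battery voltage PLUS the
    wiring drop); negative current = discharging (battery MINUS the drop).
    """
    return battery_v + current_a * circuit_ohms

# Charging at 50A through 20 milliohms of total wiring/junction resistance:
print(round(terminal_voltage(13.0, 50, 0.020), 2))   # 14.0 -- battery is really at 13.0v

# Discharging: an inverter pulling 50A from a 12.5v battery:
print(round(terminal_voltage(12.5, -50, 0.020), 2))  # 11.5 -- low enough to trip a cutoff
```

The same 1v of sag appears in both directions; only the sign flips with the direction of current flow.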
Sag presents special challenges when charging lithium battery banks. Since the majority of lithium's usable capacity is found in an extremely narrow voltage range2) sag can result in wildly unpredictable/inconsistent charging.
Ignoring sag is the majority position, since most are unaware of the issue or don't judge it to be important. And it may not be a problem in practice: sag is proportional to current, so as charging current tapers off toward full charge the sag shrinks and the battery eventually sees the full setpoint voltage.
All other things being equal, a shorter wiring run will have less sag than a longer run. For this reason charging sources are mounted as close to the battery bank as possible.
Once the length of the circuit is established, voltage drop along it can be minimized by using thicker wire and by keeping junctions, connectors, and other sources of resistance to a minimum.
If your sag is consistent you could adjust your setpoints upwards. If you have 0.2v of sag and want 14.0v at the battery you would configure the setpoint to 14.2v.
Or if your battery manufacturer offers a range of charging voltages like 14.0v - 14.4v you could configure 14.4v and know the voltage at the battery will be ≤14.4v in real conditions.
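The setpoint math above can be sketched as a tiny helper. This is illustrative only; the function name and the idea of clamping to the manufacturer's maximum are mine:

```python
# Sketch of adjusting a charge setpoint upward to compensate for known sag.

def compensated_setpoint(target_v, sag_v, max_allowed_v=None):
    """Raise the configured setpoint by the observed sag; optionally clamp
    so the voltage actually reaching the battery never exceeds the
    manufacturer's maximum."""
    setpoint = target_v + sag_v
    if max_allowed_v is not None:
        setpoint = min(setpoint, max_allowed_v)
    return setpoint

# Want 14.0v at the battery, see 0.2v of sag -> configure 14.2v:
print(round(compensated_setpoint(14.0, 0.2), 2))        # 14.2
# Heavier sag, but the maker's range tops out at 14.4v -> clamp:
print(round(compensated_setpoint(14.0, 0.6, 14.4), 2))  # 14.4
```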
A voltage sense wire is a separate wiring circuit running from the battery bank to the controller's voltage sense terminals (if so equipped).
Since the circuit isn't carrying any real current there will be very little sag and even quite thin wires can be used to get an accurate voltage reading. The controller will then adjust output voltage to put the desired setpoint voltage into the bank.
Example: The absorption setpoint (Vabs) is 14.0. The sensed battery voltage is only 13.8v, so the controller actually puts out 14.2v.3)
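A controller with a sense wire is effectively running a feedback loop: it nudges its output up or down by the error between the setpoint and the sensed battery voltage. Real controllers use proper control loops; this toy sketch (function name mine) just shows one correction step from the example above:

```python
# Toy sketch of sense-wire compensation: one correction step.

def corrected_output(v_setpoint, v_sensed_at_battery, v_current_output):
    """Adjust controller output by the error between the setpoint and the
    voltage actually sensed at the battery terminals."""
    return v_current_output + (v_setpoint - v_sensed_at_battery)

# Vabs = 14.0, sense wire reports 13.8 at the battery, output was 14.0:
print(round(corrected_output(14.0, 13.8, 14.0), 2))  # 14.2
```

Because the correction is driven by the sensed voltage rather than a fixed offset, it tracks sag even as current (and therefore sag) changes.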
Some controllers use networked battery monitors to report back what the battery voltage is in reality.
The controller may also have a voltage calibration setpoint. If your sag is consistently -0.2v you could input that calibration: you configure 14.0v or whatever you want and the controller increases its output voltage by 0.2v.
Unfortunately, sag varies with current, so the user might need to figure out the average sag in their use case.
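Since sag is roughly proportional to current (Ohm's law again), one way to pick a calibration value is to log a few (current, sag) observations, estimate the circuit resistance, and compute the expected sag at your typical charge current. A minimal sketch with hypothetical function names and made-up sample data:

```python
# Estimate circuit resistance from logged (amps, observed sag) pairs,
# then predict sag at a typical charge current. Sample data is hypothetical.

def estimate_resistance(samples):
    """Least-squares fit of sag = current * R, forced through the origin."""
    num = sum(amps * sag for amps, sag in samples)
    den = sum(amps * amps for amps, _ in samples)
    return num / den

samples = [(10, 0.05), (20, 0.11), (40, 0.20)]  # (amps, volts of sag)
r = estimate_resistance(samples)

# Expected sag at a typical 30A charge current -- a candidate calibration value:
print(round(r * 30, 2))  # 0.15
```

This is still an approximation (connector resistance can change with temperature and corrosion), but it beats eyeballing a single reading taken at an arbitrary current.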