I haven’t had any luck finding a clear explanation of volts vs. amps as they relate to charging. I understand the formula for wattage, and perhaps I’m looking for a deeper answer than that, so maybe someone could elaborate. Here are some numbers to work from:

Nominal voltage is 12.7 V, float is 13.7 V, and bulk is 14.4 V. Let’s go with a 190 Ah battery, with bulk charging at 10% of capacity (19 A), or about 274 W. Put the same amperage in at float voltage and you get about 260 W. That’s a negligible change in power, yet voltage is the only metric used when talking about charging a battery. Let’s say I use 20 A at 13.7 V (instead of 19 A at 14.4 V), which yields the same ~274 W.
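The arithmetic above can be checked quickly with a throwaway sketch (the 190 Ah capacity, voltages, and 10% charge rate are the figures from the question, not standard values):

```python
# Compare charging power for the three scenarios in the question.
CAPACITY_AH = 190
BULK_V = 14.4    # bulk/absorption voltage
FLOAT_V = 13.7   # float voltage

bulk_a = round(0.10 * CAPACITY_AH)   # 10% charge rate -> 19 A

p_bulk = bulk_a * BULK_V             # 19 A @ 14.4 V
p_float_same_a = bulk_a * FLOAT_V    # 19 A @ 13.7 V
p_float_20a = 20 * FLOAT_V           # 20 A @ 13.7 V

print(round(p_bulk, 1), round(p_float_same_a, 1), round(p_float_20a, 1))
```

Running this prints roughly 273.6, 260.3, and 274.0 watts, matching the ~274 W / ~260 W figures quoted above.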

The power being transferred into the battery is the same. I can’t imagine 1 A makes that much of a difference in heat transfer that it has spawned a whole industry’s worth of theories and technologies geared toward optimally charging a battery.

Why is amperage left out of the dialogue almost completely when it comes to discussing charging a battery?

## Comment