Hi,
Can anyone help clarify whether what I'm experiencing is an equipment fault, a cable-length issue, or something else?
- 120W Portable Panel with regulator/controller on Panel Itself
- AGM Battery in Car
- 10m cable length from regulator to battery (have tried 5m also)
The issue: the regulator has an LCD that displays the battery voltage (and also lets you input custom absorb and float voltages). The voltage reading on the regulator is out by up to 1.0 V. For example, with my trusted voltmeter and watt meter attached 10 cm from the battery, I get the following readings:
Battery Only Connected to Panel (All looks good):
- Voltmeter (0cm) = 12.80
- Wattmeter (10cm) = 12.79
- Regulator (10m) = 12.7
Then once connected to solar as well and charging starts (absorb example, with the regulator set to 14.4):
- Voltmeter (0cm) = 13.40
- Wattmeter (10cm) = 13.39
- Regulator (10m) = 14.40
The regulator's battery reading is out by up to 1.0 V. To get the regulator to actually bring the battery up to 14.4 in this example, I have to set the absorb value to 15.4 (which gives 14.4 at the battery terminals).
So is the regulator's battery sensor/readout most likely just inaccurate while charging, or is it down to the cable length? I'm mainly confused because, if it were cable length / voltage drop, wouldn't the regulator read the battery *lower* than the terminals rather than higher?
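For reference, here's my rough back-of-envelope estimate of the resistive drop I'd expect over the 10 m run. The wire gauge (2.5 mm² copper) and charging current (~6 A) are guesses on my part, as I haven't measured either precisely:

```python
# Rough estimate of voltage drop along the regulator-to-battery cable.
# Assumptions (not measured): 2.5 mm^2 copper conductors, ~6 A charging current.

RESISTIVITY_CU = 1.68e-8  # ohm*m, copper at ~20 degrees C
length_m = 10.0           # one-way cable length
area_m2 = 2.5e-6          # 2.5 mm^2 cross-sectional area
current_a = 6.0           # assumed charging current

# Current travels out and back, so the resistive path is twice the cable length.
resistance_ohm = RESISTIVITY_CU * (2 * length_m) / area_m2
drop_v = current_a * resistance_ohm

print(f"Loop resistance: {resistance_ohm:.3f} ohm")  # about 0.134 ohm
print(f"Voltage drop at {current_a:.0f} A: {drop_v:.2f} V")  # about 0.81 V
```

If those assumptions are close, a drop of around 0.8 V would roughly match the 1.0 V discrepancy I'm seeing, which is part of why I can't tell whether this is drop or a faulty readout.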
Thanks.