Newbie Help... Controller Battery Voltage Reading

  • Quozl
    Junior Member
    • Mar 2016
    • 7

    Newbie Help... Controller Battery Voltage Reading

    Hi,

    Can anyone help clarify if what I am experiencing is an equipment fault or cable length issue or possibly something else?

    - 120W Portable Panel with regulator/controller on Panel Itself
    - AGM Battery in Car
    - 10m cable length from regulator to battery (have tried 5m also)

    The issue is that the regulator has an LCD displaying the battery voltage (and also accepts custom Absorb and Float voltage settings). The voltage reading on the regulator is out by up to 1.0v. For example, with my trusted voltmeter and a watt meter attached 10cm from the battery, the following readings are taken:

    Battery Only Connected to Panel (All looks good):
    - Voltmeter (0cm) = 12.80
    - Wattmeter (10cm) = 12.79
    - Regulator (10m) = 12.7

    Then once connected to solar power as well and charging starts (Absorb example with the regulator set to 14.4):
    - Voltmeter (0cm) = 13.40
    - Wattmeter (10cm) = 13.39
    - Regulator (10m) = 14.40

    The regulator battery reading is out by up to 1.0v. To get the regulator to actually bring the battery up to 14.4 in this example I have to set the absorb value to 15.4 (and end up with 14.4 at the battery terminals).

    So is the regulator's battery sensor/readout most likely just inaccurate while charging, or is it due to cable length? I'm mainly confused because if it were cable length / voltage drop, wouldn't the regulator read the battery lower than the terminals, instead of higher?

    Thanks.
  • LETitROLL
    Solar Fanatic
    • May 2014
    • 286

    #2
    Originally posted by Quozl
    Can anyone help clarify if what I am experiencing is an equipment fault or cable length issue or possibly something else? ... The regulator battery reading is out by up to 1.0v. ... Confused mainly over if it was cable length / voltage drop wouldn't the regulator readout have the battery reading lower than the terminals instead of higher?
    We can eliminate some of the confusion with math: 10m of 14AWG wire carrying 6A (approx max for most 100-120w panels) will have a voltage drop of about 0.99v, so if you are using medium-diameter wire rather than very large wire, that is almost certainly the issue. You also said you tried 5m wire; were your results exactly the same or slightly different? What size wire are you using?
    Last edited by LETitROLL; 03-20-2016, 10:58 AM.
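    The 0.99v figure above can be reproduced with a couple of lines. This is a sketch, not from the thread: the per-metre resistance is a standard figure for 14 AWG copper, and the factor of 2 accounts for the current flowing out and back through both conductors.

    ```python
    # Round-trip voltage drop over a copper cable run.
    AWG14_OHMS_PER_M = 0.008286  # ohms per metre, 14 AWG copper (standard value, ~20 C)

    def voltage_drop(one_way_m: float, amps: float,
                     ohms_per_m: float = AWG14_OHMS_PER_M) -> float:
        """Drop across both conductors: the current flows out and back."""
        return 2 * one_way_m * ohms_per_m * amps

    print(round(voltage_drop(10, 6), 2))  # ~0.99 V for 10 m of 14 AWG at 6 A
    ```

    Halving the run to 5m halves the drop, which is one quick way to test whether cable loss is the culprit.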


    • LETitROLL
      Solar Fanatic
      • May 2014
      • 286

      #3
      In a common system the controller is near the battery: the controller puts out 14.4v, the short wire to the battery means minimal voltage drop, and the battery gets near 14.4v. In your system, because the controller is with the panel, it puts out 14.4v and then sends it down a long wire with losses, so the battery gets something noticeably less.


      • Sunking
        Solar Fanatic
        • Feb 2010
        • 23301

        #4
        Originally posted by Quozl
        Then once connected to Solar Power as well and charging starts (Absorb example with regulator set to 14.4):
        - Voltmeter (0cm) = 13.40
        - Wattmeter (10cm) = 13.39
        - Regulator (10m) = 14.40
        You already answered your own question; you wrote the answer down. Your watt meter is jibber jabber and meaningless. If you are seeing 14.4 volts measured with a DMM at the controller output, and 13.4 volts on the battery post, you are losing 1 volt in your long, small cables. Anything wrong? Well, yes and no. It is perfectly normal to have voltage loss with current flowing through wire resistance. On the other hand, it is poor workmanship and design to allow a 7% voltage drop from using too small or too long a cable. But once the battery is charged up and no current flows, the voltages will be equal.

        There might be one other possibility: you just do not know how battery chargers work. Being in Absorb with the cut-off voltage set to 14.4 does not mean you will ever see 14.4 volts.
        MSEE, PE


        • Quozl
          Junior Member
          • Mar 2016
          • 7

          #5
          Originally posted by Sunking

          If you are seeing 14.4 volts measured with a DMM at the controller output, and 13.4 volts on the battery post, you are loosing 1 volt on you long small cables.

          There might be one other possibility. you just do not know how battery chargers work. In Absorb, and you set the cut-off voltage to 14.4, does not mean you will ever see 14.4 volts.
          Yes to the first statement, except I am not measuring the output at the controller side with a DMM; it's simply the controller itself, with an inbuilt battery monitor showing the real-time battery voltage on its LCD.

          In regards to Absorb etc., that was just an example... it's exactly the same with 5m of cable, or in Float. Without power connected, the controller's battery monitor matches the DMM battery post readouts, but with power added the controller's battery monitor is 1v higher than the battery posts. For example, if I want Absorb to be at 14.4 and Float at 13.8 at the battery terminals, I need to set the controller to 15.4 and 14.8 respectively to achieve those voltages at the terminals.

          The 1v loss due to cable length makes sense, but it confuses me that the battery sensor on the controller gives a reading 1v higher than the terminals... isn't it meant to be reading the real-time battery voltage, in which case the voltage drop would be from battery to controller 10m away and the reading 1v less??? Or, while charging, is the controller's battery voltage sensor simply indicating its output voltage rather than the actual battery voltage, hence showing the loss in the other direction?


          • Quozl
            Junior Member
            • Mar 2016
            • 7

            #6
            Originally posted by Sunking

            Your wattmetter is jibber jabber and meaningless.
            I also do not understand what you mean by this. The in-line watt meter 10cm from the battery terminals in these tests simply gives me another cross-check of the battery terminal voltage, as placing the DMM on the terminals all the time is a pain. I have found the watt meter's battery voltage readout to be exactly 0.01v below whatever the DMM reads, whether idle or under charge. So the watt meter appears as accurate for reading/tracking battery voltage as the DMM is.


            • Sunking
              Solar Fanatic
              • Feb 2010
              • 23301

              #7
              Let's see if we can get on the same page. If no current is flowing, there is no voltage loss. So if you were to disconnect the panels from the controller (or short them out), with a DMM you would measure the exact same voltage at the controller output terminals and the battery posts. If any other meter reads something different from the DMM, it is in error, assuming the DMM is correct. That is where you should start.

              When current flows, things change. One formula: Voltage = Current x Resistance. Wire has resistance, and that resistance is determined by the size and length of the wire. The smaller the wire, the higher the resistance; the longer the wire, the higher the resistance. Wire resistance is pretty much a fixed value; current is the only thing that changes. So if 0 amps are flowing, 0 volts are dropped along the length of the wire, because 0 x R = 0 every day of the week, where R is the resistance of the wire.

              So let's say the wire resistance is 0.1 Ohms. If you have 10 amps of current flowing, then 10 amps x 0.1 Ohms = 1 volt. So if you had 14.4 volts at the controller output with 10 amps flowing through a wire resistance of 0.1 Ohms, you would see 13.4 volts at the battery. You need to understand that concept, because it tells you exactly what is going on. Ideally you want to keep the voltage loss between the controller and battery to 1% or less, which at 15 volts is 0.15 volts. That does not happen by accident; it has to be designed.

              So you need to determine what the problem is: meter error, or voltage loss. Your DMM will tell you.
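              The worked example in this post can be written out as a quick sanity check (the resistance, current, and voltage numbers are the ones given above):

              ```python
              # V = I * R along the wire run, per the worked example.
              wire_resistance = 0.1   # ohms, total round-trip for the run
              current = 10.0          # amps while charging
              controller_out = 14.4   # volts at the controller output

              drop = current * wire_resistance   # 1.0 V lost along the wire
              battery_v = controller_out - drop  # 13.4 V left at the battery posts
              print(drop, battery_v)
              ```

              The same arithmetic gives the 1% design target: at 15 volts, an acceptable drop is 15 x 0.01 = 0.15 volts.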
              Last edited by Sunking; 03-20-2016, 06:22 PM.
              MSEE, PE


              • Quozl
                Junior Member
                • Mar 2016
                • 7

                #8
                Thanks... I understand the voltage loss concept... and that clarifies why, when not charging, the 3 battery meters are almost the same, while under charge the controller's is much different.

                I am still a little confused about why the controller's battery monitor is showing +1v as opposed to -1v, and about the direction of voltage loss in this case... I'll try to explain, though I don't expect it to make sense. I am really just trying to figure out if the controller's battery voltage monitor is working properly. My understanding, based on the 3 displays/readings for real-time battery voltage, is as follows:

                While No Charge = Battery (DMM 12.80) -> Watt Meter LCD (10cm no big loss 12.79) -> Solar Controller LCD (10m no big loss 12.7)
                Under Charge = Battery (DMM 13.80) -> Watt Meter LCD (10cm no big loss 13.79) -> Solar Controller LCD (10m big loss 14.8*)

                *Hence the confusion about what the controller's battery meter is displaying here... isn't it meant to display the actual real-time battery voltage, not the output voltage? In which case shouldn't it read 12.8, being 1v less, instead of higher, 10m away from the battery? But from what I have learnt here so far I am wrong, and the LCD on the controller is showing its output voltage, or an assumed battery voltage of 14.8, while the loss back down the line only puts 13.8 volts on the battery.

                I get confused because the controller makes its decisions based on what voltage it thinks the battery is actually at. If it is 10m away with a 1v loss down the line, I reason it would sense the battery at 1v less than it actually is, not 1v higher.

                Note I am not confused at all that, while charging, the loss happens at the battery end (Output -> Loss -> Battery)... it's just why the battery voltage monitor on the controller reads 1v higher than the actual battery voltage (hence it is either faulty or useless under charge, or it is showing output voltage rather than an actual reading from the battery end).
                Last edited by Quozl; 03-20-2016, 07:30 PM.


                • Quozl
                  Junior Member
                  • Mar 2016
                  • 7

                  #9
                  Attached is a sample LCD display from the manual for the controller. In the attached example, while charging, the Battery value on the LCD will be, say, 14.4, but the actual real-time battery voltage at the terminals by DMM is 13.4. If it is a real-time battery status, then shouldn't the controller's LCD be either accurate or 1v lower (due to 10m cable loss), not 1v higher? Isn't it trying to get a real-time battery status from 10m away? Or is it not showing real-time battery status at all, but really showing its output and an assumed battery status (not accounting for loss) while charging? But then that makes no sense, as it must be constantly reading the actual battery voltage to make any sensible decisions about its output, so why is it reading/displaying 1v higher than the battery actually is when it is 10m (or 5m) away from the battery?

                  So I am having to set the controller's Absorb and Float values 1v higher than reality to have it work like I want: if I want Absorb mode to cut off at 14.4 I need to set the controller to 15.4, and if I want it to Float at 13.8 I need to set it to 14.8, because it is always reading the battery voltage as 1v higher than it actually is at the terminals and making its decisions off that reading. I would have thought that with a 1v loss over the 10m cable the controller would read 1v less, and I would need to set the Absorb and Float values 1v lower, not higher. Controller LCD.jpg
                  Last edited by Quozl; 03-20-2016, 08:16 PM.


                  • LETitROLL
                    Solar Fanatic
                    • May 2014
                    • 286

                    #10
                    Originally posted by Quozl
                    isn't it meant to display the actual real time battery voltage and not the output voltage.
                    It looks to me like everything is working okay. Think of it this way: when you hook the controller output to the battery, you are creating a circuit, and all the meter is meant to do is show the voltage present on that circuit, which, as SK explained, varies. When charging is active, current is flowing, so the voltage drop is real and the actual battery voltage will be lower. As the battery becomes charged its voltage rises until very little current flows (it is basically becoming more equal to the charging source). It sounds like the controller is showing you the maximum available voltage, and of course your meter(s), taking readings in a live circuit, show the actual real-time voltages, which are less due to voltage drop and battery S.O.C.
                    Last edited by LETitROLL; 03-20-2016, 08:26 PM.


                    • Sunking
                      Solar Fanatic
                      • Feb 2010
                      • 23301

                      #11
                      When you are talking about voltage drop, sag, or loss under load with current flowing, the source will always be higher. In this case the controller is the source. So if you are losing 1 volt in the wiring between the controller and battery, you will see something like 14 volts at the controller output and 13 volts at the battery.

                      It is real easy to determine whether your controller's meter is right with your DMM: just measure the voltage at the output terminals of the controller with the DMM. The two should agree whether the battery is charging or not.

                      Now, having said that, losing 1 volt under charge means you have a problem with your wiring: it is too small for the length. You should not be seeing more than 0.1 volt of difference, and you are seeing 10 times that much. A 120 watt panel does not deliver very much current, no more than 9 to 10 amps.
                      Last edited by Sunking; 03-20-2016, 11:23 PM.
                      MSEE, PE


                      • sensij
                        Solar Fanatic
                        • Sep 2014
                        • 5074

                        #12
                        Originally posted by Quozl
                        So I am having to set the controllers absorb and float values 1v higher than reality to have it work like I want it to... if I want absorb mode to cut off at 14.4 then i need to set the controller to 15.4 and if i want it to float at 13.8 then i need to set the controller to 14.8 because its always reading the battery voltage as 1v higher than it actually is at the battery terminals and making its decisions off that 1v higher reading. I would have thought if there is a 1v loss due to the 10m cable the controller would be reading 1v less and I would need to set the controllers absorb and float values to 1v lower not higher.
                        There is another important point you are missing here... the voltage loss is proportional to the current. In the bulk stage, as pictured on the LCD, the loss might be 1 V. By the time you get deep into absorb, and definitely in float, the charge current will be much less, and so will the voltage loss. It is dangerous to set the charge controller to a higher voltage to compensate for the loss, because although it will then transition from bulk to absorb correctly, by the time absorb is finished you will be hitting the battery with a higher voltage than you intend.

                        Replace the 10m wire with something heavier, appropriate for the current.
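                        The danger described here can be sketched in a few lines. The wire resistance and per-stage currents below are illustrative assumptions, not measurements from the thread:

                        ```python
                        # With the setpoint raised by the bulk-stage drop, the battery
                        # sees too much voltage once the charge current tapers off.
                        R = 0.1            # ohms, round-trip wire resistance (assumed)
                        setpoint = 15.4    # controller setting, raised 1 V to compensate

                        for stage, amps in [("bulk", 10.0), ("absorb (late)", 2.0), ("float", 0.5)]:
                            battery_v = setpoint - amps * R
                            print(f"{stage:14s} {amps:4.1f} A -> {battery_v:.2f} V at battery")
                        ```

                        At 10 A the battery sees the intended 14.4 V, but as the current falls toward zero the terminal voltage creeps toward the full 15.4 V setpoint.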
                        CS6P-260P/SE3000 - http://tiny.cc/ed5ozx


                        • Quozl
                          Junior Member
                          • Mar 2016
                          • 7

                          #13
                          Thanks for the help... I'm in the process of moving the controller close to the battery and running the 5m-10m cables on the panel side instead. Hopefully this resolves the issue.


                          • Sunking
                            Solar Fanatic
                            • Feb 2010
                            • 23301

                            #14
                            Originally posted by Quozl
                            Thanks for the help... I'm in the process of moving the controller close to the battery and running the 5m-10m cables on the panel side instead. Hopefully this resolves the issue.
                            What size is the wire? The one-way distance between controller and battery is generally 1 to 2 meters. A 120 watt panel delivers at best 9 to 10 amps of charge current, and at 2 meters one-way that only requires 3mm copper cable, or #12 AWG. At 10 meters it requires 5mm copper, or #4 AWG.
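                            A quick illustration of why the controller-to-battery run should be short: the round-trip drop in 12 AWG copper at the panel's maximum ~10 A, for a few run lengths. The per-metre resistance is a standard figure for 12 AWG copper (my assumption, not from the post):

                            ```python
                            AWG12_OHMS_PER_M = 0.00521  # ohms per metre, 12 AWG copper (standard value)

                            for one_way_m in (1, 2, 5, 10):
                                # Factor of 2: the current flows out and back.
                                drop = 2 * one_way_m * AWG12_OHMS_PER_M * 10
                                print(f"{one_way_m:2d} m one-way -> {drop:.2f} V drop at 10 A")
                            ```

                            At 1-2 m the drop stays around 0.1-0.2 V; at 10 m it reaches roughly 1 V, matching what was observed in this thread.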

                            MSEE, PE


                            • Quozl
                              Junior Member
                              • Mar 2016
                              • 7

                              #15
                              12AWG is what I have at hand for now... if I need larger on the battery side I will fix that later. I need 1.2m of that 12AWG from controller to battery, then 2 sections of 5m 12AWG on the panel side, allowing the panel to be positioned at 5m or 10m as required. I assume putting the 10m on the panel side also involves some loss, but from what I read this is better, as the approx 22v coming into the controller needs to be stepped down to 13-14v for the battery anyway. Or is the loss on the panel side going to be in current/amps from the panel? In other words, if the panel can at best put out 9 amps, then with 10m of 12AWG to the controller should I expect a loss of maximum possible power from the panel (say from 9 amps at best down to 7-8 amps, just for example)?
