Simplified lifepo4 charging and care

This is a sticky topic.
  • #61
    Originally posted by createthis View Post
    The thing I get most confused about with LifePO4 is the lack of charge controllers out there. Floating is generally considered bad with LifePO4, but there are very few charge controllers designed to not float a LifePO4 battery. The dinky little Genasun controllers and the Victron 100/30 100/50 series are the only ones I can think of, and I'm really just guessing because they have LifePO4 charging profiles (I have no idea how those profiles work under the hood). If you use the wrong charge controller, you need to wire a relay between the charger and the battery and shut it down when a certain voltage is reached, or you need to monitor it while charging, which is a bad idea IMO because humans make mistakes.

    I'm not even sure how that would work, because charging to 13.8v you'll hit 13.8v, but the resting voltage will drop you back down to 13.6v or lower, so it's like you need a high voltage cutoff combined with a state machine that only turns the system back on once it has dropped below a certain voltage. If you do that, you lose the ability to power things directly from the solar panels too.
    I am experimenting with a combination of LFP and Pb-Sb batteries to combine the long cycle life of the LFP with the low kWh cost of cheap golf-cart Pb, and for that combination I made a quick switcher similar to the above description from parts on hand while I figure out a better approach. The addition is a low voltage disconnect.

    If using LFP alone, and if the PV controller has remote battery sensing, the over-voltage battery disconnect would be replaced by a small relay that adds a voltage to the battery sense terminals, the voltage sensed by my circuit would change from Pb+ in my schematic to Li+, and the low voltage disconnect would not be used. Using a full charge voltage of 3.50V, which I experimentally determined is suitable for my batteries, and a float of 3.38V, the relay would add 0.12 V/cell to the sensed voltage, and the float and absorb voltages on the PV controller are set to the high voltage cut.

    The weak point here is that it switches back to full charging when the battery voltage falls to a preset level. In the final version, I expect to use an SOC indicator which is reset to full when the high voltage disconnect is activated. The cells are balanced by hand, the high voltage cut is kept low, the low voltage disconnect is kept high, and the balance should be checked until balancing is added. Here is my quick circuit.
    batterry-sw.gif
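    The hysteresis behavior described above (disconnect at the high-voltage cut, reconnect only after the voltage falls back to a lower threshold) can be sketched in a few lines. This is a minimal sketch, not the poster's actual circuit: the 3.50V and 3.38V thresholds come from the post, while the low-voltage figures and the class structure are illustrative assumptions.

```python
# Sketch of a high/low-voltage disconnect with hysteresis.
# 3.50 V cut and 3.38 V reconnect follow the post; the low-voltage
# thresholds below are assumed example values, not from the post.

HIGH_CUT_V = 3.50      # per-cell charge disconnect (from the post)
RECONNECT_V = 3.38     # per-cell voltage at which charging may resume
LOW_CUT_V = 2.90       # per-cell load disconnect (assumed)
LOAD_RESUME_V = 3.20   # per-cell voltage at which the load may resume (assumed)

class DisconnectSwitcher:
    def __init__(self):
        self.charging_enabled = True
        self.load_enabled = True

    def update(self, cell_v: float):
        # High-voltage disconnect with hysteresis: once tripped, charging
        # stays off until the cell falls back to the reconnect threshold.
        if self.charging_enabled and cell_v >= HIGH_CUT_V:
            self.charging_enabled = False
        elif not self.charging_enabled and cell_v <= RECONNECT_V:
            self.charging_enabled = True

        # Low-voltage disconnect, mirrored for the load side.
        if self.load_enabled and cell_v <= LOW_CUT_V:
            self.load_enabled = False
        elif not self.load_enabled and cell_v >= LOAD_RESUME_V:
            self.load_enabled = True
        return self.charging_enabled, self.load_enabled
```

    The two-threshold structure is what avoids the rapid on/off cycling a single cutoff voltage would cause as the battery rests back down after disconnect.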
    Last edited by robeyw; 12-11-2019, 11:55 PM. Reason: Added delay switching to bulk charge to suppress load surge switching

    Comment


    • #62
      Full charge detection. This is an original post in this subject area, not a reply, but I can't find an easy way to post anything other than a reply.

      I have seen quite a bit of variation in charging information, but it is generally limited to something like: charge with limits of 3.65 volts and C amps until the current falls to 0.05C. Charts show a long CV time. I see recommendations here for lower charging voltages, but without any documentation. In some places a float voltage of 3.45V is recommended, but other sources recommend 3.40 with the warning that 3.42 can cause overcharging. For normal PV charging of a house battery, I would never exceed 0.2C and normally would not exceed 0.1C. If excess time above a float voltage is harmful, the transition from charging to float voltage needs to be quick, but what is the charge termination criterion to achieve this?

      Not seeing any good answers, I tried some experimentation with QH Technology prismatic cells. The first observation is that cells that have been recently discharged charge much more easily than those that have not been charged to the state being measured for some time. My fig 2 illustrates this (compare the black and green graphs). I looked at criteria for the harder-to-charge condition and have concluded that for charging at less than 0.2C there should be no constant voltage phase. To get the rapid voltage fall, I tried to achieve a 99% SOC, and assuming a 3.40V float, a charge termination voltage Vterm = 3.40 + 0.915*I/C seems to work well (see figs 1, 2, 3). For the recently charged cell, the SOC will be higher, but the greater ease of charging is accompanied by a more rapid voltage fall on charge termination to compensate. Choosing a charge termination at 98% SOC seems practical at the high end of charging current but becomes more uncertain as the current falls. It follows that a CV phase could exist when charging at a current high enough that the Vterm given above exceeds the maximum desired charging voltage.

      When using a charge controller that requires a CV phase, 1 minute does not seem to be a problem even for the recently discharged condition at I/C = 0.07, but the transition from float to bulk charge on the basis of cell voltage seems to be a problem due to the very flat V vs SOC curve and the variation of voltage with charging history in the region where I would want to switch out of float, so I think an SOC-based switch from float to bulk charge is much more trustworthy. The entire regime is easily implemented by adjusting the battery voltage sensed by any PV charger with remote sensing.
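      The termination rule above is simple enough to sketch directly. This is only the poster's empirical formula expressed as code, under the stated assumptions (3.40V float; the maximum charge voltage used here is an illustrative example, not from the post):

```python
# Charge-termination rule from the post: Vterm = 3.40 + 0.915 * (I/C),
# where I/C is the charge current as a fraction of cell capacity.

FLOAT_V = 3.40   # per-cell float voltage assumed in the post
V_MAX = 3.55     # maximum desired charging voltage (assumed example value)

def termination_voltage(i_over_c: float) -> float:
    """Per-cell charge termination voltage for a given charge rate I/C."""
    return FLOAT_V + 0.915 * i_over_c

def needs_cv_phase(i_over_c: float) -> bool:
    """A CV phase is only needed when Vterm would exceed the maximum
    desired charging voltage, as the post concludes."""
    return termination_voltage(i_over_c) > V_MAX
```

      For example, at I/C = 0.1 the rule gives Vterm = 3.40 + 0.0915 = 3.4915V, below the example 3.55V maximum, so no CV phase would be needed at that rate.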
      Any comments or corrections?
      fig1lr.jpgfig2lr.jpgfig3lr.jpg

      Comment


      • #63
        Anytime you stray from the mfg's suggestions, you are going to be risking something: cell safety, lifetime, or any of a plethora of other things.

        At this website, we are mostly catering to neophytes, and high-level discussions about charge regimes are something that will be glossed over and ignored. Or someone with a $4 DVM, with no understanding of what meter calibration is, will follow your instructions, and in 6 months the batteries start failing. A mystery to them, perfectly obvious to any engineer.

        For starters, folks need to ensure their battery BMS has:
        over- and under-voltage protection, thermal charge limiting when cold, and out-of-balance shutdown.

        Sadly, many systems are not ready to integrate with current solar charge controllers, which require system battery voltage at all times when solar is present; the battery BMS will disconnect the battery and leave the controller connected to nothing except 130V of PV, with the expected result of the controller being fried.
        Powerfab top of pole PV mount (2) | Listeroid 6/1 w/st5 gen head | XW6048 inverter/chgr | Iota 48V/15A charger | Morningstar 60A MPPT | 48V, 800A NiFe Battery (in series)| 15, Evergreen 205w "12V" PV array on pole | Midnight ePanel | Grundfos 10 SO5-9 with 3 wire Franklin Electric motor (1/2hp 240V 1ph ) on a timer for 3 hr noontime run - Runs off PV ||
        || Midnight Classic 200 | 10, Evergreen 200w in a 160VOC array ||
        || VEC1093 12V Charger | Maha C401 aa/aaa Charger | SureSine | Sunsaver MPPT 15A

        solar: http://tinyurl.com/LMR-Solar
        gen: http://tinyurl.com/LMR-Lister

        Comment


        • #64
          Originally posted by robeyw View Post
          .......
          I have seen quite a bit of variation in charging information but it is generally limited to something like charge with limits of 3.65 volts and C amps till the current falls to .05C. Charts show a large CV time. I see recommendations here for lower charging voltages but without any documentation. Some places a float voltage of 3.45V is recommended but other sources recommend 3.40 with the warning that 3.42 can cause overcharging. For normal PV charging of a house battery, I would never exceed .2C and normally not exceed .1C. If excess time above a float voltage is harmful, the transition from charging to float voltage needs to be quick, but what is the charge termination criteria to achieve this?
          Most prismatic Lithium cells are very efficient and robust, and I would not worry about charging them at 1C using constant current until they reach 3.4-3.5 volts, then switching to constant voltage until the current tapers to 0.1C (or 0.05C if you prefer), at which point I would terminate the charge. I can't tell from your charts, but if you track Wh you will observe that there is very little capacity gained from most LFP cells beyond 3.5 volts.
          Not seeing any good answers I tried some experimentation with QH Technology prismatic cells. The first observation is that cells that have been recently discharged charge much more easily that those that have not been charged to the state being measured for some time. My fig 2 illustrates this (compare black and green graphs). I looked at criteria for the harder to charge condition and have concluded that for charging at less than .2C there should be no constant voltage phase.
          What you describe as the "harder to charge condition" is simply the physics of the cells. I observe this every day in my EV, where it may start out at 20% SOC charging at 2 to 3C, and as the pack gets past 50% SOC the charge current begins to taper. My EVs have excellent battery management systems that also manage temperature and current, so I don't suggest that you can be as aggressive with LFP prismatics.
          To get the rapid voltage fall, I tried to achieve a 99% SOC and assuming a 3.40V float, a charge termination voltage Vterm=3.40+.915I/C seems to work well. (see figs 1,2,3) For the recently charged cell, the SOC will be higher but the greater ease of charging is accompanied by a more rapid voltage fall on charge termination to compensate. Choosing a charge termination at 98% SOC seems practical at the high end of charging current but becomes more uncertain as the current falls. It follows that a CV phase could exist when charging at a current high enough that the Vterm given above exceeds the maximum desired charging voltage. When using a charge controller that requires a CV phase, 1 minute does not seem to be a problem even for the recently discharged condition for I/C=.07 but the transition from float to bulk charge on the basis of cell voltage seems to be a problem due to the very flat V vs SOC and the variations of voltage with charging history in the area that I would want to switch out of float, so I think that a SOC based switch from float to bulk charge is much more trustworthy. The entire regime is easily implemented by adjusting the battery voltage sensed by any PV charger with remote sensing.
          Any comments or corrections?
          I will have to give some thought to what you are describing as "voltage fall". I have not heard that term with respect to Lithium charging strategies. If you are describing what you plotted in Fig 3, that is the normal drop in voltage to the typical resting voltage of the cells of 3.32 volts. I have not observed any change in that resting voltage based on the rate of charging. This is a useful Sticky, have you read the other posts about charging Lithium cells?
          Most people don't even recommend a float stage with Lithium unless you implement it during the day while the cells are discharging and in that case it may be a way to use some of the sun's energy to slow the discharge of the cells. Others have said that the time at a constant float voltage can damage lithium cells, even at a low set point. There is no need for float with Lithiums since they have such a low self discharge rate compared to Lead Acid. I have found it helpful to remember that Bulk is the same as constant current and Absorb is the same as constant voltage.
          You would have to have a very accurate Coulomb counter to rely on SOC as a trustworthy means of switching from Bulk (CC) to Absorb (CV). I don't understand why you described that transition as "from float to bulk", because the physics of the cells actually shows a taper of current when the cells reach the constant voltage set point, at which most chargers go to the CV phase. When testing Lithium cells I have used power supplies with variable set points to observe the physics in action. Yes, the charge curve is very flat and you need an accurate voltage set point, but if it is set conservatively after plotting the charge curve or getting one from the manufacturer, you can rely on it more than the drift that I have seen with many Coulomb counters that estimate SOC.
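          The CC/CV regime described in this reply (constant current up to the voltage set point, then constant voltage until the current tapers to a cutoff fraction of C) can be sketched as a one-step controller. This is a minimal sketch of the general technique, not any particular charger's firmware; the 3.45V set point and 0.05C cutoff are values mentioned in the thread.

```python
CV_SETPOINT = 3.45   # per-cell constant-voltage set point (from the thread)
TAPER_CUTOFF = 0.05  # terminate when current tapers to 0.05C (thread: 0.05-0.1C)

def cc_cv_step(cell_v: float, current_c: float, phase: str) -> str:
    """One control step of a CC/CV charge. Returns the next phase:
    'CC' (bulk / constant current), 'CV' (absorb / constant voltage),
    or 'DONE' once the taper cutoff is reached."""
    if phase == "CC" and cell_v >= CV_SETPOINT:
        return "CV"    # voltage set point reached: hold voltage, let current taper
    if phase == "CV" and current_c <= TAPER_CUTOFF:
        return "DONE"  # current tapered to the cutoff fraction of C: terminate
    return phase
```

          Note this is exactly the regime robeyw is questioning at low charge rates: if you are charging at 0.02C, the 0.05C taper condition is already satisfied the moment CV begins, which is why the low-rate termination rule earlier in the thread avoids a CV phase entirely.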
          Last edited by Ampster; 12-18-2019, 12:38 AM.
          9 kW solar. Driving EVs since 2012

          Comment


          • #65
            Originally posted by Ampster View Post
            Most prismatic Lithium cells are very efficient and robust and I would not worry about charging them at 1C using constant current until they reach 3.4-5 voltage and then switch to constant voltage until the current tapers to .1C (or 0.05 if you prefer) at which point I would terminate the charge.
            The entire purpose of my post is to know what to do when charging far below 1C. How do I stop when the current falls to .1C if I am charging at .02C?

            Originally posted by Ampster View Post
            I will have to give some thought to what you are describing as "voltage fall". I have not heard that term with respect to Lithium charging strategies.
            The voltage fall is an indication of how fast the cell can integrate the charge received into a stable form; hence a slow fall may indicate an undesirably high charge. Certainly, limiting the SOC to a slightly lower value increases the fall rate, which is desirable if the higher voltage is harmful.
            Originally posted by Ampster View Post
            Most people don't even recommend a float stage with Lithium unless you implement it during the day while the cells are discharging and in that case it may be a way to use some of the sun's energy to slow the discharge of the cells. Others have said that the time at a constant float voltage can damage lithium cells, even at a low set point. There is no need for float with Lithiums since they have such a low self discharge rate compared to Lead Acid.
            The sole purpose of the float is to be able to deliver PV power to the load rather than taking it from the battery. It allows small replacement charging, for example an intermittent 20 amp load when 10 amps is available from PV.
            I can’t say more about the risk of floating than to say that I see no way harm can occur if the float voltage is no more than the resting open circuit voltage. If 3.4V is the wrong value, what is correct? I can easily charge them to a higher resting voltage but that may be overcharge. I have seen a resting voltage of 3.45 recommended but that does not make it correct.
            Originally posted by Ampster View Post
            You would have to have a very accurate Coulomb counter to rely on SOC as a trustworthy means of switching from Bulk (CC) to Absorb (CV0. I don't understand why you described that transition as "from float to bulk because the physics of the cells actually sees a taper of current when the cells reach the constant voltage set point at which most chargers go to the CV phase.
            Within the low charging rates I discussed, there is no CV phase, which was the first discovery. The transition from bulk to float is based on voltage. By float to bulk, I mean that when the charger is set to the float voltage and the battery has sufficiently discharged (maybe 10%), it should go to max current mode. If the SOC indicator is set to 1 when the transition from max current to float occurs, not much accuracy is needed, but drift is still important.
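            The SOC-based float-to-bulk switch described here is easy to sketch: reset the Coulomb count to full when bulk terminates, then return to bulk once roughly 10% has been discharged. A minimal sketch under those assumptions (the Coulomb-counting and hardware details are omitted):

```python
RESUME_BULK_SOC = 0.90  # return to max-current charging after ~10% discharge (from the post)

class SocSwitch:
    def __init__(self, capacity_ah: float):
        self.capacity_ah = capacity_ah
        self.soc = 1.0
        self.mode = "float"

    def on_bulk_to_float(self):
        # Bulk just terminated at the high-voltage cut: reset SOC to full,
        # which is what limits the accuracy needed from the Coulomb counter.
        self.soc = 1.0
        self.mode = "float"

    def account(self, net_amps: float, hours: float) -> str:
        # Simple Coulomb count; positive amps = charge, negative = discharge.
        self.soc = min(self.soc + net_amps * hours / self.capacity_ah, 1.0)
        if self.mode == "float" and self.soc <= RESUME_BULK_SOC:
            self.mode = "bulk"
        return self.mode
```

            Because the count is re-zeroed at every full charge, counter drift only accumulates over one discharge cycle, which is the point made in the post about accuracy.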

            Originally posted by Ampster View Post
            Yes the charge curve is very flat and you need an accurate voltage set point, but if set conservatively after plotting the charge curve or getting one from the manufacturer, you can rely on it more than the drift that I have seen with many Coulomb counters that estimate SOC.
            Even if it is true that the SOC can be found by measuring the open circuit voltage after it has stood idle for hours, it cannot be done over a large part of the SOC range while it is in active use. fig 2 is a good example. If you want to discuss further in a different thread, just post a new thread with charging in the subject under Lithium-ion. This seemed like the best place for the basic info I found.

            Comment


            • #66
              Originally posted by robeyw View Post
              The entire purpose of my post is to know what to do when charging far below 1C. How do I stop when the current falls to .1C if I am charging at .02C?
              Then stop at .01C. I have never experienced a need to charge at that low rate. I would be more concerned about the time spent with a low C charge unless as you mention below you are also drawing power.
              The voltage fall is an indication of how fast the cell can integrate the charge received into a stable form, hence it a slow fall may indicate a undesirably high charge., certainly limiting the SOC to a slightly lower value increases the fall rate which is desirable if the higher voltage is harmful.
              This is the first time I have heard that issue discussed. I don't think as a concept it is used by any battery management systems that I have seen.
              The sole purpose of the float is to be able to deliver PV power to the load rather than taking it from the battery. It allows small replacement charging, for example a intermittent 20 amp load when 10 amps is available from PV
              I can’t say more about the risk of floating than to say that I see no way harm can occur if the float voltage is no more than the resting open circuit voltage. If 3.4V is the wrong value, what is correct? I can easily charge them to a higher resting voltage but that may be overcharge. I have seen a resting voltage of 3.45 recommended but that does not make it correct.
              Yes that is the only reason I would use float, when there is a load. The question becomes how do you control that to make sure the cells are not spending a lot of time on float if there is no load.
              I agree a resting voltage of 3.4 volts is too high. I have consistently seen LFP resting at 3.32 to 3.35.
              Within the low charging rates I discussed there is no CV phase which was the first discovery. The transition from bulk to float is based on voltage. By float to bulk I mean that when the charger is set to the float voltage and the battery has sufficiently discharged (maybe 10%) it should go to max current mode. If the SOC indicator is set to 1 when the transition from max current to float occurs, not much accuracy is needed but drift is still important.
              A while back I used a programmable relay to turn off the charger at a specific voltage. It was for an EV I had built, and as I tweaked the pack size I didn't want to spend the money on getting the charge algorithm reprogrammed.
              Even if it is true that the SOC can be found by measuring the open circuit voltage after it has stood idle for hours, it cannot be done over a large part of the SOC range while it is in active use. fig 2 is a good example. If you want to discuss further in a different thread, just post a new thread with charging in the subject under Lithium-ion. This seemed like the best place for the basic info I found.
              I think it has been adequately discussed by @PNjunction
              Last edited by Ampster; 12-18-2019, 03:28 PM.
              9 kW solar. Driving EVs since 2012

              Comment


              • #67
                Originally posted by Mike90250 View Post

                Sadly, many systems are not ready to integrate with the current solar charge controllers, which require system battery voltage at all times, when solar is present, and the battery BMS will disconnect the battery and leave the Controller unconnected except for 130V of PV , with the expected result of the Controller being fried.
                My suggestion for ending the charge phase was to raise the voltage on the battery remote sensing terminals, so the problem you mention would not occur. An alternate way to do it would be to have two charging paths to the cells: the usual low-voltage-drop path, and a second that need only carry the charging load for a short time but with forward-biased diodes in series with the switching FET. Both paths are normally on. To switch the charging off, the low-drop path is shut down and the battery terminal voltage rises by the diode drops plus the higher-resistance FET drop. The charger should respond by ending the charge. If that path is also shut off when no charging is occurring, the discharge path should be active and will supply the battery voltage back to the charger. If charging continues, the BMS has no choice but to shut down that path too, but that should only occur if the charger settings are seriously wrong or the charger is malfunctioning.
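                The voltage rise the diode path produces, which is what should persuade the charger to end the charge, is just the sum of the diode drops plus the resistive drop in the FET. A quick sketch of that arithmetic (the component values in the example are assumed for illustration, not from the post):

```python
def sensed_voltage_rise(n_diodes: int, diode_drop_v: float,
                        charge_current_a: float, fet_r_ohm: float) -> float:
    """Apparent rise in battery terminal voltage seen by the charger when
    the low-drop path is switched off and charge current flows through
    the series diodes and the higher-resistance FET instead."""
    return n_diodes * diode_drop_v + charge_current_a * fet_r_ohm
```

                For example, two silicon diodes at roughly 0.7V each plus 10A through an assumed 0.05-ohm FET would make the pack appear about 1.9V higher, which should push most chargers past their termination set point.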

                Comment


                • #68
                  Heh robeyw, I see where your brain was melting as much as mine was when thinking about solar and LFP. It's freakin complicated due to an unstable input source.

                  Mike90250 brings up a very good issue about the battery disconnecting for whatever reason, be it succeeding at 100% charge, low-voltage disconnect, cell-protection disconnects, and the like. Not healthy for high-voltage controllers. Low-voltage nominal 12V controllers may not mind, but end up being "wired up backwards from a system standpoint" when the problem is fixed. I.e., now you have the panel-first, battery-last connection problem (which confuses many controllers, or forces them into a safety mode) until you manually connect battery first and panel last.

                  Thing is, I've never used a bms before, since *I* was the bms. Fun, but not practical nor safe for most.

                  I mentioned this in another thread about the new Trojan Trillium Lifepo4 batts that came out. Then I started reading the manual, and the cutoffs (not only for safety, but also for initial internal balance), may make this kind of solar connection a problem until I find out more.

                  Comment


                  • #69
                    Have enjoyed reading this thread and thought I'd add my own recent experience for consideration. I've posted at the NAWS site and have benefited greatly from thoughtful comment on and off that blog. Here goes:
                    I had a lead acid bank that I stalled a couple weeks too long on checking, and when I did, the plates were exposed. Gasp, horrors, truly a bummer, but it gave me the freedom to replace it with LFP, which I wanted to do anyway.
                    I have 3.6kW panels, Morningstar Tristar 60A MPPT, and Magnum 4844 Inverter with remote.
                    I purchased 64 200 Ah CALBs from Simon Lin on Alibaba for a total, shipped, of about $8500.
                    I also bought a Choice BMS300 (Chargery BMS Pro 16S, $250) from Electric Car Parts Co, 16 LED balancers from them as well ($320), and a couple of contactors for high and low voltage cutoffs.
                    Based on reading and tech articles, I had decided to use 3.45V as Absorb and a very short absorb time. In my ignorance I didn't realize how hard it would be to achieve that V without first stabilizing and balancing the batteries.
                    The batteries arrived on time and pretty well balanced at around 3.29V, but if I tried to push the charge above 3.38 or so, some cells would start to increase V rapidly while their neighbors were starting to decrease, like the power was being robbed from the lower one and diverted to the higher one next to it. The LED balancer/equalizer units simply could not keep up with it, and I had to halt charging on multiple occasions to keep the V below 3.5.
                    I took a step back, reduced the charge current to around 20A max (0.025C or so), and would spend several days at each new, slightly higher V to allow the bank to stabilize. The incremental approach, in combination with the LED balancers, allowed me to finally achieve my revised absorb target of 3.43V for 6 minutes.
                    I struggled to determine the float setting after realizing these things don't behave like FLA at all. I have ultimately concluded that 54V is a good float V and would like others' feedback on that. I narrowed in on that as the highest "resting" V that the batteries could sit at without actually removing lots of power from them during the transition from absorb. I first used a lower float V, but the system would just run on the batteries for at least an hour while dropping to float. At 54 or 54.1V, when the charging current is removed, the bank falls reasonably quickly to 54 or 54.1V, and only about 3A is required to maintain that V. Small loads are coming off and on line all the time, so I hope it is not damaging to hold them at float V.
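                    For a 16S pack, the 54-54.1V figure works out to about 3.38V per cell, comfortably within the float range discussed earlier in this thread. The conversion is trivial but worth sanity-checking when comparing pack-level settings to per-cell advice:

```python
def per_cell(pack_v: float, n_cells: int = 16) -> float:
    """Per-cell voltage for a series pack (16S assumed, per this post)."""
    return pack_v / n_cells
```

                    So 54.1V across 16 cells is 54.1 / 16, about 3.38V per cell.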
                    I have since ordered two 3.65V, 20A chargers, and in the near future I plan to put all cells into parallel and do the top-end charging I probably should have done to begin with.
                    A couple other observations: the LED balancers seem to work pretty well, although not terribly consistently, in that sometimes cells that are just a bit low are charging more than others that are lower (you can assess the charge intensity by the brightness of the LED). But, overall, I think they work, especially when charge/discharge rates are low.
                    The Choice BMS has been a major disappointment, though. The voltages it reports are consistently low (10-25mV) on Cell #1 and high (10-20mV) on Cell #16, giving the overall bank a deviation 30-40mV greater than actual. At first I had the BMS balancing function turned on, until I concluded that it was actually trying to charge or discharge cells that were right on target. I think that is the result of the poor V readings. I contacted Jason Wang (the Chargery developer, I think) and he informed me that the biased readings on those two cells were what I should expect; I didn't understand the logic, and it doesn't make any sense to me. The unit is still useful for setting cell-based high and low voltage cutoffs and for seat-of-the-pants, big-picture glancing at what is going on, which the handy display provides. The current reading is also inaccurate and bounces around; after several tries at calibration, I have not really improved it. Without accurate current readings, SOC is also bogus.
                    That's my experience to date. I hope it helps someone and if somebody believes what I am doing is wrongheaded or damaging to the cells, please let me know. There is no recipe for all this because there are simply too many variables but I greatly appreciate the help I have received so far and hope my experience can help others.
                    Last edited by dharry; 02-04-2020, 10:55 PM.

                    Comment


                    • #70
                      Originally posted by dharry View Post
                      .......
                      I don't know what you are using for a BMS, but I spent some good money on one that I became aware of when I was converting a VW to an EV. I don't use it much for balancing, but I rely on it for reporting. It is really worth the $500 I paid for it and has worked for LFP cells as well as the NMC that I am currently using. I did parallel my cells for a few days but did not charge them, so I guess you could say they are middle balanced. I am using Nissan Leaf cells now (NMC), but I used LFPs for years and I only charged them to 3.45, and then after a few hours the resting voltage was 3.32-3.35.

                      There are some good stickies about charging lithium cells, and float is generally not recommended, but if there is always a current draw I think that may be okay. If the 54.1 volts is across 16 cells, that is 3.38V per cell, which is well below the 3.65V maximum, so you should be okay.

                      You are correct to watch each cell as the pack gets full. As your cells reach capacity, their ability to take current declines. Because of subtle differences in capacity, some will spike in voltage before others, and that is when you want very accurate and consistent voltage measurements between the cells. Lower charging currents at the top reduce that phenomenon.
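                      The per-cell watch described above can be sketched in a few lines. The cutoff and spread thresholds below are assumptions for illustration, not values from this thread, and the sample voltages are made up:

```python
# Sketch of a per-cell watch for the top of charge: flag any cell that
# spikes ahead of the rest so charging can be cut or tapered.
# Thresholds are illustrative assumptions, not values from this thread.

CELL_HV_CUTOFF = 3.60   # per-cell high-voltage cutoff (V)
SPIKE_SPREAD = 0.05     # max allowed cell-to-cell spread near the top (V)

def check_pack(cell_voltages):
    """Return (ok, reason) for a list of per-cell voltage readings."""
    hi, lo = max(cell_voltages), min(cell_voltages)
    if hi >= CELL_HV_CUTOFF:
        return False, f"cell at {hi:.3f} V hit the high-voltage cutoff"
    if hi - lo > SPIKE_SPREAD and hi > 3.45:
        return False, f"spread {hi - lo:.3f} V: one cell is spiking ahead"
    return True, "ok"

# Hypothetical readings: one cell running ahead of the other three.
ok, reason = check_pack([3.38, 3.39, 3.38, 3.52])
```

In a real system the readings would come from the BMS, and the "not ok" result would open a charge relay or tell the controller to taper.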

                      @PNJunction has provided a lot of useful information on this thread. A lot of what I have said comes from my own experience, enhanced by what he has added here.
                      Last edited by Ampster; 02-05-2020, 01:31 AM.
                      9 kW solar. Driving EVs since 2012

                      Comment


                      • #71
                        Originally posted by dharry View Post
                        I struggled to determine the float setting after realizing these things don't behave like FLA at all. I have ultimately concluded that 54V is a good float voltage and would like others' feedback on that. I narrowed in on it as the highest "resting" voltage the batteries could sit at without actually removing lots of power from them during the transition from absorb. I first used a lower float voltage, but the system would just run on the batteries for at least an hour while dropping to float. At 54 or 54.1V, when the charging current is removed, the bank falls reasonably quickly to 54 or 54.1V, and only about 3A is required to maintain that voltage. Small loads are coming on and off line all the time, so I hope it is not damaging to hold them at float voltage.
                        I think 3.375V is a safe float voltage. To see if it is high enough, connect a load for a few minutes with charging off, noting the Ah drawn, then allow charging and see if it recovers fast enough. If it is too slow, raise the float voltage a bit, hopefully not going over 3.4V, a voltage I have seen recommended numerous times.

                        The continuous balancers are another matter. I have not used them, but the spec allows up to a 10mV voltage error and an effective resistance of 0.3 ohm, so they could cause a continuous balance-error current of 33mA, which is 0.8 Ah/day. I would want a balancer that I can enable when any cell is above some threshold, such as 3.45V in your case, and off otherwise.

                        A possibility is a flying-capacitor balancer I see on eBay under the heading "New Version 5A Balancer 4 LTO LiFePo4 Li-ion Battery Active Equalizer Balancer". I have the 8-cell version 1.1, and it has an enable (but no instructions). I don't know if the 16-cell version has the enable, but on mine there are two pads: one is connected to B+ and the other carries the logic and is pulled down to B- with a 50K resistor. To enable it with logic, I would connect an opto-isolator between the terminals and drive it with enough current for the isolator to sink at least 0.5mA (the actual current is 0.2mA, so 0.5mA gives a good safety margin). If you can't use the 16-cell version, three 6-to-7-cell units, which do have the enable, could be combined. The effective balancing resistance of a single unit is 0.1 ohm. Prices are: 15-16 cell $53.85, 7-8 cell $34.85, 5-6 cell $27.77.
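                        For what it's worth, the balancer-drain figure above checks out; here is the back-of-the-envelope arithmetic (nothing assumed beyond the spec numbers already quoted):

```python
# Back-of-the-envelope check of the balancer error figures quoted above.
v_error = 0.010              # worst-case balancer voltage error (V)
r_eff = 0.3                  # effective balancer resistance (ohm)

i_error = v_error / r_eff    # continuous balance-error current, ~0.033 A (33 mA)
ah_per_day = i_error * 24    # charge shuffled per day, ~0.8 Ah

pack_float = 3.375 * 16      # 16-cell pack float voltage: 54.0 V
```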
                        Last edited by robeyw; 02-05-2020, 02:14 PM.

                        Comment


                        • #72
                          Originally posted by dharry View Post
                          I have since ordered two 3.65V, 20A chargers and in the near future I plan to put all cells in parallel and do the top-end charging I probably should have done to begin with.
                          That gives a maximum charge rate of .0015C, while it is recommended to discontinue charging at 3.65V at about .05C, which is why there is a discussion about floating. If you use an absorb voltage of 3.43V, there should be no advantage to equalizing at a higher voltage (because your equalizers can charge and discharge cells).
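                          A quick sketch of the C-rate arithmetic being discussed. The pack's total Ah capacity is never stated in this thread, so the 800 Ah figure below is purely a placeholder assumption:

```python
# C-rate arithmetic: charge current expressed as a fraction of capacity.
# The pack capacity here is an ASSUMED placeholder, not a value from
# this thread.

def c_rate(current_a, capacity_ah):
    """Charge current as a fraction of pack capacity (the 'C' rate)."""
    return current_a / capacity_ah

TAPER_CUTOFF = 0.05           # common advice: end a 3.65 V charge near 0.05C

charger_current = 2 * 20.0    # two 20 A chargers feeding the pack (A)
pack_capacity = 800.0         # assumed total Ah with all cells paralleled

rate = c_rate(charger_current, pack_capacity)
can_terminate = rate <= TAPER_CUTOFF   # current is at/below the taper cutoff
```

With many large cells in parallel, the same 40 A becomes a far smaller fraction of total capacity, which is why the charge tail takes so long and floating comes up at all.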
                          Last edited by robeyw; 02-05-2020, 02:26 PM.

                          Comment


                          • #73
                            Originally posted by robeyw View Post
                            That gives a maximum charge rate of .0015C, while it is recommended to discontinue charging at 3.65V at about .05C, which is why there is a discussion about floating. If you use an absorb voltage of 3.43V, there should be no advantage to equalizing at a higher voltage (because your equalizers can charge and discharge cells).
                            I think the OP is talking about putting the whole pack in parallel to get the cells closely balanced near the top. In that configuration the cells will self-balance their voltages. Earlier I think he said the high-voltage cutoff would be about 3.4V per cell in normal use when using the balancing devices.
                            Last edited by Ampster; 02-05-2020, 09:21 PM.
                            9 kW solar. Driving EVs since 2012

                            Comment


                            • #74
                              Originally posted by Ampster View Post

                              I think the OP is talking about putting the whole pack in parallel to get the cells closely balanced near the top. In that configuration the cells will self-balance their voltages. Earlier I think he said the high-voltage cutoff would be about 3.4V per cell in normal use when using the balancing devices.
                              Yes, the voltages will be equal when all are in parallel, but at 3.65V and no more than .0015C. It would have been better done at 3.4V, but now that the rough equalization has been done there is no reason to go back to this step. Since his balancers can charge and discharge cells, equalization can be done at the absorb voltage while in normal use.
                              Last edited by robeyw; 02-06-2020, 09:41 AM.

                              Comment


                              • #75
                                Originally posted by robeyw View Post
                                Yes, the voltages will be equal when all are in parallel, but at 3.65V and no more than .0015C. It would have been better done at 3.4V, but now that the rough equalization has been done there is no reason to go back to this step. Since his balancers can charge and discharge cells, equalization can be done at the absorb voltage while in normal use.
                                I understood the OP to say that the voltage measurement of his cell-balancing devices was inaccurate, so he wanted to do a one-time "top balancing" at 3.65 volts. That is the usual procedure for people who subscribe to the top-balancing theory. I do not want to engage in a top-balance versus bottom-balance debate, but I do not think it will hurt the cells to be charged to 3.65V as long as they aren't kept at that voltage for long. 3.4 volts, in my mind, would work just as well. The important thing is to leave them in parallel for a day or two, either before or after charging. They should settle at a resting voltage around 3.3 volts regardless of what the charging voltage ends up being.
                                9 kW solar. Driving EVs since 2012

                                Comment
