Simplified LiFePO4 charging and care

This is a sticky topic.

  • #61
    Originally posted by createthis View Post
    The thing I get most confused about with LiFePO4 is the lack of charge controllers out there. Floating is generally considered bad with LiFePO4, but there are very few charge controllers designed to not float a LiFePO4 battery. The dinky little Genasun controllers and the Victron 100/30 and 100/50 series are the only ones I can think of, and I'm really just guessing because they have LiFePO4 charging profiles (I have no idea how those profiles work under the hood). If you use the wrong charge controller, you need to wire a relay between the charger and the battery and shut it down when a certain voltage is reached, or you need to monitor it while charging, which is a bad idea IMO because humans make mistakes.

    I'm not even sure how that would work, because charging to 13.8 V you'll hit 13.8 V, but the resting voltage will drop you back down to 13.6 V or lower, so you need a high-voltage cutoff combined with a state machine that only turns the system back on once the battery has dropped below a certain voltage. If you do that, you lose the ability to power things directly from the solar panels too.
    I am experimenting with a combination of LFP and PbSb batteries, to combine the long cycle life of the LFP with the low cost per kWh of cheap golf-cart lead-acid. For that combination I made a quick switcher, similar to the description above, from parts on hand while I figure out a better approach. The addition is a low-voltage disconnect.

    If using LFP alone, and if the PV controller has remote battery sensing, the over-voltage battery disconnect would be replaced by a small relay that adds a voltage to the battery sense terminals; the voltage sensed by my circuit would change from Pb+ in my schematic to Li+, and the low-voltage disconnect would not be used. Using a full-charge voltage of 3.50 V, which I experimentally determined is suitable for my batteries, and a float of 3.38 V, the relay would add 0.12 V/cell to the sensed voltage, and the float and absorb voltages on the PV controller are set to the high-voltage cut.

    The weak point here is that it switches back to full charging when the battery voltage falls to a preset level. In the final version, I expect to use an SOC indicator which is reset to full when the high-voltage disconnect is activated. The cells are balanced by hand, the high-voltage cut is kept low, the low-voltage disconnect is kept high, and the balance should be checked until automatic balancing is added. Here is my quick circuit.
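    The high/low disconnect behavior described above amounts to a simple two-state hysteresis latch. Here is a minimal sketch (my illustration, not robeyw's actual circuit; the thresholds are example values borrowed loosely from the post, not recommendations):

```python
# Hypothetical hysteresis latch for the charge switcher described above:
# charging is cut when the per-cell voltage reaches the high-voltage set
# point, and only re-enabled after the battery sags below a lower threshold.
V_HIGH_CUT = 3.50   # full-charge cut per cell (value from the post above)
V_RECONNECT = 3.30  # resume bulk charging below this voltage (illustrative)

class ChargeSwitch:
    """Two-state latch: CHARGING until V_HIGH_CUT, then OFF until V_RECONNECT."""
    def __init__(self):
        self.charging = True

    def update(self, cell_volts: float) -> bool:
        if self.charging and cell_volts >= V_HIGH_CUT:
            self.charging = False   # high-voltage disconnect latches off
        elif not self.charging and cell_volts <= V_RECONNECT:
            self.charging = True    # battery has sagged; resume bulk charge
        return self.charging
```

    A real implementation would also debounce the voltage reading and add the delay robeyw mentions in his edit note to suppress load-surge switching.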
    batterry-sw.gif
    Last edited by robeyw; 12-11-2019, 11:55 PM. Reason: Added delay switching to bulk charge to suppress load surge switching



    • #62
      Full charge detection. This is an original post in this subject area, not a reply, but I can't find an easy way to post anything other than a reply.

      I have seen quite a bit of variation in charging information, but it is generally limited to something like: charge with limits of 3.65 V and C amps until the current falls to 0.05C. Charts show a long CV time. I see recommendations here for lower charging voltages, but without any documentation. Some places recommend a float voltage of 3.45 V, but other sources recommend 3.40 V with the warning that 3.42 V can cause overcharging. For normal PV charging of a house battery, I would never exceed 0.2C and normally not exceed 0.1C. If excess time above a float voltage is harmful, the transition from charging to float voltage needs to be quick, but what is the charge termination criterion to achieve this?

      Not seeing any good answers, I tried some experimentation with QH Technology prismatic cells. The first observation is that cells that have been recently discharged charge much more easily than those that have not been charged to the state being measured for some time. My fig. 2 illustrates this (compare the black and green graphs). I looked at criteria for the harder-to-charge condition and have concluded that for charging at less than 0.2C there should be no constant-voltage phase. To get the rapid voltage fall, I tried to achieve 99% SOC and, assuming a 3.40 V float, a charge termination voltage Vterm = 3.40 + 0.915·I/C seems to work well (see figs. 1, 2, 3). For the recently charged cell the SOC will be higher, but the greater ease of charging is accompanied by a more rapid voltage fall on charge termination to compensate. Choosing a charge termination at 98% SOC seems practical at the high end of charging current but becomes more uncertain as the current falls. It follows that a CV phase could exist when charging at a current high enough that the Vterm given above exceeds the maximum desired charging voltage.

      When using a charge controller that requires a CV phase, 1 minute does not seem to be a problem, even for the recently discharged condition at I/C = 0.07. But the transition from float to bulk charge on the basis of cell voltage seems to be a problem, due to the very flat V vs. SOC curve and the variation of voltage with charging history in the region where I would want to switch out of float, so I think that an SOC-based switch from float to bulk charge is much more trustworthy. The entire regime is easily implemented by adjusting the battery voltage sensed by any PV charger with remote sensing.
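      The empirical termination rule above can be written as a one-line function (a sketch of robeyw's fit; the constants 3.40 and 0.915 come straight from the post and apply only to the cells and low rates he tested):

```python
def vterm(i_over_c: float, v_float: float = 3.40, k: float = 0.915) -> float:
    """Per-cell charge-termination voltage from the empirical fit above:
    Vterm = Vfloat + 0.915 * (I/C), intended for charge rates below 0.2C."""
    return v_float + k * i_over_c

# For example, charging at I/C = 0.07 gives a termination voltage near 3.464 V.
```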
      Any comments or corrections?
      fig1lr.jpg fig2lr.jpg fig3lr.jpg



      • #63
        Anytime you stray from the manufacturer's suggestions, you are going to be risking something: cell safety, lifetime, or any of a plethora of things.

        At this website, we are mostly catering to neophytes, and high-level discussions about charge regimes are something that will be glossed over and ignored. Or someone with a $4 DVM, with no understanding of what meter calibration is, will follow your instructions and in 6 months the batteries start failing. A mystery to them, perfectly obvious to any engineer.

        For starters, folks need to ensure their battery BMS has:
        over- and under-voltage protection, thermal charge limiting when cold, and out-of-balance shutdown.

        Sadly, many systems are not ready to integrate with current solar charge controllers, which require system battery voltage at all times when solar is present; the battery BMS will disconnect the battery and leave the controller connected to nothing but 130 V of PV, with the expected result of the controller being fried.


        • #64
          Originally posted by robeyw View Post
          .......
          I have seen quite a bit of variation in charging information, but it is generally limited to something like: charge with limits of 3.65 V and C amps until the current falls to 0.05C. Charts show a long CV time. I see recommendations here for lower charging voltages, but without any documentation. Some places recommend a float voltage of 3.45 V, but other sources recommend 3.40 V with the warning that 3.42 V can cause overcharging. For normal PV charging of a house battery, I would never exceed 0.2C and normally not exceed 0.1C. If excess time above a float voltage is harmful, the transition from charging to float voltage needs to be quick, but what is the charge termination criterion to achieve this?
          Most prismatic lithium cells are very efficient and robust, and I would not worry about charging them at 1C using constant current until they reach 3.4 to 3.5 V, then switching to constant voltage until the current tapers to 0.1C (or 0.05C if you prefer), at which point I would terminate the charge. I can't tell from your charts, but if you track Wh you will observe that there is very little capacity gained from most LFP cells beyond 3.5 V.
          Not seeing any good answers, I tried some experimentation with QH Technology prismatic cells. The first observation is that cells that have been recently discharged charge much more easily than those that have not been charged to the state being measured for some time. My fig. 2 illustrates this (compare the black and green graphs). I looked at criteria for the harder-to-charge condition and have concluded that for charging at less than 0.2C there should be no constant-voltage phase.
          What you describe as the "harder to charge condition" is simply the physics of the cells. I observe this every day in my EV, where it may start out at 20% SOC charging at 2 to 3C, and as the pack gets past 50% SOC the charge current begins to taper. My EVs have excellent battery management systems that also manage temperature and current, so I don't suggest that you can be as aggressive with LFP prismatics.
          To get the rapid voltage fall, I tried to achieve 99% SOC and, assuming a 3.40 V float, a charge termination voltage Vterm = 3.40 + 0.915·I/C seems to work well (see figs. 1, 2, 3). For the recently charged cell the SOC will be higher, but the greater ease of charging is accompanied by a more rapid voltage fall on charge termination to compensate. Choosing a charge termination at 98% SOC seems practical at the high end of charging current but becomes more uncertain as the current falls. It follows that a CV phase could exist when charging at a current high enough that the Vterm given above exceeds the maximum desired charging voltage. When using a charge controller that requires a CV phase, 1 minute does not seem to be a problem, even for the recently discharged condition at I/C = 0.07. But the transition from float to bulk charge on the basis of cell voltage seems to be a problem, due to the very flat V vs. SOC curve and the variation of voltage with charging history in the region where I would want to switch out of float, so I think that an SOC-based switch from float to bulk charge is much more trustworthy. The entire regime is easily implemented by adjusting the battery voltage sensed by any PV charger with remote sensing.
          Any comments or corrections?
          I will have to give some thought to what you are describing as "voltage fall". I have not heard that term with respect to lithium charging strategies. If you are describing what you plotted in fig. 3, that is the normal drop in voltage to the typical resting voltage of the cells of 3.32 volts. I have not observed any change in that resting voltage based on the rate of charging. This is a useful sticky; have you read the other posts about charging lithium cells?
          Most people don't even recommend a float stage with lithium unless you implement it during the day while the cells are discharging, and in that case it may be a way to use some of the sun's energy to slow the discharge of the cells. Others have said that time spent at a constant float voltage can damage lithium cells, even at a low set point. There is no need for float with lithium, since it has such a low self-discharge rate compared to lead-acid. I have found it helpful to remember that Bulk is the same as constant current and Absorb is the same as constant voltage.
          You would have to have a very accurate Coulomb counter to rely on SOC as a trustworthy means of switching from Bulk (CC) to Absorb (CV). I don't understand why you described that transition as "from float to bulk," because the physics of the cells actually shows a taper of current when the cells reach the constant-voltage set point, at which most chargers go to the CV phase. When testing lithium cells I have used power supplies with variable set points to observe the physics in action. Yes, the charge curve is very flat and you need an accurate voltage set point, but if it is set conservatively after plotting the charge curve or getting one from the manufacturer, you can rely on it more than the drift that I have seen with many Coulomb counters that estimate SOC.
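          The Bulk (CC) / Absorb (CV) regime described above can be sketched as a single control decision per step. This is a hypothetical illustration, not any particular controller's firmware; the 3.45 V set point and 0.05C taper cutoff are example values from the discussion:

```python
def charger_step(v_cell, i_now, i_cc, v_cv=3.45, i_term=None, c_ah=100.0):
    """One control decision in a CC/CV ("Bulk"/"Absorb") charge regime:
    - Bulk: push the full constant current i_cc until the cell reaches v_cv.
    - Absorb: hold v_cv and let the cell dictate a tapering current.
    - Done: terminate when the taper current falls to 0.05C (or i_term).
    Returns (mode, current command); None means voltage-regulated."""
    if i_term is None:
        i_term = 0.05 * c_ah          # default taper cutoff at 0.05C
    if v_cell < v_cv:
        return ("bulk", i_cc)         # constant current below the set point
    if i_now > i_term:
        return ("absorb", None)       # constant voltage; current tapers
    return ("done", 0.0)              # taper reached cutoff: end charge
```

          For example, with a 100 Ah cell, `charger_step(3.30, 50.0, 50.0)` commands full current, while at 3.45 V the decision depends on how far the current has tapered.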
          Last edited by Ampster; 12-18-2019, 12:38 AM.



          • #65
            Originally posted by Ampster View Post
            Most prismatic lithium cells are very efficient and robust, and I would not worry about charging them at 1C using constant current until they reach 3.4 to 3.5 V, then switching to constant voltage until the current tapers to 0.1C (or 0.05C if you prefer), at which point I would terminate the charge.
            The entire purpose of my post is to know what to do when charging far below 1C. How do I stop when the current falls to .1C if I am charging at .02C?

            Originally posted by Ampster View Post
            I will have to give some thought to what you are describing as "voltage fall". I have not heard that term with respect to Lithium charging strategies.
            The voltage fall is an indication of how fast the cell can integrate the charge received into a stable form; hence a slow fall may indicate an undesirably high charge. Certainly limiting the SOC to a slightly lower value increases the fall rate, which is desirable if the higher voltage is harmful.
            Originally posted by Ampster View Post
            Most people don't even recommend a float stage with Lithium unless you implement it during the day while the cells are discharging and in that case it may be a way to use some of the sun's energy to slow the discharge of the cells. Others have said that the time at a constant float voltage can damage lithium cells, even at a low set point. There is no need for float with Lithiums since they have such a low self discharge rate compared to Lead Acid.
            The sole purpose of the float is to be able to deliver PV power to the load rather than taking it from the battery. It allows small replacement charging, for example an intermittent 20 amp load when 10 amps is available from PV.
            I can't say more about the risk of floating than that I see no way harm can occur if the float voltage is no more than the resting open-circuit voltage. If 3.4 V is the wrong value, what is correct? I can easily charge them to a higher resting voltage, but that may be overcharge. I have seen a resting voltage of 3.45 V recommended, but that does not make it correct.
            Originally posted by Ampster View Post
            You would have to have a very accurate Coulomb counter to rely on SOC as a trustworthy means of switching from Bulk (CC) to Absorb (CV0. I don't understand why you described that transition as "from float to bulk because the physics of the cells actually sees a taper of current when the cells reach the constant voltage set point at which most chargers go to the CV phase.
            Within the low charging rates I discussed there is no CV phase, which was the first discovery. The transition from bulk to float is based on voltage. By "float to bulk" I mean that when the charger is set to the float voltage and the battery has sufficiently discharged (maybe 10%), it should go to max-current mode. If the SOC indicator is set to 1 when the transition from max current to float occurs, not much accuracy is needed, but drift is still important.
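            The SOC-based float-to-bulk switch described above can be sketched as follows. This is my illustration of the idea, not robeyw's implementation; the class name, the 100 Ah capacity, and the 10% resume threshold are assumptions:

```python
class FloatBulkSwitch:
    """Sketch of an SOC-based float->bulk switch: the Coulomb counter is
    reset to 100% when charging terminates (switch to float), and the
    charger returns to max-current mode once roughly 10% has been drawn."""
    def __init__(self, capacity_ah, resume_soc=0.90):
        self.capacity_ah = capacity_ah
        self.resume_soc = resume_soc
        self.soc = 1.0
        self.mode = "float"

    def on_charge_terminated(self):
        self.soc, self.mode = 1.0, "float"   # reset counter at full charge

    def update(self, amps_out, dt_h):
        """amps_out > 0 means net discharge from the battery over dt_h hours."""
        self.soc -= amps_out * dt_h / self.capacity_ah
        if self.mode == "float" and self.soc <= self.resume_soc:
            self.mode = "bulk"               # down ~10%: resume full current
        return self.mode
```

            As the post notes, only modest counter accuracy is needed because the counter is re-zeroed at every full charge, but long-term drift between resets still matters.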

            Originally posted by Ampster View Post
            Yes the charge curve is very flat and you need an accurate voltage set point, but if set conservatively after plotting the charge curve or getting one from the manufacturer, you can rely on it more than the drift that I have seen with many Coulomb counters that estimate SOC.
            Even if it is true that the SOC can be found by measuring the open-circuit voltage after the battery has stood idle for hours, it cannot be done over a large part of the SOC range while the battery is in active use; fig. 2 is a good example. If you want to discuss further in a different thread, just post a new thread with charging in the subject under Lithium-ion. This seemed like the best place for the basic info I found.



            • #66
              Originally posted by robeyw View Post
              The entire purpose of my post is to know what to do when charging far below 1C. How do I stop when the current falls to .1C if I am charging at .02C?
              Then stop at .01C. I have never experienced a need to charge at that low rate. I would be more concerned about the time spent with a low C charge unless as you mention below you are also drawing power.
              The voltage fall is an indication of how fast the cell can integrate the charge received into a stable form; hence a slow fall may indicate an undesirably high charge. Certainly limiting the SOC to a slightly lower value increases the fall rate, which is desirable if the higher voltage is harmful.
              This is the first time I have heard that issue discussed. I don't think as a concept it is used by any battery management systems that I have seen.
              The sole purpose of the float is to be able to deliver PV power to the load rather than taking it from the battery. It allows small replacement charging, for example an intermittent 20 amp load when 10 amps is available from PV.
              I can't say more about the risk of floating than that I see no way harm can occur if the float voltage is no more than the resting open-circuit voltage. If 3.4 V is the wrong value, what is correct? I can easily charge them to a higher resting voltage, but that may be overcharge. I have seen a resting voltage of 3.45 V recommended, but that does not make it correct.
              Yes that is the only reason I would use float, when there is a load. The question becomes how do you control that to make sure the cells are not spending a lot of time on float if there is no load.
              I agree a resting voltage of 3.4 volts is too high. I have consistently seen LFP resting at 3.32 to 3.35 V.
              Within the low charging rates I discussed there is no CV phase which was the first discovery. The transition from bulk to float is based on voltage. By float to bulk I mean that when the charger is set to the float voltage and the battery has sufficiently discharged (maybe 10%) it should go to max current mode. If the SOC indicator is set to 1 when the transition from max current to float occurs, not much accuracy is needed but drift is still important.
              A while back I used a programmable relay to turn off the charger at a specific voltage value. It was for an EV I had built, and as I tweaked the pack size I didn't want to spend the money on getting the charge algorithm reprogrammed.
              Even if it is true that the SOC can be found by measuring the open-circuit voltage after the battery has stood idle for hours, it cannot be done over a large part of the SOC range while the battery is in active use; fig. 2 is a good example. If you want to discuss further in a different thread, just post a new thread with charging in the subject under Lithium-ion. This seemed like the best place for the basic info I found.
              I think it has been adequately discussed by @PNjunction
              Last edited by Ampster; 12-18-2019, 03:28 PM.



              • #67
                Originally posted by Mike90250 View Post

                Sadly, many systems are not ready to integrate with current solar charge controllers, which require system battery voltage at all times when solar is present; the battery BMS will disconnect the battery and leave the controller connected to nothing but 130 V of PV, with the expected result of the controller being fried.
                My suggestion for ending the charge phase was to raise the voltage on the battery remote-sensing terminals; then the problem you mention would not occur. An alternate way to do it would be to have two charging paths to the cells: the usual low-voltage-drop path, and a second path that need only carry the charging load for a short time, but with forward-biased diodes in series with the switching FET. Both paths are normally on. To switch the charging off, the low-drop path is shut down and the battery terminal voltage rises by the diode drops plus the higher-resistance FET drop. The charger should respond by ending the charge. If that path is also shut off when no charging is occurring, the discharge path should be active and will supply the battery voltage back to the charger. If charging continues, the BMS has no choice but to shut down that path too, but that should only occur if the charger settings are seriously wrong or the charger is malfunctioning.
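                Roughly, the two-path idea works out as follows. The component values here are my assumptions for illustration (two series diodes, a high-resistance FET), not measured figures from the circuit described:

```python
# Illustrative numbers for the two-path scheme above: with the low-drop
# path off, the charger sees the cell voltage raised by two diode drops
# plus the higher-resistance FET drop, which should push the sensed
# voltage past the charger's absorb set point and end the charge.
V_DIODE = 0.6    # forward drop per series diode (assumed, two in the path)
R_FET_HI = 0.05  # on-resistance of the high-drop path FET, ohms (assumed)

def charger_sees(v_cell, i_charge, low_path_on):
    """Terminal voltage presented to the charger at a given charge current."""
    if low_path_on:
        return v_cell                              # low-drop path: ~no drop
    return v_cell + 2 * V_DIODE + i_charge * R_FET_HI
```

                With these toy numbers, a 3.40 V cell charging at 10 A presents about 5.1 V per cell once the low-drop path opens, well past any sane set point, so the charger terminates.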



                • #68
                  Heh robeyw, I see where your brain was melting as much as mine was when thinking about solar and LFP. It's freakin' complicated due to an unstable input source.

                  Mike90250 brings up a very good issue about the battery disconnecting for whatever reason, be it succeeding at 100% charge, low-voltage disconnect, cell-protection disconnects, and the like. Not healthy for high-voltage controllers. Low-voltage nominal 12 V controllers may not mind, but end up being "wired up backwards from a system standpoint" when the problem is fixed. I.e., now you have the panel-first, battery-last connection problem (which confuses many controllers, or forces them into a safety mode) until you manually connect battery first and panel last.

                  Thing is, I've never used a BMS before, since *I* was the BMS. Fun, but not practical nor safe for most.

                  I mentioned this in another thread about the new Trojan Trillium LiFePO4 batts that came out. Then I started reading the manual, and the cutoffs (not only for safety, but also for initial internal balance) may make this kind of solar connection a problem until I find out more.

