
Purchasing new batteries: How do you determine what one 'cycle' is?


  • #16
    Originally posted by karrak

    Thought I would do a graph of the daily maximum and minimum SOC of my LFP battery over the past year
    ...
    OK, can you provide more details on how these data were measured? I was under the impression you didn't have an SOC 'counter' in your system, but such a graph would require one.

    I also thought, apparently wrongly, that you don't discharge your bank deeply, or at least not as often as these graphs demonstrate.



    • #17
      Originally posted by max2k
      OK, can you provide more details on how these data were measured? I was under the impression you didn't have an SOC 'counter' in your system, but such a graph would require one.
      Have a look here https://github.com/simat/BatteryMonitor/wiki to see how the data is logged.

      Simon

      Off grid 24V system, 6x190W Solar Panels, 32x90ah Winston LiFeYPO4 batteries installed April 2013
      BMS - Homemade Battery logger github.com/simat/BatteryMonitor/wiki
      Latronics 4kW Inverter, homemade MPPT controller
      Off-Grid LFP(LiFePO4) system since April 2013



      • #18
        Originally posted by karrak

        Have a look here https://github.com/simat/BatteryMonitor/wiki to see how the data is logged.

        Simon

        If I understand correctly, your logger takes current readings off the shunt every 10 s. I haven't checked the Python code, but I assume you simply multiply each current reading by 10 s to get Ah for SOC purposes.

        Your calibration procedure looks a little off, but is probably acceptable depending on how close to zero you run it: instead of taking two pairs of points and then solving for the linear-approximation coefficients, you calculate the slope from a single pair (assuming zero offset) and then the offset from another pair. Near zero the offset could be comparable to the measured value, which would produce incorrect coefficients and render the rest of the data invalid.
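
        For reference, here is a minimal sketch of the two-point approach I mean; the function name and the example readings are hypothetical, not taken from BatteryMonitor:

            # Two-point linear calibration: solve reading = gain * raw + offset
            # from two (raw ADC count, reference DMM value) pairs taken well
            # apart, e.g. at roughly 50% and 80% of full scale.
            def two_point_cal(raw1, ref1, raw2, ref2):
                gain = (ref2 - ref1) / (raw2 - raw1)
                offset = ref1 - gain * raw1
                return gain, offset

            # Hypothetical example: ADC counts vs. DMM amps
            gain, offset = two_point_cal(12000, 25.00, 20000, 41.70)

            def counts_to_amps(raw):
                return gain * raw + offset

        Solving for both coefficients this way means any constant offset in the channel drops out, instead of being silently folded into the slope.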

        Have you had a chance to measure the capacity of your bank in Ah after 4 years of use? I mean, the logger would record 'relative' Ah, but you don't really know what percentage of its initial capacity the bank has left until you take it through the entire voltage swing from the low knee to the top, counting Ah along the way.

        On your graph SOC is expressed in %: is that relative to the initial rated capacity? I can see the max SOC is pegged at 100%, and I assume this was done by detecting the upper voltage knee, stopping the charge at that point and 'resetting' the SOC to 100% there. If this is not done, such systems tend to accumulate errors over time, as some energy is lost with each cycle. Even assuming it is done correctly, resetting the SOC to 100% provides only one point of reference: on discharge it is impossible to tell the SOC in % from the Ah passed alone, without knowing the actual capacity at that point, so your 60% could mean 95% of the actual remaining capacity. Reaching the low voltage knee would allow setting the SOC to 0%, and the total Ah counted would then equal the actual remaining capacity at that moment.
        Last edited by max2k; 09-17-2017, 08:58 PM.



        • #19
          Originally posted by max2k
          If I understand correctly, your logger takes current readings off the shunt every 10 s. I haven't checked the Python code, but I assume you simply multiply each current reading by 10 s to get Ah for SOC purposes.
          No, the sample time is 1.002 s and is specified in the config file. I use the measured time between samples to calculate the SOC.
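
          For illustration, a minimal sketch of coulomb counting from the measured time between samples rather than the nominal interval; the rated capacity and read_current() are placeholders, not the actual BatteryMonitor code:

              import time

              RATED_AH = 360.0       # hypothetical bank capacity in Ah
              soc_ah = RATED_AH      # assume we start from a full battery

              def read_current():    # placeholder for the real shunt/ADC read
                  return 0.0

              last = time.monotonic()
              while True:
                  time.sleep(1.002)                  # nominal sample interval
                  now = time.monotonic()
                  dt_hours = (now - last) / 3600.0   # actual elapsed time
                  last = now
                  # positive current = charging, negative = discharging
                  soc_ah += read_current() * dt_hours
                  soc_percent = 100.0 * soc_ah / RATED_AH   # value logged

          Using the measured interval means OS scheduling jitter shows up as noise rather than as a systematic bias in the Ah total.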

          Your calibration procedure looks a little off, but is probably acceptable depending on how close to zero you run it: instead of taking two pairs of points and then solving for the linear-approximation coefficients, you calculate the slope from a single pair (assuming zero offset) and then the offset from another pair. Near zero the offset could be comparable to the measured value, which would produce incorrect coefficients and render the rest of the data invalid.
          No, for voltage I have as many calibration points as there are cells in the battery. For current I rely on the accuracy of the shunt and the linearity of the TI 16-bit A-to-D for the span accuracy, and I measure the offset.

          Have you had a chance to measure the capacity of your bank in Ah after 4 years of use? I mean, the logger would record 'relative' Ah, but you don't really know what percentage of its initial capacity the bank has left until you take it through the entire voltage swing from the low knee to the top, counting Ah along the way.
          I have never measured the capacity of my battery. In the SOC graph I used the manufacturer's rated cell capacity of 90Ah to calculate the SOC. I am more interested in the trend in capacity loss over time than in absolute figures. Remember, my battery is in use 100% of the time. I calculated the SOC of the lowest point on that graph using cell voltage and got a discrepancy of ~3% between the logged SOC reading and the reading calculated from voltage. I am happy with that.

          I can see the max SOC is pegged at 100%, and I assume this was done by detecting the upper voltage knee, stopping the charge at that point and 'resetting' the SOC to 100% there. If this is not done, such systems tend to accumulate errors over time, as some energy is lost with each cycle. Even assuming it is done correctly, resetting the SOC to 100% provides only one point of reference: on discharge it is impossible to tell the SOC in % from the Ah passed alone, without knowing the actual capacity at that point, so your 60% could mean 95% of the actual remaining capacity. Reaching the low voltage knee would allow setting the SOC to 0%, and the total Ah counted would then equal the actual remaining capacity at that moment.
          You are right about how I reset my SOC counter. I calculate and take into account the coulombic inefficiency of the battery, and I also know the actual capacity of the battery to within a few percent, so how could my SOC reading be so far out as to read 95% when it is in fact 60%?

          Simon




          • #20
            Originally posted by karrak
            No, the sample time is 1.002 s and is specified in the config file. I use the measured time between samples to calculate the SOC.
            That is an odd sample time to have; I wonder why not exactly 1 s? Actually, Linux or Windows cannot reliably measure time intervals below about 10 ms resolution, so basically your last digit and the one after it are bogus; you could just as happily write 1.01 or 0.99 there with the same relevance to reality. Unless, of course, you used the high-resolution timer present in practically all x86-family processors, which is simply a 64-bit hardware counter with a 100 ns LSB, and took your time readings off that. OTOH this timing error would only introduce a ±1% error in your SOC, which is probably acceptable.

            Originally posted by karrak
            No, for voltage I have as many calibration points as there are cells in the battery. For current I rely on the accuracy of the shunt and the linearity of the TI 16-bit A-to-D for the span accuracy, and I measure the offset.
            I might be splitting hairs here, but each of what you're calling 'calibration points' is actually a data input requiring 2 individual calibration coefficients, since you're using an individual resistor divider for each of them; unless you used 0.1% resistors and made no mistakes in the circuit in terms of accounting for input resistance, I'd run a calibration for each one. I'm paranoid that way: more than once such 'obvious' things turned out to be less than obvious and led to the discovery of a problem I was not aware of. What I was referring to is taking 2 points for each of the 16 values (and the current), far enough apart (50-80% of scale), measuring each with both your circuit and a well-calibrated DMM, and calculating the linear-approximation coefficients from those, which is the common approach.

            Originally posted by karrak
            I calculated the SOC of the lowest point on that graph using cell voltage and got a discrepancy of ~3% between the logged SOC reading and the reading calculated from voltage. I am happy with that.
            I thought we had established that voltage is a poor indicator of SOC for LFP?

            Originally posted by karrak
            You are right about how I reset my SOC counter. I calculate and take into account the coulombic inefficiency of the battery, and I also know the actual capacity of the battery to within a few percent, so how could my SOC reading be so far out as to read 95% when it is in fact 60%?
            From my point of view you don't really know your actual capacity; all you know (minus possible calibration problems) is the maximum Ah it can store, taken as the difference between the max SOC and min SOC measured in Ah. Those two points should belong to the same cycle, not just be the lowest SOC on your graph: the whole system can drift down over multiple cycles, so unless you are counting Ah within the same cycle I wouldn't assume the lowest SOC gives the actual capacity. What I'm saying is that one day you might see your SOC indicator at 40% with your bottom-voltage alarm going off, indicating you actually have only 60% of the rated capacity left.
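
            To illustrate with made-up numbers, the capacity figure I would trust is the Ah counted between the 100% reset and the low knee within the same cycle; these readings are hypothetical:

                # Actual capacity = Ah delivered between the full-charge reset
                # (true 100%) and the low-voltage knee (true ~0%), within the
                # SAME cycle. Hypothetical net Ah counter, zeroed at the reset:
                ah_at_full = 0.0         # counter value at the 100% reset
                ah_at_knee = -318.0      # net Ah out when the knee is reached

                actual_capacity_ah = ah_at_full - ah_at_knee   # 318 Ah
                rated_ah = 360.0
                print("remaining capacity: %.0f%% of rated"
                      % (100.0 * actual_capacity_ah / rated_ah))   # ~88%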



            • #21
              Originally posted by max2k
              That is an odd sample time to have; I wonder why not exactly 1 s? Actually, Linux or Windows cannot reliably measure time intervals below about 10 ms resolution, so basically your last digit and the one after it are bogus; you could just as happily write 1.01 or 0.99 there with the same relevance to reality. Unless, of course, you used the high-resolution timer present in practically all x86-family processors, which is simply a 64-bit hardware counter with a 100 ns LSB, and took your time readings off that. OTOH this timing error would only introduce a ±1% error in your SOC, which is probably acceptable.
              Probably unnecessary, but the 1.002 s sampling time is to counter aliasing with the 100/120 Hz current draw from any mains inverter. I use the Linux time.clock call, which gives me the current processor time down to 1 µs.
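
              As a quick way to check the jitter question empirically, something like the following sketch logs the achieved intervals; it uses time.monotonic(), since time.clock was removed in Python 3.8, and the loop count is arbitrary:

                  import time

                  # Measure how much the OS actually jitters a nominal
                  # 1.002 s sleep by logging the achieved intervals.
                  intervals = []
                  last = time.monotonic()
                  for _ in range(30):
                      time.sleep(1.002)
                      now = time.monotonic()
                      intervals.append(now - last)
                      last = now

                  print("min %.6f s  max %.6f s  mean %.6f s"
                        % (min(intervals), max(intervals),
                           sum(intervals) / len(intervals)))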

              I might be splitting hairs here, but each of what you're calling 'calibration points' is actually a data input requiring 2 individual calibration coefficients, since you're using an individual resistor divider for each of them; unless you used 0.1% resistors and made no mistakes in the circuit in terms of accounting for input resistance, I'd run a calibration for each one. I'm paranoid that way: more than once such 'obvious' things turned out to be less than obvious and led to the discovery of a problem I was not aware of. What I was referring to is taking 2 points for each of the 16 values (and the current), far enough apart (50-80% of scale), measuring each with both your circuit and a well-calibrated DMM, and calculating the linear-approximation coefficients from those, which is the common approach.
              If you read my design notes you will find that I do use 0.1% resistors and the same divider resistor values for all the individual cell-voltage inputs. The cell voltage range of interest is 2.8V-3.6V, which corresponds to less than ~22% of the total voltage span. There are no non-linear components in the circuitry between the battery and the A-to-D, just three resistors and a capacitor, so any linearity errors will be generated within the A-to-D. These are well documented in the A-to-D datasheet.


              I thought we had established that voltage is a poor indicator of SOC for LFP?
              It is a reasonably good indicator when the SOC is below 8%, which is below the bottom knee.

              From my point of view you don't really know your actual capacity; all you know (minus possible calibration problems) is the maximum Ah it can store, taken as the difference between the max SOC and min SOC measured in Ah. Those two points should belong to the same cycle, not just be the lowest SOC on your graph: the whole system can drift down over multiple cycles, so unless you are counting Ah within the same cycle I wouldn't assume the lowest SOC gives the actual capacity. What I'm saying is that one day you might see your SOC indicator at 40% with your bottom-voltage alarm going off, indicating you actually have only 60% of the rated capacity left.
              I think your point of view is wrong. I have already said that I compared the lowest SOC measured with my BMS against the SOC calculated using battery voltage, and they agree to within a few percent. On top of this, I know that my SOC counter accuracy only drifts by a few percent. I know this because I look at the SOC reading each time the SOC counter is reset: if the counter accuracy didn't drift, the counter would always read 100% when it was reset. A reading other than 100% is caused by errors in the current measurements and by the coulombic inefficiency of the battery. I use this error to dynamically calculate and update an error coefficient that adjusts the SOC reading to take the coulombic inefficiency of the battery into account.
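
              A minimal sketch of that correction scheme as described above; the class layout and the ALPHA smoothing factor are my own illustration, not the BatteryMonitor source:

                  ALPHA = 0.2    # hypothetical smoothing factor for updates

                  class SocCounter:
                      def __init__(self, rated_ah):
                          self.rated_ah = rated_ah
                          self.soc_ah = rated_ah
                          self.efficiency = 1.0   # Ah kept per Ah counted in

                      def step(self, amps, dt_hours):
                          if amps > 0:   # charging: derate by efficiency
                              self.soc_ah += amps * self.efficiency * dt_hours
                          else:          # discharging: count in full
                              self.soc_ah += amps * dt_hours

                      def full_charge_reset(self):
                          measured = self.soc_ah / self.rated_ah  # ~1.0 ideally
                          # a reading above 1.0 means charge was over-credited,
                          # so lower the efficiency estimate (and vice versa)
                          self.efficiency *= (1.0 - ALPHA) + ALPHA / measured
                          self.soc_ah = self.rated_ah             # reset to 100%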

              Max, I procrastinated for a while over whether I should bother answering your post. Maybe I am thin-skinned, or there are cultural differences between us, but I am getting a little tired of what I perceive as nitpicking and constant attacks from you. I am happy to answer questions about my setup and my experience, and to debate the pros and cons of my system. I am not happy when I think you make statements based on incorrect assumptions about my off-grid system and how I operate it, and then go on the attack when I point out the errors in your assumptions.

              I don't know your background, but you seem to be knowledgeable about measuring systems and other things electronic. I would appreciate it if you would credit me and others with some level of competence until proven otherwise.

              Simon




              • #22
                Originally posted by karrak

                Probably unnecessary, but the 1.002 s sampling time is to counter aliasing with the 100/120 Hz current draw from any mains inverter. I use the Linux time.clock call, which gives me the current processor time down to 1 µs.
                Come to think of it, I actually missed something: an inverter loading the battery bank at, say, 10 A would produce a 'rectified' AC waveform on the shunt with a corresponding amplitude. Of course the inverter has some big capacitors to compensate for that, so it would probably resemble the waveform after the filter capacitor of a common rectifier bridge. Sampling that at a 0.998 Hz rate would translate the waveform to a much lower frequency (around 1/8 Hz for 60 Hz AC) while preserving its amplitude. This would also require the sampling rate to be very stable, and as I said, Linux or any other non-RT OS is not capable of that. Basically it's a dead end if you want to trust your numbers. I hope you connected a scope to the shunt and verified that the waveform there is DC only, or this whole setup would be subject to this effect.

                Classically speaking, aliasing can only be removed by an analog filter at the ADC input with a cutoff frequency no higher than about Fs/10. This would mean you'd run the ADC at, say, 1000 samples/s and put an LPF with a 100 Hz cutoff in front of it. That would produce digitized samples of whatever the LPF passed, which you'd average over, say, 1 s to get the mean DC value, or you could calculate the RMS from that if you want to be more accurate.
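
                As a toy illustration of that averaging step, with made-up numbers (10 A DC plus 2 A of 100 Hz ripple, sampled at 1000 samples/s):

                    import numpy as np

                    FS = 1000.0                      # samples per second
                    t = np.arange(0, 1.0, 1.0 / FS)  # one second of samples

                    # hypothetical shunt current: 10 A DC + 2 A of 100 Hz ripple
                    i = 10.0 + 2.0 * np.sin(2 * np.pi * 100.0 * t)

                    dc = i.mean()                    # 1 s average: ripple cancels
                    rms = np.sqrt(np.mean(i ** 2))   # RMS, if accuracy matters

                    print("DC ~ %.3f A, RMS ~ %.3f A" % (dc, rms))  # ~10.000, ~10.100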

                Originally posted by karrak
                Max, I procrastinated for a while over whether I should bother answering your post. Maybe I am thin-skinned, or there are cultural differences between us, but I am getting a little tired of what I perceive as nitpicking and constant attacks from you. I am happy to answer questions about my setup and my experience, and to debate the pros and cons of my system. I am not happy when I think you make statements based on incorrect assumptions about my off-grid system and how I operate it, and then go on the attack when I point out the errors in your assumptions.

                I don't know your background, but you seem to be knowledgeable about measuring systems and other things electronic. I would appreciate it if you would credit me and others with some level of competence until proven otherwise.
                I'm sorry if I'm coming across as attacking; I never meant that. I simply question the methods you used to obtain your results, trying to improve the quality of this board. If you outlined those methods in more detail I'd have far fewer questions. I always assume the worst until proven otherwise. You claimed at least 10 mV resolution on your graphs, and I feel free to challenge that, as in my experience it is not easy to get. You'd get 'some' numbers, but they won't necessarily mean anything. Then you build some alarm logic around them and things can go wrong.
                Last edited by max2k; 09-22-2017, 05:09 PM.
