I've seen a lot of numbers thrown around about how efficient (or inefficient, as the case may be) flooded lead-acid batteries are as an energy storage device. But I've never seen anyone provide actual data from a real-life off-grid system to back those numbers up. I didn't feel like working on the winter's wood supply today, so my wife and I took a ride on our Honda Silver Wings for breakfast. Then I came home looking for something to do. Ha! Our batteries have been getting charged just about every day in this beautiful fall weather we're having. So how does that affect cycle efficiency, since we usually run 7-10 day cycles?
I pulled the cycle history out of our TriMetric battery monitor and recorded it on the chart they provide - enough data to figure out cycle efficiency:
The data is pretty much self-explanatory. This is a full-time off-grid system, so it never rests. The batteries are either being charged or discharged 24 hours a day, so the low voltage readings for each cycle are with the system under load. The H4 line, average amps/cycle, is the average number of amps lost for every hour the cycle ran. The TriMetric calculates it from how many MORE amp-hours it took to recharge back to 100% than it measured being taken out during discharge.
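That H4 figure works out to something like this (my understanding of the relationship, not Bogart's actual firmware math; the example numbers are made up for illustration):

```python
# Sketch of the H4 "average amps/cycle" loss figure: the extra amp-hours
# needed to get back to 100% SOC, spread over the hours the cycle ran.
def average_loss_amps(discharge_ah, recharge_ah, cycle_hours):
    """Charging loss in amp-hours per hour of cycle time (i.e. average amps)."""
    loss_ah = recharge_ah - discharge_ah  # extra ah it took to recharge
    return loss_ah / cycle_hours

# Round-number example: 10ah of loss over a 24-hour cycle
print(round(average_loss_amps(290.0, 300.0, 24.0), 3))  # → 0.417
```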
This TriMetric monitor is VERY accurate at measuring battery capacity and SOC when properly set up and calibrated. It uses a 500A Deltec shunt on the Main Bus to measure every amp-hour in or out of the battery bank.
The math goes sort of like this (using the X.1 cycle as an example, which is the most recent):
Bank ah capacity is 820 at the 20hr rate and 1156 at the 100hr rate. We use the 100hr rate because it is more representative of our average load on the system. The low SOC for the cycle can be used to figure out how many amp-hours were used from the fully charged battery. So for the X.1 cycle @ 75% we used 289ah from the battery.
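In other words (a trivial sketch, using the numbers above):

```python
# Amp-hours drawn from a full battery, figured from the low SOC for the
# cycle and the bank's 100hr-rate capacity.
bank_ah_100hr = 1156.0   # bank capacity at the 100hr rate
low_soc = 0.75           # lowest SOC reached during the X.1 cycle
ah_used = bank_ah_100hr * (1.0 - low_soc)
print(ah_used)  # → 289.0
```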
The loss was .39A over 23.5 hours, or 9.17ah more to recharge the battery than what was removed. So since we took 289ah out and it took 298.2ah to recharge, our cycle efficiency was 96.9%. We lost 3.1% of the energy we put into the battery as heat.
Now, on the X.5 cycle, which was the longest one, we used 450.8ah from the battery and it required 469.3ah for recharge. Cycle efficiency = 96.0%. This was because it took longer in the absorb stage to get to 100%, and slightly more was lost as heat. Experience has shown here that if we cycle from 40-85% for 7 days, the losses are very minimal and we usually see somewhere around .19 - .25A average loss for the cycle.
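The arithmetic for both cycles can be sketched in a few lines of Python (my own worked example with the numbers from above, not anything the TriMetric computes):

```python
# Cycle efficiency: amp-hours taken out divided by amp-hours put back in.
def cycle_efficiency(discharge_ah, avg_loss_amps, cycle_hours):
    """Return cycle efficiency as a percentage."""
    recharge_ah = discharge_ah + avg_loss_amps * cycle_hours  # ah back in
    return 100.0 * discharge_ah / recharge_ah

# X.1 cycle: 289ah out, 0.39A average loss over 23.5 hours
print(round(cycle_efficiency(289.0, 0.39, 23.5), 1))  # → 96.9

# X.5 cycle, straight from the measured ah figures: roughly 96%
print(100.0 * 450.8 / 469.3)
```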
I have concluded, after putting the battery monitors on our bank (I have three more that monitor each string individually), that flooded lead-acid batteries (at least the ones we have) are considerably more efficient as energy storage devices than the 85-90% efficiency numbers that are commonly thrown around. I have seen efficiency numbers in the high 80's when we have run a very shallow cycle to no less than 85% SOC. But they are more typically in the mid-90's if they are cycled deep enough and long enough.