AGM or GEL Batteries? Which is best?


  • Arlou52
    replied
    Originally posted by Sunking
    General rule of thumb: batteries are best charged at the C/10 rate, so the C/10 rate on a 300 AH battery is 30 amps. 30 amps x 50 volts (48 volt battery) = 1500 watts.
    Well, if a 48 volt battery requires 1500 watts of solar (30 amps x 50 volts = 1500 watts), then does a 12 volt battery only need 30 amps x 12 volts = 360 watts of solar? The math makes it seem better to use a 12 volt battery and just have four 100 watt panels.
    Last edited by Arlou52; 05-25-2020, 03:10 PM.
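    For what it is worth, here is a quick sketch of that comparison, using approximate lead-acid charging voltages (the 14.4 V, 29 V, and 50 V figures are assumptions for illustration). The catch is that 300 AH at 12 volts holds only a quarter of the energy of 300 AH at 48 volts.

```python
# C/10 panel sizing for a 300 AH bank at different nominal voltages.
# The charging voltages are approximate assumptions, not exact figures.
CAPACITY_AH = 300
BANKS = {12: 14.4, 24: 29.0, 48: 50.0}     # nominal volts -> approx charging volts

for nominal_v, charging_v in BANKS.items():
    charge_amps = CAPACITY_AH / 10          # C/10 rule of thumb = 30 A
    panel_watts = charge_amps * charging_v  # minimum array size for that current
    stored_wh = CAPACITY_AH * nominal_v     # energy the bank actually holds
    print(f"{nominal_v} V: ~{panel_watts:.0f} W of panel, {stored_wh} Wh of storage")
```

    So the 12 volt bank does need a smaller array, but it also stores only a quarter of the energy; to store the same 14,400 Wh at 12 volts you would need a 1200 AH bank and roughly 1,700 W of panel at the same C/10 rate.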



  • Sunking
    replied
    Originally posted by SageOldmann
    I am puzzled by your use of "5" days in your formula. Why not 1 day, or 7 days? Is the 5 days based on averaging with little charging of the batteries on cloudy days?
    Few factors are involved, but mostly economics and practicality. Battery cycle life depends on depth of discharge: the deeper you discharge them, the fewer cycles you get out of them. For example, a mid tier Trojan battery at 20% depth of discharge will yield around 3000 cycles, while 50% per day drops you to less than 1000 cycles. 20% DOD per day gives you the most bang for your battery dollar. The other side of the coin is how many days per year you want to be without power. Run 50% DOD and after just one cloudy day you go dark the next day, plus you need another day to recharge without a genny. So you have two choices: sit in the dark, or spend money on genny fuel.
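    A back-of-the-envelope way to see the bang-for-the-dollar point, using the cycle counts quoted above (the 48 volt, 300 AH bank below is only a placeholder; the comparison is what matters):

```python
# Lifetime energy throughput at two depths of discharge, using the
# cycle-life figures quoted above. Bank size is a placeholder example.
BANK_WH = 48 * 300                 # a 48 V, 300 AH bank = 14,400 Wh

scenarios = {
    "20% DOD": (0.20, 3000),       # ~3000 cycles at 20% DOD
    "50% DOD": (0.50, 1000),       # <1000 cycles at 50% DOD
}

for name, (dod, cycles) in scenarios.items():
    lifetime_kwh = BANK_WH * dod * cycles / 1000
    print(f"{name}: roughly {lifetime_kwh:,.0f} kWh delivered over the bank's life")
```

    Even before counting days of autonomy, the shallower daily discharge delivers more total energy over the life of the bank.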

    Originally posted by SageOldmann
    You also said
    “5 amps x 120 volts x 5 hours = 3000 watt hours per day and that is a huge number. With a 15,000 watt hour battery, you are no longer looking at a 12 volt system.”
    How many Ah would a 15,000 watt hour battery be in a 12 volt system?
    5th grade math in China, or 16-year graduate-level math in the USA. Battery AH = Watt Hours / Battery Voltage.
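    Plugging the 15,000 watt hour figure into that formula at the common bank voltages:

```python
# Battery AH = Watt Hours / Battery Voltage, applied to a 15,000 Wh bank.
WATT_HOURS = 15_000
for volts in (12, 24, 48):
    print(f"{volts} V bank: {WATT_HOURS / volts:g} AH")
```

    Which is why the recommendation elsewhere in this thread lands on roughly 600 AH at 24 volts or 300 AH at 48 volts.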


    Originally posted by SageOldmann
    How did you get to the Minimum Panel Wattage = 1500 watts?
    General rule of thumb: batteries are best charged at the C/10 rate, so the C/10 rate on a 300 AH battery is 30 amps. 30 amps x 50 volts (48 volt battery) = 1500 watts.

    It is all 5th grade math, or Ohm's Law. Another math problem you might explain to the wifey is Energy Return On Investment. All that means is: if I put 1 unit of energy in, how much do I get out? Anything you take off grid is less than 1. That means there are 3 things no one wants you to know about, especially the Green Mafia.

    1. Off-Grid Battery Power is going to cost you 3 to 10 times more than buying it from the Power Company the rest of your life.
    2. You are a heavy polluter.
    3. You are wasting natural resources and energy, robbing your children blind.
    [Attachment: Ohms-Law-Pie-Chart.jpg]
    Last edited by Sunking; 05-25-2020, 11:15 AM.
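    For anyone reading without the pie-chart attachment, it just arranges the two base relations V = I x R and P = V x I and the forms you get by substituting one into the other; a minimal sketch:

```python
# The two base relations behind the Ohm's Law pie chart:
# V = I * R (Ohm's Law) and P = V * I (power), plus one substituted form.
def power(volts: float, amps: float) -> float:
    return volts * amps                 # P = V * I

def power_from_current_resistance(amps: float, ohms: float) -> float:
    return amps ** 2 * ohms             # P = I^2 * R, from V = I * R

# Example from this thread: ~30 A of charge current at ~50 V
print(power(50, 30))                    # 1500 W, the minimum array wattage above
```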



  • Ampster
    replied
    Originally posted by Sunking

    Who gives a crap, AMPSTER? It is a Straw-Man Argument,....
    I will be happy to make corrections to any factual errors.



  • SunEagle
    replied
    Originally posted by SageOldmann
    I like the way you and Sunking think. I average 4.7 hours of sunlight for 202 days per year, with 101 days of full sun and 101 days of partly sunny, so using the 5 day multiplier is just about right for me. So what do you think? Can I make a 500 Ah battery bank work, or should I re-think it all?
    A 500 Ah system should provide a significant amount of power. Whether it will work depends on whether your estimated daily watt hour usage stays at about 20% of the battery system's capacity. If it is more, then you run the risk of killing the battery system quicker than you wanted to.

    We try to get people to try out their system using lower cost batteries as a test. If their estimates are correct, they can eventually replace them with a high end battery system.

    All I can say is using an off grid system is a crap shoot.
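    A quick check of that 20% rule against the loads mentioned in this thread, assuming the 500 Ah bank is the 12 volt system SageOldmann described:

```python
# Does a 500 AH, 12 V bank cover a given daily load at 20% depth of discharge?
BANK_AH, BANK_V = 500, 12
bank_wh = BANK_AH * BANK_V             # 6,000 Wh total
budget_wh = 0.20 * bank_wh             # 1,200 Wh/day at 20% DOD

for daily_load_wh in (1200, 3000):     # the two daily loads discussed in the thread
    verdict = "fits" if daily_load_wh <= budget_wh else "exceeds"
    print(f"{daily_load_wh} Wh/day {verdict} the 20% budget of {budget_wh:.0f} Wh")
```

    The 1,200 Wh equipment-only load just fits; the combined 3,000 Wh load does not.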



  • SageOldmann
    replied
    Originally posted by SunEagle
    I have always used the 5 day multiplier to size a battery system. For me it is based on discharging a battery only 20% each day for 3 days and still giving me a cushion of 40% SOC. Even that is pretty low for a battery to go, but while it may shorten the life, it will still be usable.
    I like the way you and Sunking think. I average 4.7 hours of sunlight for 202 days per year, with 101 days of full sun and 101 days of partly sunny, so using the 5 day multiplier is just about right for me. So what do you think? Can I make a 500 Ah battery bank work, or should I re-think it all?



  • SunEagle
    replied
    Originally posted by SageOldmann

    Well, that is disappointing, but I may be able to salvage this. Originally my wife agreed to solar for my equipment, which amounts to about 1,200 watt hours per day, if I allowed her to have some of it for her use as well in her girl cave (LED TV, computer, etc.). That was another 1,800 watt hours per day. Based on the average usage for both of us, that got me to the 3,000 watt hour number. BUT if I show her comments from an expert with years of education and experience like you have, she may be reasonable and let me have what I need and give up on what she wants. I don't think it would be a hard sell, since it's impossible for both of us to do this at the same time needing 3,000 watt hours every day. If my math is correct, for my 1,200 watt hour needs it looks like a 500 Ah battery might work with my 12 volt system. Maybe???

    I am puzzled by your use of "5" days in your formula. Why not 1 day, or 7 days? Is the 5 days based on averaging with little charging of the batteries on cloudy days?

    You also said
    “5 amps x 120 volts x 5 hours = 3000 watt hours per day and that is a huge number. With a 15,000 watt hour battery, you are no longer looking at a 12 volt system.”
    How many Ah would a 15,000 watt hour battery be in a 12 volt system?
    How did you get to the Minimum Panel Wattage = 1500 watts?
    I'm intrigued by these numbers.
    I have always used the 5 day multiplier to size a battery system. For me it is based on discharging a battery only 20% each day for 3 days and still giving me a cushion of 40% SOC. Even that is pretty low for a battery to go, but while it may shorten the life, it will still be usable.
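    A small sketch of how that cushion works out, using the 5-day sizing and the 3,000 Wh/day example from elsewhere in this thread:

```python
# Why "5 x daily load" leaves a cushion: three no-sun days at 20% DOD per day
# still leaves 40% state of charge.
DAYS_MULTIPLIER = 5
daily_wh = 3000                         # example daily load from this thread
bank_wh = DAYS_MULTIPLIER * daily_wh    # 15,000 Wh bank

soc = 1.0
for day in range(1, 4):                 # three cloudy days, no charging at all
    soc -= daily_wh / bank_wh           # each day uses 20% of the bank
    print(f"After day {day}: {soc:.0%} state of charge")
```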



  • SageOldmann
    replied
    Originally posted by Sunking
    As it stands now, you do not have anything to work with.
    Well, that is disappointing, but I may be able to salvage this. Originally my wife agreed to solar for my equipment, which amounts to about 1,200 watt hours per day, if I allowed her to have some of it for her use as well in her girl cave (LED TV, computer, etc.). That was another 1,800 watt hours per day. Based on the average usage for both of us, that got me to the 3,000 watt hour number. BUT if I show her comments from an expert with years of education and experience like you have, she may be reasonable and let me have what I need and give up on what she wants. I don't think it would be a hard sell, since it's impossible for both of us to do this at the same time needing 3,000 watt hours every day. If my math is correct, for my 1,200 watt hour needs it looks like a 500 Ah battery might work with my 12 volt system. Maybe???

    I am puzzled by your use of "5" days in your formula. Why not 1 day, or 7 days? Is the 5 days based on averaging with little charging of the batteries on cloudy days?

    You also said
    “5 amps x 120 volts x 5 hours = 3000 watt hours per day and that is a huge number. With a 15,000 watt hour battery, you are no longer looking at a 12 volt system.”
    How many Ah would a 15,000 watt hour battery be in a 12 volt system?
    How did you get to the Minimum Panel Wattage = 1500 watts?
    I'm intrigued by these numbers.



  • Sunking
    replied
    Originally posted by Ampster

    I think what you may be seeing or hearing is that Lithium is more efficient. It can charge faster at 1C versus 0.3C for Pb.
    Who gives a crap, AMPSTER? It is a Straw-Man Argument only the Green Mafia and Pretenders would make to baffle folks with meaningless BS. You can charge some Lithium faster, others not as fast. Makes no stinking difference. Example: if we are talking about a 12 volt 100 AH battery, you are saying the Pb can only be charged at a maximum of 33 amps vs 100 amps. Who gives a crap, dumbass? Are you saying you are stupid enough to use a panel three times larger than needed? Only the Green Mafia or an idiot would use three times more power than required.

    The only thing you can do is give stupid Straw-Man Arguments based on pure fantasy. Sorry, but you are not going to get away with your BS here. Go somewhere else to baffle people with your BS. It is not going to fly here; you're busted, and extremely dangerous.



  • Sunking
    replied
    Originally posted by SageOldmann
    I'm so sorry if I did not make that clear. AND I found a serious math/typo that has followed me in my calculations. It's 5 amps of devices, not 15 amps, at 120 volts for 5 hours. NOW I have the correct number to plug in. Obviously this changes the battery capacity. Is it still the same formula, 5 days x Daily Watt Hours / Battery Voltage?
    Yes sir. No need to apologize. I assume by now you have figured out that math error was kind of a big deal huh?

    5 amps x 120 volts x 5 hours = 3000 watt hours per day, and that is a huge number. With a 15,000 watt hour battery, you are no longer looking at a 12 volt system; you are looking at a 24 or 48 volt battery system.

    Minimum Panel Wattage = 1500 watts
    MPPT Controller Size = 30 amps @ 48 volts or 60 amps @ 24 volts
    Battery Capacity = 300 AH @ 48 volts or 600 AH @ 24 volts.

    As it stands now, you do not have anything to work with.
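    Those figures fall out of the same rules of thumb used throughout this thread (5 days of storage, AH = Wh / V, C/10 charging); a minimal sketch for the 48 volt case, with ~50 V assumed as the charging voltage:

```python
# Off-grid sizing sketch: 5 amps x 120 V x 5 hours of daily load, sized with
# the 5-day multiplier, AH = Wh / V, and the C/10 charging rule of thumb.
daily_wh = 5 * 120 * 5                  # 3,000 Wh per day
bank_wh = 5 * daily_wh                  # 15,000 Wh of storage

nominal_v, charging_v = 48, 50.0        # 48 V bank, ~50 V while charging (assumed)
capacity_ah = bank_wh / nominal_v       # ~312 AH, rounded to ~300 AH above
charge_amps = capacity_ah / 10          # ~31 A at C/10
panel_watts = charge_amps * charging_v  # ~1,560 W, i.e. the ~1,500 W minimum above

print(f"{capacity_ah:.0f} AH @ {nominal_v} V, ~{charge_amps:.0f} A controller, "
      f"~{panel_watts:.0f} W of panel")
```

    Round those to commercial sizes and you land on the 300 AH / 30 A / 1,500 W figures above; at 24 volts the same math roughly doubles the AH and the controller amps.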




  • SageOldmann
    replied
    Originally posted by Sunking
    Sage, now I get different units of measure. Is it a 12 volt battery with 15 amps for 5 hours, or 15 amps at 120 volts? That is a factor of 10.
    I'm so sorry if I did not make that clear. AND I found a serious math/typo that has followed me in my calculations. It's 5 amps of devices, not 15 amps, at 120 volts for 5 hours. NOW I have the correct number to plug in. Obviously this changes the battery capacity. Is it still the same formula, 5 days x Daily Watt Hours / Battery Voltage?



  • Ampster
    replied
    Originally posted by chrisski
    I hope someone will chime in and say if this is correct or not on the first part. Also, I don't want to derail the OP's thread, and I think these are on track.
    ......

    This is about #4: Also, as far as max discharge rates for batteries, I have not seen much, if any, of that in any of the downloads from the manufacturers' sites. Where exactly do I get this discharge rate? Is this some term I'm not familiar with? Perhaps this is a fixed rate based on battery type, like SLA or LiFePO4? All I can find is anecdotal evidence that Lithium seems to power a 2000 watt inverter better than the same 2000 watt inverter ran when it used AGM batteries. I like tech data.

    I've got up to 5 months before I start wrenching this system together, and honestly the battery portion seems to be the hardest part for me to figure out. I've seen suggestions to look at 2V or 4V cells for a 12V 400 AH system in series, but there is far less data about these lower voltage batteries than the 6 volt or 12 volt ones. I could be looking in the wrong place.
    The discharge rate and the charge rate should be in the manufacturer's specs. They are often couched in terms of C rate. I think there are stickies that explain it in more detail. Just make sure you understand the distinction between 1C and 1/C. They are entirely different.

    I think what you may be seeing or hearing is that Lithium is more efficient. It can charge faster at 1C versus 0.3C for Pb. Lithium does not waste as much energy as heat during the constant voltage (absorb) phase as Pb does. Secondly, you can use 80% of the Lithium battery's capacity compared to 50% for Pb. That means you can get by with a smaller pack, which can offset the higher cost of Lithium.
    I realize some of the above is anecdotal. The literature from the Lithium battery suppliers, once you get past the specs, is full of sales info. Pb is a tried and proven chemistry. However, there are reasons that all the grid storage and EVs are powered by Lithium, not to mention the smartphones that have become ubiquitous. Those reasons are efficiency and the long term cost of energy storage.

    There are risks with Lithium. If you understand the risks and know how to mitigate them, the overall risk can be reduced and weighed against the long term cost per kilowatt-hour of stored energy.

    All I am trying to do is give you the framework so you can make the best decision for yourself. I have no skin in the game, but I have been using Lithium batteries for 25 years without incident.
    Last edited by Ampster; 05-23-2020, 10:38 AM.
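    A small illustration of the usable-capacity and charge-rate comparison above. The 2,400 Wh usable-energy target is a made-up placeholder; the 50% vs 80% usable depth of discharge and the 0.3C vs 1C charge rates are the figures from the post:

```python
# Sizing a lead-acid and a lithium pack to deliver the same usable energy,
# using the usable-capacity and charge-rate figures quoted above.
USABLE_WH_NEEDED = 2_400        # hypothetical daily usable-energy target

chemistries = {
    "Pb (flooded/AGM)": {"usable_fraction": 0.50, "max_charge_c": 0.3},
    "LiFePO4":          {"usable_fraction": 0.80, "max_charge_c": 1.0},
}

for name, spec in chemistries.items():
    pack_wh = USABLE_WH_NEEDED / spec["usable_fraction"]  # nameplate size needed
    max_charge_w = pack_wh * spec["max_charge_c"]         # rough max charge power
    print(f"{name}: {pack_wh:.0f} Wh nameplate, "
          f"can absorb roughly {max_charge_w:.0f} W while charging")
```

    The smaller nameplate pack is where the cost offset Ampster mentions comes from.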



  • Mike90250
    replied
    Originally posted by chrisski
    I hope someone will chime in and say if this is correct or not on the first part. Also, I don't want to derail the OP's thread, and I think these are on track.

    I decided not to use 12 volts into the charge converter, in part because a lot of the charge converters I looked at would not start to charge the batteries until panel voltage exceeded battery voltage by 4 to 5 volts, and then this could drop so panel voltage only needed to exceed battery voltage by one or two volts. I think I could lose a lot of charging hours on a cloudy day because the original battery voltage + 4 volts was never enough to turn the charge controller on, simply because the panel produced 12 volts on an overcast day when the battery was at 12.5 volts.
    All charge controllers require "overhead" to charge batteries. Generally, a PV panel with a 20V Vmp is used to charge a 12V battery with a PWM charger.
    (Anyone need a glossary at this point? Vmp = voltage at maximum power. PWM = Pulse Width Modulation.)

    PV panels produce full voltage with just a small amount of light; amps are produced from more light. If you overload a PV panel, its voltage will sag, so at 6:30 am there is usually too little light to extract any power from a PV panel. Battery voltage or array voltage has nothing to do with it; it's how many photons are hitting the panel to kick electrons loose.
    Panel temperature is what regulates the voltage: cold weather gives slightly higher voltage, hot weather a little less.



    This is about #4: Also, as far as max discharge rates for batteries, I have not seen much, if any, of that in any of the downloads from the manufacturers' sites. Where exactly do I get this discharge rate? Is this some term I'm not familiar with? Perhaps this is a fixed rate based on battery type, like SLA or LiFePO4? All I can find is anecdotal evidence that Lithium seems to power a 2000 watt inverter better than the same 2000 watt inverter ran when it used AGM batteries. I like tech data.

    I've got up to 5 months before I start wrenching this system together, and honestly the battery portion seems to be the hardest part for me to figure out. I've seen suggestions to look at 2V or 4V cells for a 12V 400 AH system in series, but there is far less data about these lower voltage batteries than the 6 volt or 12 volt ones. I could be looking in the wrong place.
    Not all battery mfgs are proud of their product's discharge curve (voltage sag under load), and lead acid batteries are only good for a couple of minutes of high surge. Lithium batteries are better at heavy surges and loads for a longer time before they run out of juice. Both can hold the same watt hours of energy; lead acid is just not as good at delivering it quickly.
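    A rough sketch of the voltage "overhead" point for a PWM setup. The ~20 V Vmp and the -0.35%/°C temperature coefficient below are typical values assumed for illustration, not from any particular panel:

```python
# Why a "12 V" panel has a ~20 V Vmp: the battery needs ~14.4 V to charge,
# and the panel voltage sags as the cells heat up.
VMP_STC = 20.0            # rated Vmp at 25 C (standard test conditions), assumed
TEMP_COEFF = -0.0035      # about -0.35% of voltage per degree C, assumed typical
ABSORB_V = 14.4           # target absorb voltage for a 12 V lead-acid bank

for cell_temp_c in (25, 45, 65):        # cell temperature, not air temperature
    vmp = VMP_STC * (1 + TEMP_COEFF * (cell_temp_c - 25))
    headroom = vmp - ABSORB_V
    print(f"{cell_temp_c} C: Vmp ~{vmp:.1f} V, {headroom:.1f} V above absorb")
```

    That shrinking headroom in hot weather is part of why the rule of thumb pairs an ~18-20 V panel with a 12 V battery on a PWM controller.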



  • Sunking
    replied
    Sage, now I get different units of measure. Is it a 12 volt battery with 15 amps for 5 hours, or 15 amps at 120 volts? That is a factor of 10.



  • chrisski
    replied
    I hope someone will chime in and say if this is correct or not on the first part. Also, I don't want to derail the OP's thread, and I think these are on track.

    I decided not to use 12 volts into the charge converter, in part because a lot of the charge converters I looked at would not start to charge the batteries until panel voltage exceeded battery voltage by 4 to 5 volts, and then this could drop so panel voltage only needed to exceed battery voltage by one or two volts. I think I could lose a lot of charging hours on a cloudy day because the original battery voltage + 4 volts was never enough to turn the charge controller on, simply because the panel produced 12 volts on an overcast day when the battery was at 12.5 volts.

    This is about #4: Also, as far as max discharge rates for batteries, I have not seen much, if any, of that in any of the downloads from the manufacturers' sites. Where exactly do I get this discharge rate? Is this some term I'm not familiar with? Perhaps this is a fixed rate based on battery type, like SLA or LiFePO4? All I can find is anecdotal evidence that Lithium seems to power a 2000 watt inverter better than the same 2000 watt inverter ran when it used AGM batteries. I like tech data.

    I've got up to 5 months before I start wrenching this system together, and honestly the battery portion seems to be the hardest part for me to figure out. I've seen suggestions to look at 2V or 4V cells for a 12V 400 AH system in series, but there is far less data about these lower voltage batteries than the 6 volt or 12 volt ones. I could be looking in the wrong place.



  • SageOldmann
    replied
    Great information and education. I'm no electrical engineer but this is very helpful and I hope I can figure this all out. I'll do my homework over the weekend and see where I end up. I'm sure I will have additional questions afterwards. Thanks.

    Two things right now, though, to clarify. You said “The most important step is the first step, determine your daily energy needs in kWh.”
    So I used this calculator I found online, www.rapidtables.com/calc/electric/Amp_to_kW_Calculator.html, to convert my amps for each device to kilowatt hours with these parameters:
    AC single phase / 15 amps / 115 volts / power factor of “1”. It gave me 1.75 kWh. Is this not what I was supposed to do? Isn't this my watt hour needs?

    Also, I am calculating based on 5 12-volt panels, not 6: 3 are 100W rated at 5.39 amps each, and 2 are 160W rated at 7.89 amps each, if that makes any difference. That looks like 32 amps to me, not the 51 the formula suggests. Maybe I just don't understand how this exactly works.
    Last edited by SageOldmann; 05-22-2020, 06:13 PM.
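    For anyone following the arithmetic in that post: the linked calculator converts amps to kilowatts (power), and watt hours come from multiplying that power by run time. A small sketch with the figures quoted (the 5-hour run time is only an example from elsewhere in the thread):

```python
# Amps -> watts -> watt hours, plus the combined panel current, using the
# figures in the post above. The 5-hour run time is an assumed example.
amps, volts, power_factor = 15, 115, 1.0
watts = amps * volts * power_factor     # 1725 W of power, i.e. about 1.7 kW
watt_hours = watts * 5                  # 8625 Wh if those loads ran for 5 hours

panel_amps = 3 * 5.39 + 2 * 7.89        # three 100 W panels + two 160 W panels
print(f"{watts:.0f} W, {watt_hours:.0f} Wh over 5 h, "
      f"combined panel current {panel_amps:.2f} A")
```

    The combined panel current does come out to about 32 A, as the post says.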

