Inverter Size vs Battery Size Tutorial Part 2


  • Sunking
    replied
    Originally posted by aggie

    Please help me find this link.
    Try this one.



  • aggie
    replied
    Originally posted by inetdog
    If you want to do the whole calculation and avoid rules of thumb and assumptions, Dereck (Sunking) has a very complete off grid design calculator linked in another thread that takes just about everything into consideration.
    Please help me find this link.



  • inetdog
    replied
    Now, all of the relationships derived between panel power and inverter size make a couple of assumptions, as rules of thumb always do.
    It is useful to be aware of what those assumptions are, so you will recognize the exceptional situations. (And I do know that everyone thinks that they are exceptional.)

    1. The rule assumes that the panels will be used to recharge the battery bank every day, with a 24 hour cycle which, except for weather, is constant and predictable.
    2. The rule assumes that the only thing that matters for keeping the battery happy is being able to deliver C/10 at some time during the day.

    Both of these assumptions fail when the loads are short-duration only or are very steady high loads, when the usage pattern is two days of use followed by five days of recharging, or other such variations. But the general effect of these load influences is to require more panel than the inverter size alone would predict.

    If you want to do the whole calculation and avoid rules of thumb and assumptions, Dereck (Sunking) has a very complete off grid design calculator linked in another thread that takes just about everything into consideration.



  • Sunking
    started a topic Inverter Size vs Battery Size Tutorial Part 2

    Inverter Size vs Battery Size Tutorial Part 2

    Many of us regulars see this story daily here on the forum: "I have a 200 watt panel, a 220 amp hour battery @ 12 volts, and a 2000 watt inverter." We see it all the time, right? I laugh every time I see it, wondering how folks come up with such nonsense.

    Here is the short story: your Panel Wattage, Battery Capacity, and Inverter Wattage must be matched or the system will not work. First sanity check: if your Inverter Wattage is larger than your Panel Wattage, you most likely have a huge problem on your hands and do not even know it. Not to mention one hell of a Fire Ball waiting to happen.

    Let's look at the problem with the example I gave above: 200 watt panel, 220 AH battery, and 2000 watt inverter. A 2000 watt inverter @ 12 volts at full power will draw roughly 190 amps (2000 watts / 12 volts, divided by an inverter efficiency of around 85%). So what happens to the battery when it is loaded at a 1C discharge rate, where C = the battery's Amp Hour capacity? Well, the first thing that happens is your 220 AH battery is now an 80 AH battery, because capacity shrinks at high discharge rates (the Peukert effect). But who cares, that is not important; it just means that instead of getting 1 hour of run time, you only get 15 minutes on paper. In practice you get 0 minutes.
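
    Here is a quick Python sketch of that arithmetic. The 85% inverter efficiency and the 50% depth-of-discharge limit are illustrative assumptions, not numbers off any spec sheet:

    def inverter_battery_current(inverter_watts, battery_volts, efficiency=0.85):
        """Current the inverter pulls from the battery at full output."""
        return inverter_watts / (battery_volts * efficiency)

    amps = inverter_battery_current(2000, 12)  # ~196 A, right at 1C on a 220 AH bank
    effective_ah = 80                          # the 220 AH bank, derated at a 1C rate
    usable_ah = effective_ah * 0.5             # assuming a 50% depth-of-discharge limit
    print(f"draw: {amps:.0f} A")
    print(f"run time to 50% DOD: {usable_ah / amps * 60:.0f} minutes")  # ~12 min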

    Here is a top of the line 6 volt 220 AH battery. We can extrapolate the battery's internal resistance from its CA rating of 888 amps: 7.2 volts / 888 amps = .008 Ohms. Two in series = .016 Ohms. Throw in some connector resistance at the battery posts and wiring and you are looking at roughly .02 Ohms. Sounds low, huh? It is; this is one of the lowest internal resistance 220 AH FLA batteries on the market. The reality is that resistance is crippling. Ohm's Law states Voltage = Current x Resistance. So let's hook the battery up to a 2000 watt inverter and see what fun and games we can come up with.
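
    The same estimate in Python; the .004 Ohm connection allowance is an assumed figure chosen to land on the rough .02 Ohm total above:

    ca_amps = 888      # cranking amps from the example battery's spec
    test_volts = 7.2   # the voltage figure used in the estimate above
    r_battery = test_volts / ca_amps   # ~.008 Ohms per 6 volt battery
    r_series = 2 * r_battery           # two in series for 12 volts: ~.016 Ohms
    r_connections = 0.004              # assumed posts + interconnect wiring
    print(f"total source resistance: {r_series + r_connections:.3f} Ohms")  # ~0.020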

    We start by monitoring the voltage on the battery posts with no current flowing and note 12.6 volts, or 100% charged up and ready for action. So we connect a 120 watt light bulb that draws 10 amps and watch the battery voltage. It drops to 12.4 volts. What happened? Well, 10 amps x .02 Ohms = .2 volts lost, or about 1.6% of your power and voltage. Now we crank the load current up to 20 amps, a 240 watt load. What happens to the battery voltage? It drops from 12.6 volts down to 12.2 volts. You are now exceeding the maximum recommended voltage loss of 3%, at 3.2%. And we are not even counting the voltage loss on the wiring from the battery to the inverter. See where this story is going?

    Now let's put in a 1200 watt light and draw 100 amps from the battery. What happens? Well, you see a bright 1200 watt light flash for a second, and then it goes dark. As the current built its way up to 100 amps, your battery voltage dropped from 12.6 volts down to about 10.5 volts, and your inverter shuts off on under-voltage at 10.5 to 11 volts. Again, that is not even counting wire losses between battery and inverter, which at 100 amps are becoming significant.
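
    You can run the whole voltage-sag story above as one loop. This is just Ohm's Law with the ~.02 Ohm source resistance estimated earlier; wire losses are ignored here, same as above:

    R_SOURCE = 0.02   # Ohms: battery string plus connections, estimated above
    V_REST = 12.6     # resting voltage, fully charged
    for amps in (10, 20, 100):
        sag = amps * R_SOURCE
        v_load = V_REST - sag
        note = "  <- below the 10.5-11 V inverter cutoff" if v_load <= 11 else ""
        print(f"{amps:>3} A: {v_load:.2f} V at the posts "
              f"({sag / V_REST * 100:.1f}% loss){note}")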

    So what have we learned here so far? Well, the first thing is that a properly designed system limits battery discharge current to about C/10. On a 220 AH battery that is 22 amps, and 12 volts x 22 amps = 264 watts. What does that tell you? It tells me the largest inverter I want to run on a 12 volt 220 AH battery is 250 watts.
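
    That C/10 sizing rule is a one-liner if you want to check other bank sizes:

    def max_inverter_watts(battery_ah, battery_volts, c_divisor=10):
        """Largest continuous inverter load the C/10 rule allows."""
        return battery_volts * battery_ah / c_divisor

    print(max_inverter_watts(220, 12))  # 264 -> round down to a ~250 watt inverter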

    What about the battery charge current? Well, again, for a FLA battery we want to limit charge current to about C/8 maximum and C/12 minimum, with C/10 being perfect. What is the amperage of C/12, C/10, and C/8 on a 220 AH battery? Your answer had better be 18.3, 22, and 27.5 amps. So let's just go with C/10, the perfect 22 amps. How much panel wattage does it take to generate 22 amps at 13 volts? Well, if you use an MPPT controller the answer is roughly 22 amps x 12 volts = 264 watts. If using PWM and 12 volt battery panels, it takes 18 volts x 22 amps = 400 watts.
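
    And the charge-side numbers in the same style; the 18 volt figure assumes a nominal "12 volt" panel's maximum power voltage, as used above:

    BATTERY_AH = 220
    for divisor in (12, 10, 8):  # C/12 minimum ... C/8 maximum
        print(f"C/{divisor}: {BATTERY_AH / divisor:.1f} A")  # 18.3, 22.0, 27.5

    charge_amps = BATTERY_AH / 10                    # the "perfect" C/10 rate, 22 A
    print(f"MPPT panel: {charge_amps * 12:.0f} W")   # current x nominal battery volts
    print(f"PWM panel:  {charge_amps * 18:.0f} W")   # current x panel Vmp, ~400 W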

    So what can we conclude here, and how can we tell when someone has no idea what they are doing? Real simple: inverter wattage larger than panel wattage raises red flags. We know inverter wattage should be no greater than panel wattage, so when you say you have a 200 watt panel and a 2000 watt inverter, we laugh and have to tell you some real bad news.