I've noticed that a lot of grid-tie packages (kits) include an array that outputs more power than the maximum output of the inverter.
For example, a 3400 watt array packaged with a Fronius 3000IG inverter. You can even go to the Fronius website and use their online configurator tool to size your array strings, and it actually recommends as "optimum" an array that is larger than the inverter.
I called Fronius and they said that if the array is in full (peak) sun and capable of putting out 3400 watts, the inverter will only deliver its maximum rated output to the grid (or your appliances); in this case the extra 400 watts goes out the window.
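If I understand them right, the behavior is just a hard cap. Here's a rough sketch I wrote to check my understanding (the 95% conversion efficiency is my guess, not a Fronius spec):

```python
# Rough model of inverter clipping: anything the array produces above
# the inverter's AC rating is simply not converted ("goes out the window").
INVERTER_MAX_W = 3000   # inverter nameplate AC output
ARRAY_STC_W = 3400      # array rating at standard test conditions

def ac_output(dc_power_w: float, efficiency: float = 0.95) -> float:
    """AC power delivered to the grid for a given DC input, in watts.

    efficiency is a guessed conversion figure, not a manufacturer spec.
    """
    return min(dc_power_w * efficiency, INVERTER_MAX_W)

# At full STC output the inverter clips:
print(ac_output(ARRAY_STC_W))   # 3000.0 -> ~230 W of potential AC is lost
# At a more typical real-world peak it doesn't:
print(ac_output(2700))          # 2565.0 -> no clipping
```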
I understand that inverters operate most efficiently under higher loads, but when I look at the efficiency curves, they all seem pretty good at 50% load or greater. So I'm confused as to why you would size an array larger than the inverter's output; one would hope the array spends significant time in full sun.
Is this all because of the discrepancy between standard test conditions and real life?
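Running the usual back-of-envelope derates seems to support that idea (the factors below are rules of thumb I've seen quoted, not measured values):

```python
# Typical real-world derates from the STC rating (rule-of-thumb values):
# cell temperature, soiling, wiring and mismatch losses, etc.
ARRAY_STC_W = 3400

derates = {
    "temperature":     0.88,  # panels run hot, losing ~12% vs the 25 C STC cell temp
    "soiling":         0.97,
    "mismatch_wiring": 0.97,
}

real_peak = ARRAY_STC_W
for factor in derates.values():
    real_peak *= factor

print(round(real_peak))  # ~2815 W DC -- already under the 3000 W AC rating
```

If those numbers are anywhere near right, the array almost never actually delivers its STC rating, so the inverter would rarely clip at all.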
thanks