OK, just before the professionals shoot this argument down in a spectacular fireball, let me make a valid point: MONEY IS AN ISSUE HERE!
I was just wondering under what circumstances an MPPT charge controller actually improves energy efficiency?
I'll give you my scenario:
2x 300 watt panels connected in parallel. Nameplate specifications for each panel: open-circuit voltage 44.8 V, maximum-power voltage 36.6 V, maximum-power current 8.2 A, short-circuit current 8.69 A. That's at standard test conditions (25 °C / 77 °F). Despite this, I have seen up to 21 amps on my LCD under clear, cool conditions when the sun is far from perpendicular to my panels! According to the nameplate specs I should never see more than 16 to 17 amps, because my charge controller is not an MPPT...?
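Here's the back-of-envelope check I ran in Python (a sketch, assuming a PWM controller simply ties the array to the battery, so the panels run somewhere between Imp and Isc rather than at their power point):

```python
# Back-of-envelope check: what current should a PWM controller show?
# Assumption (mine, not from the nameplate): a PWM controller connects
# the array directly to the battery, so the array runs at battery
# voltage, on the flat part of the I-V curve between Imp and Isc.

N_PANELS = 2
IMP = 8.2    # A, maximum-power current per panel (STC)
ISC = 8.69   # A, short-circuit current per panel (STC)

# At STC irradiance (1000 W/m2) the parallel array can deliver at most:
i_max_stc = N_PANELS * ISC
print(f"Max STC array current: {i_max_stc:.1f} A")   # 17.4 A

# Seeing 21 A therefore implies irradiance above 1000 W/m2:
observed = 21.0
irradiance = observed / i_max_stc * 1000
print(f"Implied irradiance: {irradiance:.0f} W/m2")  # ~1208 W/m2
# Cool cells plus cloud-edge or reflected light can genuinely do this.
```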
Now, with 600 W of solar and an MPPT, I could reasonably expect to see a maximum of 25 amps when my battery voltage is at 24 V, and no more than 20 A at 30 V under boost charge, right? I've already seen 29.6 V / 20.9 A with my cheapie PWM controller; that's over 600 watts, and we haven't even taken losses into account yet!
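The same arithmetic for the MPPT case (a sketch; the ~95% converter efficiency is my assumption, not a measured figure):

```python
# MPPT expectation: battery current = array power x efficiency / V_batt.
P_ARRAY = 600.0   # W, nameplate array power
EFF = 0.95        # MPPT conversion efficiency (assumed typical)

for v_batt in (24.0, 30.0):
    i_batt = P_ARRAY * EFF / v_batt
    print(f"At {v_batt:.0f} V: {i_batt:.1f} A")
# 24 V -> 23.8 A ; 30 V -> 19.0 A

# For comparison, the PWM reading I already saw:
print(f"PWM observed: {29.6 * 20.9:.0f} W")  # ~619 W
```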
Now, here is what I need to know. I did my installation a month ago, which here in the southern hemisphere is just before the winter solstice: the weather is cool and the sun is weak. In summer I can expect much hotter panels, which according to the manufacturer will reduce their output voltage significantly and increase the current slightly. In that scenario, is an MPPT going to achieve much? The batteries need 31 volts for their monthly equalization charges, and with cooking-hot solar panels they may only just reach that voltage; then of course there will be losses along the 60 feet of wiring running from the roof to where the batteries sit. I've sketched that headroom calculation below.
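A rough summer-headroom estimate (every input here is my assumption, since the manufacturer's exact figures aren't in front of me: a -0.35 %/°C Vmp temperature coefficient, 65 °C cell temperature, and a 60 ft one-way run of 10 AWG copper at about 1.0 ohm per 1000 ft per conductor):

```python
# How much headroom is left for a 31 V equalize charge in summer?
VMP_STC = 36.6        # V at 25 C (nameplate)
TC_VMP = -0.0035      # per degree C (assumed typical crystalline Si)
T_CELL = 65.0         # C, hot summer cell temperature (assumed)
I_CHARGE = 16.4       # A, 2 panels x 8.2 A Imp
R_WIRE = 2 * 60 / 1000 * 1.0   # ohms, round trip, 10 AWG (assumed)

vmp_hot = VMP_STC * (1 + TC_VMP * (T_CELL - 25))
v_drop = I_CHARGE * R_WIRE
v_at_battery = vmp_hot - v_drop
print(f"Hot-panel Vmp:        {vmp_hot:.1f} V")      # ~31.5 V
print(f"Wiring drop:          {v_drop:.1f} V")       # ~2.0 V
print(f"Available at battery: {v_at_battery:.1f} V") # ~29.5 V
```

On those assumed numbers the hot-weather figure lands right around (or even under) the 31 V equalize target, which is exactly the scenario I'm worried about.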
I paid $150 for the PWM controller, but an MPPT will set me back about $500. Is it worth my while?