So ... The Lady and I are planning to go off-grid in three or so years. In advance, we've purchased some gear to take the living room off-grid this winter, to get a real sense of how much power we consume and how much "real world" juice our system can actually generate.
We currently have a pair of 240 W panels, in series, feeding into a Midnite Classic 150 MPPT controller. That's feeding a pair of 6 V, 370 Ah Trojan batteries wired in series.
The second panel and the controller are only a few days old: previously we had a single panel feeding through a cheap-o controller into the batteries. That setup worked for about a month, maybe, with light drain.
We have a generic 12 V modified sine wave inverter powering the living room computer and a light. The whole lot, at worst, draws 140 watts.
The last few days, the controller has pushed 700-900 watt-hours into the system daily. However, the battery voltage seems to plummet sharply under even minimal use.
For example, I left the house at 6:30 p.m. with the voltage at 12.9. The Lady was watching "Bachelor" online for three hours. Max draw, call it 140 watts, so roughly 420 watt-hours. Voltage was down to 12.5. Now, after about four hours of continual use, the voltage is down to 12.1, which by my count is quite discharged.
My math says the wife has really drawn, at max, 600 watt-hours out of the system - probably closer to 500 watt-hours, or, say, 40 amp-hours. That should represent only about 12-15 per cent of the 370 Ah bank's capacity, but the voltage suggests something closer to a 75 per cent discharge, if my reading of the voltage charts is accurate.
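Here's my back-of-envelope arithmetic sketched in Python. The load, hours, and capacity figures are the ones above; the voltage-to-state-of-charge table is a generic flooded lead-acid *resting*-voltage chart I'm assuming (not Trojan's official one), and a voltage read under load or right after charging will mislead:

```python
# Back-of-envelope check of the discharge math. Figures are from the
# post; the SoC table is a typical flooded lead-acid resting-voltage
# chart (an assumption - check the battery maker's own chart).

LOAD_W = 140        # worst-case inverter load (computer + light)
HOURS = 4           # continual use
AVG_V = 12.5        # rough average battery voltage during the draw
CAPACITY_AH = 370   # two 6 V 370 Ah Trojans in series -> 12 V, 370 Ah

energy_wh = LOAD_W * HOURS              # worst-case energy drawn
drawn_ah = energy_wh / AVG_V            # amp-hours out of the bank
dod_from_load = drawn_ah / CAPACITY_AH  # implied depth of discharge

print(f"{energy_wh:.0f} Wh -> {drawn_ah:.1f} Ah -> "
      f"{dod_from_load:.0%} of the 370 Ah bank")

# Typical resting-voltage -> state-of-charge points for flooded
# lead-acid (approximate values, highest voltage first):
SOC_TABLE = [(12.73, 1.00), (12.62, 0.90), (12.50, 0.80),
             (12.37, 0.70), (12.24, 0.60), (12.10, 0.50)]

def soc_from_resting_voltage(v):
    """Linear interpolation between table points; clamps at the ends."""
    if v >= SOC_TABLE[0][0]:
        return SOC_TABLE[0][1]
    if v <= SOC_TABLE[-1][0]:
        return SOC_TABLE[-1][1]
    for (v_hi, soc_hi), (v_lo, soc_lo) in zip(SOC_TABLE, SOC_TABLE[1:]):
        if v_lo <= v <= v_hi:
            frac = (v - v_lo) / (v_hi - v_lo)
            return soc_lo + frac * (soc_hi - soc_lo)

print(f"12.1 V at rest ~ {soc_from_resting_voltage(12.1):.0%} state of charge")
```

If that generic chart is anywhere near right, 12.1 V at rest is closer to a 50 per cent state of charge than 25 per cent - still way more sag than a ~12 per cent draw should cause.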
What's worrying me is that the daily input far exceeds the daily draw, and yet the batteries seem poorly charged. I've set all the voltage setpoints on the charge controller correctly, and around noon it's dumping as much as 380 watts - usually 20-25 amps - into the batteries.
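A quick sanity check on the charge side, again using the figures above. The absorb-stage voltage (~14.5 V) is an assumption on my part, just to see whether the wattage and amperage readings agree with each other:

```python
# Does the reported daily input cover the draw, and do the controller's
# watt and amp readings line up? Figures are from the post; the absorb
# voltage is assumed.

DAILY_IN_WH = (700, 900)   # controller's reported daily harvest
DAILY_OUT_WH = 600         # worst-case daily draw
PEAK_W = 380               # peak charging power around noon
ABSORB_V = 14.5            # assumed absorb-stage battery voltage

surplus = tuple(wh - DAILY_OUT_WH for wh in DAILY_IN_WH)
peak_amps = PEAK_W / ABSORB_V  # compare against the 20-25 A reading

print(f"Daily surplus: {surplus[0]}-{surplus[1]} Wh")
print(f"{PEAK_W} W at {ABSORB_V} V -> {peak_amps:.0f} A")
```

So the 380 W and 20-25 A readings are roughly consistent with each other, and on paper there's a 100-300 Wh daily surplus going into the bank.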
Thoughts?