One of the things I did recently with my overpriced science fair project (Outback hybrid grid-tie with 24 panels and a 410 Ah AGM battery) was to make a current-measuring shunt out of the 5-foot 4/0 negative battery cable. I have a very sensitive USB-connected voltage measuring device (Yocto-milliVolt-RX, in case anyone feels like looking it up) that I connected to wire leads at each end of the cable. I put a 27 Ohm resistor between the wire leads where they reach the voltage sensor (low impedance to minimize inductive coupling effects), followed by an RC lowpass consisting of two 22K resistors and a 1 uF film capacitor across the sensor inputs.
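For anyone curious what that filter does to the bandwidth, here is a quick sketch of the corner frequency. This assumes my reading of the topology is right: one 22K resistor in each lead, so differentially they sit in series with the 1 uF cap across the sensor inputs.

```python
import math

# Assumed topology: one 22 kOhm resistor per lead (series differentially),
# 1 uF film capacitor across the sensor inputs.
R_eff = 2 * 22_000        # effective series resistance, ohms
C = 1e-6                  # farads

# -3 dB corner of a first-order RC lowpass
f_cutoff = 1 / (2 * math.pi * R_eff * C)
print(f"{f_cutoff:.2f} Hz")   # a few Hz, plenty for slow DC current readings
```

So the filter settles in well under a second while killing any mains-frequency pickup.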
With some conversion based on the standard 4/0 copper wire resistance at room temperature and the actual measured temperature near the battery, the setup provides very stable current readings that match what my FlexNet DC current meter shows, with one significant exception: the resistance of my battery cable appears to be about 14% higher than what all the tables say 4/0 annealed copper wire should measure at that length and temperature! There's no way the wire was so much hotter than its surroundings, with modest current flow (0-30 A), as to explain that discrepancy.
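For reference, the conversion I'm doing looks roughly like this. The table values are assumptions pulled from standard wire charts (4/0 AWG annealed copper is about 0.04901 Ohm per 1000 ft at 20 C, and copper's temperature coefficient is about 0.00393 per degree C), not anything I measured:

```python
# Assumed table values for 4/0 AWG annealed copper (standard wire charts):
R_PER_1000FT_20C = 0.04901   # ohms per 1000 ft at 20 C
ALPHA_CU = 0.00393           # copper temperature coefficient, 1/degC
LENGTH_FT = 5.0              # my cable length

def cable_resistance(temp_c):
    """Expected resistance of the 5 ft 4/0 cable at temp_c, in ohms."""
    r20 = R_PER_1000FT_20C * LENGTH_FT / 1000.0
    return r20 * (1 + ALPHA_CU * (temp_c - 20.0))

def current_from_mv(v_mv, temp_c):
    """Convert the measured millivolt drop across the cable to amps."""
    return (v_mv / 1000.0) / cable_resistance(temp_c)

# Example: at 25 C the cable should be ~0.25 milliohm, so a ~30 A load
# drops roughly 7.4 mV across it.
print(current_from_mv(7.4, 25.0))
```

It's against readings from that formula that my cable comes out about 14% high.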
My wire leads are attached right at the lugs on the ends of the cable, so not even contact resistance to the battery terminals explains it.
Could there really be that much variation in resistance of a not-cheap cable that was factory-made, crimps and all, for connection to high-current batteries? And if so, from what? Isn't the copper used in these cables all the same? Is it possible that the factory-crimped connections at each end are adding a consistent 14% to the total resistance?
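To put a number on the crimp theory, here is the back-of-envelope arithmetic (again assuming the standard table value of ~0.04901 Ohm per 1000 ft for 4/0 copper):

```python
# How much extra resistance does "14% high" imply, and is that plausible
# for two factory crimps? Table value below is an assumption, not measured.
r_table = 0.04901 * 5.0 / 1000.0   # ~0.245 milliohm expected for 5 ft
r_extra = 0.14 * r_table           # total unexplained resistance
per_crimp = r_extra / 2.0          # if split evenly across the two lugs

print(f"extra: {r_extra * 1e6:.1f} uOhm, per crimp: {per_crimp * 1e6:.1f} uOhm")
```

So each crimp would only need to contribute a couple dozen microohms to account for the whole discrepancy, which doesn't seem obviously crazy to me; I'd be interested whether that matches anyone's experience with crimped lugs.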