I see a brewing debate here about wire sizing.
There's some physics involved. Copper isn't a perfect conductor - it has some resistance - and the higher the gauge (i.e., the thinner the wire), the more resistance it has, and so the more voltage you'll lose for a given amount of current.
You can think of the copper wire like a big heating element - no matter what, some of the power you are putting into it is going to get dissipated in the form of heat. Turn on the power and the wire heats up a little, and you can actually measure a small voltage drop between one end of the wire and the other.
For copper cable, the three major variables that matter for calculating voltage drop are the length of the cable, the diameter/gauge of the cable, and the number of amps you intend to run through it. There is more to it than this - ambient temperature, whether you are running in conduit, solid vs. stranded, AC vs. DC, etc. - but these first three factors are the most important for this discussion.
There are many calculators available on the internet that will allow you to precisely calculate the amount of voltage you will lose for a given choice of length / gauge / amps.
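If you're curious what those calculators are doing under the hood, it's mostly just Ohm's law. Here's a minimal Python sketch - not any particular calculator's actual code - where the resistance table uses typical figures for stranded copper at 75C conductor temperature (roughly the NEC Chapter 9, Table 8 values many calculators assume), and the factor of two is there because current has to flow out and back:

```python
# Approximate DC resistance of stranded copper at 75 C, in ohms per 1000 feet
# (roughly the NEC Chapter 9, Table 8 figures many online calculators assume).
OHMS_PER_1000_FT = {
    14: 3.14,
    12: 1.98,
    10: 1.24,
    8: 0.778,
    6: 0.491,
    4: 0.308,
    2: 0.194,
}

def voltage_drop(awg, one_way_feet, amps):
    """Volts lost across the cable, per Ohm's law (V = I * R).
    Current flows out and back, so the conductor is twice the run length."""
    resistance = OHMS_PER_1000_FT[awg] * (2 * one_way_feet) / 1000.0
    return amps * resistance

def percent_loss(awg, one_way_feet, amps, system_volts):
    """Voltage drop as a percentage of the system voltage."""
    return 100.0 * voltage_drop(awg, one_way_feet, amps) / system_volts
```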
So here is where the confusion probably comes in. When wiring up normal 120V electrical systems, the typical standard is to size your wiring for 5% or less voltage drop.
The normal rules of thumb, which just about everyone who has done a lot of residential electrical work is used to following, are: 14 gauge is adequate for 15 amps, 12 gauge for 20 amps, 10 gauge for 30 amps, 8 gauge for 40 amps, 6 gauge for 55 amps, etc. Any of the calculators mentioned above will tell you what size conductors you need to deploy to stay within the 5% voltage drop limit.
So why are the guys here talking about what seems, to folks used to working with 120V systems, like crazy thick and expensive cables? Because voltage drop does not vary with input voltage - the same amps through the same cable lose the same number of volts whether you started at 120 volts or 12.
Here's an example -
For a 25 foot long, 50 amp cable at 120 volts (6,000 watts) - plugging this into a calculator, you find that with 6 gauge cable you should expect a voltage drop of about 1.2 volts - a 1% transmission loss. Your cable will be dissipating about 60 watts of heat over its length - think about the heat of a 60 watt lightbulb spread out over 25 feet - no big deal.
But for the very same setup running 50 amps at 12 volts (600 watts) - the voltage drop is still 1.2 volts! - which is now a transmission loss of 10.2%. So you are wasting a lot of your precious solar power. Note that this is not dangerous - you are still dissipating the exact same 60 watts over 25 feet - it's just a much higher percentage of the power you started with.
If you step up to 2 gauge cable - which is about as thick around as your index finger - you will get down to a 4% transmission loss - 24 watts. And so on. But is this worth doing? It depends on the price of thicker copper wire vs. the price of more solar panels.
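Running this example through the voltage_drop / percent_loss sketch from earlier reproduces those numbers, give or take rounding, and makes the "same watts, different percentage" point explicit:

```python
# Reusing the hypothetical voltage_drop / percent_loss helpers sketched above.
for awg in (6, 2):
    drop = voltage_drop(awg, 25, 50)    # 25 foot run at 50 amps
    watts = drop * 50                   # power burned off as heat in the cable
    for volts in (120, 12):
        pct = percent_loss(awg, 25, 50, volts)
        print(f"{awg} AWG at {volts} V: {pct:.1f}% loss, {watts:.0f} W of heat")

# Prints, give or take rounding:
#   6 AWG at 120 V: 1.0% loss, 61 W of heat
#   6 AWG at 12 V: 10.2% loss, 61 W of heat
#   2 AWG at 120 V: 0.4% loss, 24 W of heat
#   2 AWG at 12 V: 4.0% loss, 24 W of heat
```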
I am sure that 10 years ago, when copper cost less than half of what it does today and solar panels cost 4X what they do today, it was a no-brainer to always deploy very heavy gauge cables.
Note that this same thinking also makes it obvious why larger solar systems don't run at 12 volts. You want to run at the highest voltage you can between the panels and your charge controller, so you see the smallest percentage transmission loss (and don't really need crazy thick cables) - and you want your charge controller very close to your batteries, so that when you are forced to step down to 12 volts, you lose the least amount of power possible.
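To put numbers on that, here is the same sketch moving the same 600 watts over the same 25 feet of 6 gauge cable at a few illustrative string voltages (the 24 and 48 volt figures are just examples, not a recommendation):

```python
# Same 25 foot run of 6 AWG, same 600 watts, at different system voltages.
for volts in (12, 24, 48):
    amps = 600 / volts    # higher voltage means less current for the same power
    pct = percent_loss(6, 25, amps, volts)
    print(f"600 W at {volts} V ({amps:.1f} A): {pct:.2f}% lost in the cable")

# Prints roughly 10.2% at 12 V, 2.6% at 24 V, and 0.6% at 48 V -
# the loss falls with the square of the voltage, since doubling the
# voltage halves the current and also halves the volts dropped.
```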
I hope this helps.