I've got four 12V, 5.5 amp solar panels wired as two series pairs connected in parallel, which should give me a combined output of 24V at 11 amps. I want to know if my math checks out. I'm planning a 100-foot run to my controller, and with that setup and 10 gauge wire I calculate a voltage drop of 2.73 volts; if I shorten the run to 75 feet, the drop is 2.05 volts. Am I on the right track, or am I missing something? I also noticed that with the 75-foot run the drop works out to about 8.5%, while the generally recommended limit is around 2%. Since my controller takes the 24V input down to charge a 12V battery bank and isn't powering anything the way an inverter does, does the voltage drop really matter for charging? Lastly, I can't find anything thicker than 10 gauge for these cables - can I safely charge at either the 100-foot or 75-foot distance? Thanks!
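In case it helps, here's the rough check I'm doing, as a quick sketch. I'm assuming about 1.24 ohms per 1000 feet for 10 gauge stranded copper (a typical 75°C table value - treat that figure as an assumption) and counting the wire length out and back:

```python
# Rough voltage-drop check for the panel-to-controller run.
# Assumes ~1.24 ohms per 1000 ft for 10 AWG stranded copper (a typical
# 75 C table value) - the actual cable may differ slightly.

OHMS_PER_1000_FT = 1.24   # assumed resistance of 10 AWG stranded copper
SYSTEM_VOLTS = 24.0       # nominal array voltage (two 12 V panels in series)
CURRENT_AMPS = 11.0       # two parallel strings at 5.5 A each

def voltage_drop(one_way_feet, amps):
    """Drop across both conductors: the current flows out and back."""
    round_trip_feet = 2 * one_way_feet
    resistance = OHMS_PER_1000_FT * round_trip_feet / 1000.0
    return amps * resistance

for run in (100, 75):
    drop = voltage_drop(run, CURRENT_AMPS)
    print(f"{run} ft run: {drop:.2f} V drop "
          f"({drop / SYSTEM_VOLTS * 100:.1f}% of {SYSTEM_VOLTS:.0f} V nominal)")
```

That reproduces the 2.73 V and 2.05 V figures above (about 11.4% and 8.5% of the 24 V nominal).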
2 Answers
Just keep in mind that the voltage drop you calculated only applies at full rated output. The drop scales with current, so for the rest of the day, when the panels are producing less than their full 11 amps, the drop will be proportionally smaller.
You might want to double-check your math, because the '12V' rating on those panels is often just a nominal marketing figure. They likely have an open-circuit voltage (Voc) around 20V and a working voltage around 16V, so your two-in-series strings will actually be running closer to 32V than 24V. Running at that higher voltage is better for charging, because the same wire drop becomes a smaller fraction of the working voltage.
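To put rough numbers on that (just a sketch - the ~16 V working voltage per panel is my assumption until you check the datasheet), the same current causes the same absolute drop, but it's a smaller fraction of a ~32 V string:

```python
# Sketch: the same wire drop expressed against a more realistic string voltage.
# The 16 V per-panel working voltage (Vmp) is an assumed figure; check the datasheet.

VMP_PER_PANEL = 16.0               # assumed working voltage of one "12 V" panel
STRING_VOLTS = 2 * VMP_PER_PANEL   # two panels in series per string

for run_ft, drop_v in ((100, 2.73), (75, 2.05)):   # drops from the question
    pct = drop_v / STRING_VOLTS * 100
    print(f"{run_ft} ft: {drop_v} V drop is about {pct:.1f}% of a {STRING_VOLTS:.0f} V string")
```

So the percentage loss is noticeably less alarming than when it's figured against the 24 V nominal rating.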
I see what you mean, but my controller is specified for both 12V and 24V. So now I’m not sure what to trust!

If it's a marketing term, why do they label it on the panels? That seems misleading!