DC power loss over a distance of a house?

In summary: the solar panel is not getting enough sunlight to charge the built-in battery, so the LED light has stopped working. The battery sits right next to the LED light, so one option is to move the solar panel to the south-facing backyard and run a longer cable back to the battery, but the concern is that the longer wire may make charging slower than it would be with a short one.
  • #1
ChuckyMho
Project: a solar-panel-powered LED light (with a built-in storage battery) for a house facing north (in Canada).

Story: The kit was bought at Costco.
The solar panel is small (but it comes with a 20 ft cable) and is separate from the battery and light.
The light (motion- and light-activated) is at the front door facing north, meant to light up the front path.
Right now the panel is not getting much sunlight to charge the battery, since it is installed on the side of the house and only catches a glimpse of southern sun between the houses, with the power cord fully stretched to 20 ft.
The light simply stopped working once the battery drained its factory charge.

Problem: In the beginning I thought the solar panel would work with ambient light or whatever light is available, but obviously it doesn't. So now I am thinking of moving the solar panel to the backyard (south facing) so it gets direct sun to charge the battery. I am afraid that just getting a DC extension cable is not the solution, since the distance is over 60 ft. Am I right? To make this idea work, what gauge of cable do I need? Do I need to step the voltage up at the source and back down at the light using transformers, or convert it to AC? I am no electrical engineer and my knowledge of electronics is at the high school level. Please give me ideas and solutions for this project if you can, much appreciated!
 
  • #2
Welcome to PF.

You need to figure out how much voltage and current the light needs and the size (gauge) of the wire. Then use the following chart.

As a guess, your light will probably work OK with the longer wire, because an LED flood light will need only 1 amp or so. If you buy additional wire, get 14 or 12 gauge.

[Chart: Marine-Wire-1.jpg — voltage drop vs. current and cable length, color-coded by wire size]

The colors in the graph above correspond to the following wire sizes.
[Table: Marine-Wire-3.jpg — color key for the wire sizes in the chart]
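
If you would rather compute the drop than read it off the chart, here is a minimal Python sketch. The per-1000-ft resistances are standard values for solid copper wire, and the 12 V system voltage plus the 10 A / 40 ft example come from the chart discussion below, not from anything in the kit.

```python
# Rough DC voltage-drop estimator (a sketch, not from the kit's documentation).
# Resistance of solid copper wire, ohms per 1000 ft, standard AWG values.
OHMS_PER_1000FT = {10: 1.00, 12: 1.59, 14: 2.53, 16: 4.02, 18: 6.39, 20: 10.2, 22: 16.1}

def voltage_drop(current_a, cable_ft, awg, supply_v=12.0):
    """Return the drop in volts and as a fraction of supply_v.

    cable_ft is the total length of wire carrying the current
    (power and ground combined, i.e. out and back).
    """
    r = OHMS_PER_1000FT[awg] * cable_ft / 1000.0   # total cable resistance, ohms
    drop = current_a * r                           # Ohm's law: E = I * R
    return drop, drop / supply_v

# The chart example discussed below: 10 A through 40 ft of #14 wire on a 12 V system.
drop, frac = voltage_drop(10, 40, 14)
print(f"{drop:.2f} V drop, {frac:.0%} of 12 V")    # about 1 V, i.e. roughly 8-10% of 12 V
```

The same function can be reused later in the thread for the 120 ft charging run.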
 
  • #3
So that chart says that you would get a 10% voltage drop if you ran 10 amps through 40 ft of 14 ga wire (power and ground combined, at the bottom of the lower red band).
 
  • #4
meBigGuy said:
So that chart says that you would get a 10% voltage drop if you ran 10 amps through 40 ft of 14 ga wire (power and ground combined, at the bottom of the lower red band).

Yes, that's correct, 10% of 12 volts, or a 1.2 volt drop. If the voltage is higher than 12 v, then 1.2 v is less than 10%.

But I'm guessing that the light needs 1 amp or less, especially if the light is LED. There should be a nameplate on the light someplace to tell you.
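
As a quick numeric check of that guess (the 1 A load is hypothetical, and the 40 ft of #14 wire is just the chart example reused):

```python
# Hypothetical check: a 1 A LED load instead of 10 A, same 40 ft of #14 wire.
r = 2.53 * 40 / 1000            # ~0.10 ohm of cable resistance (#14 copper)
drop = 1.0 * r                  # E = I * R  ->  ~0.10 V
print(f"{drop:.2f} V drop, {drop / 12 * 100:.1f}% of 12 V")   # well under 1%
```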
 
  • #5
It is better to keep the battery and light close together using a short length of heavier gauge wire, and you can hang the solar panel at the end of the long extension. The current to charge the battery is a tiny trickle compared with the current to power the LEDs.
 
  • #6
Thanks for your replies anorlunda, meBigGuy and NascentOxygen.
Since the battery is close to the LED light, I don't think powering the LED is an issue, but charging it might be.
I have noticed that with an iPhone, for example, if the cable is long it takes longer for the charging to complete, whereas with a dinky 6-inch cable the charging is much faster. So I assume that the output of a small solar panel is very low in terms of amperage and voltage, and if it has to travel 120 feet (a 60-foot run, out and back), will it take extra long to charge the battery? Am I correct? Or does this not apply to the solar LED case? Please explain if my assumptions are wrong.
 
  • #7
It's a case of try it and see. With luck, you'll notice no difference. After all, the light operates for only a few minutes per night, doesn't it ... just when someone walks past? So it doesn't need many hours of sunlight to replenish the charge that was lost.

You don't need to string the wire out to determine whether your plan will succeed. Without unrolling the reel of cable, just connect it in line between the panel and the battery, and over a week or so see whether everything still works with the solar panel where you want to locate it.
 
  • #8
That's a very good suggestion, NascentOxygen! Many thanks! I will try that.
But just out of curiosity, is there a more mathematical side to this? I would like to do some calculations to improve the chance of success and buy only the minimum materials I need. I am trying to approach this more scientifically, reduce the trial-and-error time, and hopefully learn something from this project.
 
  • #9
It is all ruled by Ohm's law, E = IR.

The resistance of the cable multiplied by the current is the voltage drop.

I don't think comparing solar-cell charging to an iPhone is a reasonable comparison. There are a number of reasons an iPhone might charge more slowly, especially with extension cables; it is notoriously picky about cables and cable resistance in more ways than one.

The solar device will charge the battery with a very low current, so the length of the cable to the battery will likely have no effect. That suggestion was a very good idea. Kind of pisses me off I didn't think of it first :smile:
 
  • #10
meBigGuy said to use E = IR, which is correct. E is voltage, I is current, R is resistance.

This link takes you to a wire resistance calculator: http://www.cirris.com/learning-center/calculators/133-wire-resistance-calculator-table
For example, 120 feet of #14 wire (14 AWG) has a resistance R = 0.303 ohms.

You still have to figure out from the nameplate how much E and I the light needs. EI is the power.

My guess: solar panels that size make about 6 V and 2.5 watts, which means E = 6 and I = 2.5/6 ≈ 0.4 amps. IR = 0.4 × 0.303 ≈ 0.12 V, and 0.12/6 is a 2% voltage drop. That is pretty negligible.

For a 10% voltage drop (0.6 V at 0.4 A), the cable resistance budget is about 1.5 ohms. 120 feet of #22 wire has a resistance of about 2 ohms, which is slightly over that, so #20 wire (about 1.2 ohms) would be sufficient.

#14 wire costs about ten times as much as #22 wire.
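
To reproduce those figures in code (the 6 V / 2.5 W panel rating is still only a guess, and the wire resistances are standard copper values), a short sketch:

```python
# Reproduce the estimate above with the guessed panel rating (6 V, 2.5 W).
OHMS_PER_1000FT = {14: 2.53, 20: 10.2, 22: 16.1}    # copper wire, ohms per 1000 ft

panel_v = 6.0
panel_w = 2.5
i = panel_w / panel_v                               # ~0.42 A charging current

for awg in (14, 20, 22):
    r = OHMS_PER_1000FT[awg] * 120 / 1000           # 120 ft of wire (60 ft each way)
    drop = i * r                                    # E = I * R
    print(f"#{awg}: R = {r:.2f} ohm, drop = {drop:.2f} V ({drop / panel_v:.0%})")

# Expected output, roughly:
#   #14: R = 0.30 ohm, drop = 0.13 V (2%)   -- matches the ~2% figure above
#   #20: R = 1.22 ohm, drop = 0.51 V (8%)   -- still inside a 10% budget
#   #22: R = 1.93 ohm, drop = 0.80 V (13%)  -- a little over the 10% budget
```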
 

FAQ: DC power loss over a distance of a house?

What causes DC power loss over a distance of a house?

DC power loss over a distance of a house is primarily caused by the resistance of the electrical wiring and components in the circuit. This resistance causes a voltage drop along the wire, so less power reaches the far end of the circuit. Other factors that can contribute to DC power loss include the quality of the wiring, temperature changes, and the type of load connected to the circuit.

How does the distance between the power source and the load affect DC power loss?

The distance between the power source and the load is a major factor in DC power loss. As the distance increases, the resistance in the circuit also increases, resulting in a higher voltage drop. This means that the further the power has to travel, the more power will be lost along the way, leading to a decrease in the overall efficiency of the circuit.

What is the best way to reduce DC power loss over a distance of a house?

The most effective way to reduce DC power loss over a distance of a house is to use larger gauge wire for the circuit. This will decrease the resistance and voltage drop, allowing more power to reach the end of the circuit. Additionally, minimizing the distance between the power source and the load can also help reduce power loss. Regular maintenance and proper installation of electrical components can also help to prevent power loss.

Are there any safety concerns associated with DC power loss over a distance of a house?

DC power loss over a distance of a house can pose safety concerns if the voltage drop is significant. This can lead to overheating of the wiring and electrical components, which can cause fires. In addition, a decrease in the voltage reaching the load may result in the load not functioning properly, which can be a safety hazard depending on the type of load.

How can I calculate the amount of DC power loss over a distance of a house?

To calculate the amount of DC power loss over a distance of a house, you will need to know the resistance of the circuit, the distance between the power source and the load, and the amount of current flowing through the circuit. This can be done using Ohm's law, which states that voltage drop is equal to the current multiplied by the resistance. Alternatively, there are online calculators and software programs available that can help with these calculations.
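
As a minimal worked sketch of that calculation (the 2 A load, 50 ft one-way run, and #14 wire are made-up illustration values):

```python
# Illustrative DC power-loss calculation; every number below is a made-up example value.
ohms_per_1000ft = 2.53          # standard resistance of #14 copper wire
one_way_ft = 50                 # distance from source to load
current_a = 2.0                 # current drawn by the load

r = ohms_per_1000ft * (2 * one_way_ft) / 1000   # out-and-back cable resistance
v_drop = current_a * r                          # Ohm's law: V = I * R
p_loss = current_a ** 2 * r                     # power dissipated in the wire: P = I^2 * R

print(f"R = {r:.3f} ohm, voltage drop = {v_drop:.2f} V, power lost = {p_loss:.2f} W")
```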
