AC voltage drop across power lines

In summary: The proximity effect occurs when two cables that are close to each other interact through their magnetic fields, which reduces the effective conductivity of the cables and can lower the amount of power that can be transmitted through them.
  • #1
chopficaro
So the reason why alternating current is used to deliver energy to households is that, for a given wire, the voltage drop is always less with AC rather than DC, right?
But when I calculate the voltage drop it comes out to be the same!
V = IR
R = P(length of wire)/(cross-sectional area of wire)
I see no reason why an alternating current would drop less.
Is there an equation I am missing?
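To make the question concrete, here is a minimal numeric sketch of those two formulas, taking the "P" in the resistance formula to be the resistivity ρ; the wire dimensions and current are illustrative, not values from the thread:

```python
# Simple DC/low-frequency model: it ignores skin effect and line reactance,
# so AC (RMS) and DC give the same drop here -- which is the question's point.
RHO_CU = 1.68e-8   # resistivity of copper, ohm-metres
length = 1000.0    # wire length, metres
area = 1e-4        # cross-sectional area, m^2 (100 mm^2)
current = 100.0    # amperes (DC, or AC RMS)

resistance = RHO_CU * length / area   # R = rho * L / A
v_drop = current * resistance         # V = I * R

print(f"R = {resistance:.3f} ohm, voltage drop = {v_drop:.1f} V")
```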
 
  • #2
No. The reason for using AC is that you can conveniently transform it from the generator voltage to an appropriate voltage for transmission and then for use in the home.

Where did you get the idea that power loss is less? V=IR applies for all signals.
 
  • #3
Actually, DC will have lower I²R losses than AC given that the cable cross section is the same, due to the skin effect.

The reason is, as sophiecentaur pointed out, the ability to transform the voltage up and down. Today, however, power electronics make it possible to convert AC to DC and vice versa efficiently, which gives DC an advantage over AC in many situations, e.g. underwater cables.

T.A. Edison's proposal of using DC wasn't so dumb after all, although semiconductors would not see daylight for another 80 years.


In Europe we use 420kV and lower in the distribution grid, and 230V in households. The power delivered is the same, but the current is much lower (P = V·I), and hence the I²R losses are lower. Work out the difference in power losses and you will see the reason.
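The effect of delivering the same power at a higher voltage can be sketched like this (the delivered power and line resistance below are illustrative, not real grid figures):

```python
def line_loss(power_w, voltage_v, r_line_ohm):
    """I^2*R loss when delivering power_w at voltage_v through r_line_ohm."""
    current = power_w / voltage_v      # from P = V * I
    return current ** 2 * r_line_ohm

# 100 MW through a 10-ohm line, at transmission vs household voltage.
loss_hv = line_loss(100e6, 420e3, 10.0)
loss_lv = line_loss(100e6, 230.0, 10.0)
print(f"420 kV: {loss_hv / 1e6:.2f} MW lost")
print(f"230 V:  {loss_lv / 1e6:.2e} MW 'lost'")
```

Since the loss scales as 1/V², the 230 V case "loses" vastly more than the power being delivered, which is exactly why long-distance transmission is done at hundreds of kilovolts.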

Your resistance formula seems to be missing "rho" (ρ, resistivity, in ohm-metres).
 
  • #4
SirAskalot said:
Your resistance formula seems to be missing "rho" (ρ, resistivity, in ohm-metres).

what? u mean P?
 
  • #5
SirAskalot said:
Actually, DC will have lower I2R losses than AC given that the cable cross section is the same, due to skin effekt.

The skin depth at 60 Hz is about 8.5 mm. http://en.wikipedia.org/wiki/Skin_effect This makes the diameter of wire where the skin effect begins to be noticeable about 19 mm, or AWG #00000000.
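For reference, the standard skin-depth formula δ = √(2ρ/ωμ) reproduces both figures quoted in this thread (about 8.5 mm for copper at 60 Hz, and the 12 mm for aluminium at 50 Hz mentioned below); a quick sketch:

```python
import math

def skin_depth(rho_ohm_m, freq_hz, mu_r=1.0):
    """Skin depth delta = sqrt(2*rho / (omega * mu)), in metres."""
    mu = mu_r * 4e-7 * math.pi          # permeability (mu_r ~ 1 for Cu and Al)
    omega = 2.0 * math.pi * freq_hz
    return math.sqrt(2.0 * rho_ohm_m / (omega * mu))

d_cu = skin_depth(1.68e-8, 60)   # copper at 60 Hz
d_al = skin_depth(2.65e-8, 50)   # aluminium at 50 Hz
print(f"Cu @ 60 Hz: {d_cu * 1000:.1f} mm, Al @ 50 Hz: {d_al * 1000:.1f} mm")
```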
 
  • #6
Multi-stranded wire with a supporting steel core significantly reduces the skin effect at 60 Hz. The skin effect is not as much of an issue as with solid conductors, which are not used for this reason.

To help clarify what sophiecentaur was saying, AC is used because it can be transmitted at higher voltages than needed, and cheaply converted down to useful voltages with transformers. The same cannot be said of DC transmission.

The higher voltage transmission means that less current has to be carried by the conductor per unit power delivered. With less current, comes less power loss in the transmitting cables.

So, in a way, the statement is true: for the same conductor size, AC will transmit more power, but only because it can be transmitted at a higher voltage than the end use requires.
 
  • #7
what? u mean P?
Sorry, didn't see that. Although it's the lowercase Greek letter "ρ" (rho), which looks like a "p", whereas uppercase P denotes power.
 
  • #8
Thank you, you've given me more than I asked for, but now I am curious.
Take a look at the sketch of this wire:
http://www.mahanson.com/images/Hendrix%20Cable.jpg
Those copper wires wrapping around the outside must be for reducing the skin effect, right?

I have been reading up on the skin effect:
http://en.wikipedia.org/wiki/Skin_effect
I know the equation now, and it's also telling me I should look up something called the proximity effect, which may or may not have any effect in this situation.

My new question is: if I know the dimensions of this wire (the thickness, the degrees per foot at which the outer wires wrap around, etc.), can I calculate the skin effect for this oddly dimensioned cable?
Or is the only way to know how different wrapping techniques will affect the skin effect to try them in an experiment?
 
  • #9
To my knowledge the cable in the picture has no such function.

Aluminum Conductor Steel Reinforced (ACSR) is the name of the cable used in power transmission, and it looks like this: http://www.powercablemanufacturers.com/picture/aerial-cable/acsr-cable.jpg The one in the link is non-insulated and used in overhead power lines.

Not sure I got your last question, but:
For a round cable you only need to know the skin depth; the current travels only in the outer layer of the cable, within a distance (the skin depth) of the surface.
For aluminium at 50 Hz the skin depth is 12 mm, so in a cable larger than 12+12 = 24 mm in diameter no current travels in the center. Hence you can fill it with whatever you want.

So why use cables larger than 24 mm, you say? The area in which the current travels still increases with larger diameter, and the resistance gets lower.
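That point can be sketched with the usual approximation that current flows only in an outer annulus one skin depth thick (the 12 mm figure for aluminium at 50 Hz is from this post; the radii are illustrative):

```python
import math

def effective_area(radius_m, skin_depth_m):
    """Cross-section that actually carries current: an outer annulus of
    thickness skin_depth_m, or the whole circle for thin conductors."""
    if radius_m <= skin_depth_m:
        return math.pi * radius_m ** 2
    inner = radius_m - skin_depth_m
    return math.pi * (radius_m ** 2 - inner ** 2)

DELTA_AL = 0.012   # aluminium at 50 Hz, ~12 mm
for r in (0.010, 0.015, 0.020):
    a = effective_area(r, DELTA_AL)
    print(f"radius {r * 1000:.0f} mm: effective area = {a * 1e6:.0f} mm^2")
```

The effective area (and hence the conductance) keeps growing with radius even after the centre stops carrying current.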

The proximity effect comes into account when two cables lie near each other and their magnetic fields act on each other. This decreases the effective conductivity of the cables.
 
  • #10
ty

One more thing: the wires are wrapped around each other in the middle but not insulated from one another, so do I have to account for some inductance with regard to voltage drop over a distance?
 
  • #11
The cable can be treated as one solid wire; the reason the strands are small and twisted is the flexibility of the cable. No insulation is needed; the steel inside has only the function of mechanical strengthening.

A transmission line isn't purely resistive; as a whole system (3 phases) it has resistance, inductance, and capacitance, and must be treated as an impedance.

Inductance due to proximity effect and self inductance.
Capacitance to earth.

For simplicity, if you don't know complex calculations, just take the resistance into account when calculating voltage drop. But use the correct cross-section if using large cables (skin effect).

The impedance becomes important when calculating short-circuit currents.
 
  • #12
tyvm
 
  • #13
In practical terms, AC is a far better bet than DC for distribution but there is not only the impedance / power factor consideration. The need for synchronisation of multiple generators is always there. That's (at least one reason) why they use a DC link between UK and France and it's worth all the additional rectification and inverters.
Despite the very low frequency of mains AC, lines of a thousand miles or more in length can introduce some strange transmission line effects. For instance, a quarter wavelength long line which is open circuit at one end will look like a short circuit at the other - a potential embarrassment!
 
  • #14
How does setting up a substation along a long-distance transmission line reduce transmission losses?
 
  • #15
If there is no significant step-up in voltage (e.g. 400V to 11kV), followed, at the other end, by a step down, then there won't be. It is only when the current flowing in the long line is reduced significantly that I²R losses are reduced.

But a step-up will mean that the consumers down the line may get nearer to their required supply volts.
 
  • #16
OK, another query: why are transmission voltages in multiples of 11? If it's because of form factor, how does it depend on the form factor? Please give me a formula for calculating the required transmission voltage according to the distance of transmission and the load to be transmitted.
 
  • #17
I don't know where the 11 comes from. I suspect, though, that is as simple as giving yourself 10% headroom and starting from an arbitrary 100V. But the original 250V (or the subsequent 240V) in the UK is not a multiple of 11 and neither is 400kV.

Someone, somewhere may have sat down and done some sums but I bet the values were pretty well arbitrarily chosen.
As a matter of fact, with the comparatively long domestic distribution lines in the US, it's surprising that 110V was chosen, when you consider that lines in the UK are, on average, a lot shorter (most housing is much denser) and yet we chose 250V.
 
  • #18
I think the voltages are a bit arbitrary. 110V may have been adopted after a choice of 100V and then an allowance for voltage drop on inadequate supply cables. Strange that the UK chose 250V, originally, then went to 240 and then to 230, for unity with Europe. No multiples of 11 there - and neither on the 400kV standard.

The choice of ratios for low, intermediate and high voltage transmission would have been based on cost of towers, cable and a lot of other factors but in a pretty fuzzy way, I'm sure. We really need someone (like my Dad, for instance) who was around when some of these standards were brought in (not the first ones, of course - I'm not that old).

It always seemed strange to me how the 110V was chosen in the US, where spacing between dwellings is relatively high but 250V was chosen in the densely populated UK where cables would have been significantly shorter, on average, with lower resistive losses. Were the Americans so rich that they could afford all that extra copper?
 
  • #19
sophiecentaur said:
Despite the very low frequency of mains AC, lines of a thousand miles or more in length can introduce some strange transmission line effects. For instance, a quarter wavelength long line which is open circuit at one end will look like a short circuit at the other - a potential embarrassment!
I'm not sure what you mean here. Can you expand on that a bit? Preferably with a diagram of some kind.

You are referring to the wavelength of the AC signal with respect to c (speed of light), correct? If I remember c right, 60 Hz has a 5km wavelength, 50 Hz is 6km. So a quarter wavelength line would only have to be a couple miles long at most, not thousands.
 
  • #20
I think you need to redo that math, Jiggy. A 1/4 wave at 60 Hz is over a thousand km.
 
  • #21
Averagesupernova said:
I think you need to redo that math, Jiggy. A 1/4 wave at 60 Hz is over a thousand km.
Bah, my bad. 300,000 km/s, not 300,000 m/s. Stupid units.
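For the record, the corrected arithmetic looks like this (using the free-space value of c; on a real line the propagation velocity is somewhat lower):

```python
def quarter_wavelength_km(freq_hz, velocity_m_s=3.0e8):
    """Quarter of lambda = v / f, converted to kilometres."""
    return velocity_m_s / freq_hz / 4.0 / 1000.0

print(f"60 Hz: ~{quarter_wavelength_km(60):.0f} km")
print(f"50 Hz: ~{quarter_wavelength_km(50):.0f} km")
```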
 
  • #22
sophiecentaur said:
It always seemed strange to me how the 110V was chosen in the US, where spacing between dwellings is relatively high but 250V was chosen in the densely populated UK where cables would have been significantly shorter, on average, with lower resistive losses. Were the Americans so rich that they could afford all that extra copper?

I would love to know the origin of that, have never seen anything specific in print.
here in Australia and New Zealand we also use the 220-240V standard.
In my travels around the USA, I was very surprised at the "interesting" cabling in some homes at 110V. Made me cringe when one considers that the current flowing at 110V is twice that at 220V for an appliance of the same wattage.
Mind you, on trips to the Philippines (my wife's homeland), looking at the way 110V is strung around and connected into with basic barrier connectors really freaks you out, haha

Dave
 
  • #23
davenn said:
I would love to know the origin of that, have never seen anything specific in print.
here in Australia and New Zealand we also use the 220-240V standard.
In my travels around the USA, I was very surprised at the "interesting" cabling in some homes at 110V. Made me cringe when one considers that the current flowing at 110V is twice that at 220V for an appliance of the same wattage.
Mind you, on trips to the Philippines (my wife's homeland), looking at the way 110V is strung around and connected into with basic barrier connectors really freaks you out, haha

Dave

I made a post about the origins not long ago. I don't recall what thread though.
-
Found it, here it is:
https://www.physicsforums.com/showthread.php?t=461471

The reason 100 volts was picked (if you read my link) was that it was judged high enough to keep the current, and hence the losses, low enough to be practical, yet low enough to be considered relatively safe. I recall reading these things just a few days before I posted in that thread, but I can't recall where. Try wiki, I suppose.
 
  • #24
davenn said:
I would love to know the origin of that, have never seen anything specific in print.
here in Australia and New Zealand we also use the 220-240V standard.
In my travels around the USA, I was very surprised at the "interesting" cabling in some homes at 110V. Made me cringe when one considers that the current flowing at 110V is twice that at 220V for an appliance of the same wattage.
Mind you, on trips to the Philippines (my wife's homeland), looking at the way 110V is strung around and connected into with basic barrier connectors really freaks you out, haha

Dave
But the voltage is still half that. And it's voltage that causes current to flow. I can't figure out why people think that the higher amps are dangerous.

Looking at a naive, extreme example, which would you rather handle: 1kV@1mA or 1mV@1kA?* Each one is only 1W, but I think there's a big safety difference there.

If we were dealing with constant current sources and our bodies were being put in series with the load, then the higher current would be bad. 1mA would be hardly noticeable, but 1kA will cook your goose pretty tender (if it doesn't explosively boil away).

But, the vast majority of power sources are constant (somewhat) voltage sources, and most of the time (I think at least) a shock happens because our bodies provide a path to ground in parallel with the load. In the parallel case, it's the voltage that's applied across the body. 1mV is harmless, but 1kV is serious stuff.

By my thinking, 1mV@1kA would be much, much safer to handle than 1kV@1mA. That's the reason why the long distance transmission wires that carry the huge voltages (hundreds of kilovolts) are put so high up and so carefully insulated, even though they carry less current; it's the voltage value that's dangerous.

From this analysis, I conclude that 240V is more dangerous than 120V; though 120 is still plenty dangerous.

If my thinking is too naive, I'd love to be corrected.

Unless I misinterpreted you, and you were referring to the higher current creating more heat and increasing the risks of electrical fires and the like. In which case this entire post is pretty misguided.

* Yes, I realize that these numbers are highly unrealistic, but I've found that the quickest way for me to understand things is to talk about what happens at the extreme ends rather than muck around with a dozen middle-of-the-road examples.
 
  • #25
I would generally say that higher voltage lower current is more dangerous. But, think about the poor guy who pops the hood on his car and accidentally gets his metal watch band between the positive terminal of the battery and a metal part of the car. Watch band is red hot in a second.
 
  • #26
Averagesupernova said:
I would generally say that higher voltage lower current is more dangerous. But, think about the poor guy who pops the hood on his car and accidentally gets his metal watch band between the positive terminal of the battery and a metal part of the car. Watch band is red hot in a second.
The short from the watch is in parallel with the load, so it gets the full 12V across it. Since a metal watch would likely have very low resistance, that puts a huge amount of current through the watch. P = IV, and there's a lot of heat coming out of that watch.

If the voltage from the battery were very, very low, say 1mV, then even with the low resistance there wouldn't be anywhere near as much current.

That's the point that I'm getting at. Because shocks happen most from your body or other piece of material being in parallel with the source, the danger from an electric shock is determined by the voltage of the lines, not the current they carry.
 

FAQ: AC voltage drop across power lines

1. What causes voltage drop across power lines?

Voltage drop across power lines is caused by the resistance of the transmission lines, which results in the loss of electrical energy as it travels from the power source to the end user. This resistance is influenced by factors such as the material and diameter of the wire, the length of the transmission line, and the amount of current flowing through it.

2. How does voltage drop affect power transmission?

Voltage drop can negatively impact power transmission by reducing the amount of usable energy reaching the end user. This can lead to a decrease in voltage, resulting in equipment malfunctions and power outages. Additionally, a high level of voltage drop can cause inefficiencies and increase maintenance costs for power companies.

3. Can voltage drop be prevented?

While voltage drop cannot be completely prevented, it can be minimized through the use of larger diameter wires, shorter transmission lines, and proper maintenance of the power grid. It is also important for power companies to carefully monitor the flow of electricity and adjust the voltage as needed to compensate for any potential drop.

4. How is voltage drop calculated?

Voltage drop can be calculated using Ohm's Law, which states that voltage drop is equal to the product of current and resistance. This means that as the current flowing through the transmission line increases, the voltage drop will also increase. Power companies use specialized equipment to measure the voltage drop and make necessary adjustments to maintain a consistent voltage level.
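As a sketch of that calculation (the current, line resistance, and supply voltage below are illustrative):

```python
def voltage_drop_pct(current_a, resistance_ohm, supply_v):
    """Percent of the supply voltage lost in the line, from V = I * R."""
    return 100.0 * current_a * resistance_ohm / supply_v

# e.g. 50 A through a 0.5-ohm line on a 230 V supply
print(f"{voltage_drop_pct(50.0, 0.5, 230.0):.1f}% drop")
```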

5. How does voltage drop in power lines affect my electrical appliances?

Voltage drop in power lines can have a direct impact on electrical appliances and devices. If the voltage is too low, appliances may not function properly or may not work at all. On the other hand, if the voltage is too high, it can cause damage to sensitive electronics. It is important for power companies to maintain a consistent voltage level to ensure the safe and efficient operation of electrical appliances.
