Homework Statement
A power station delivers 750 kW of power at 12,000 V to a factory through wires with a total resistance of 3.0 ohms. How much less power is wasted if the electricity is delivered at 50,000 V rather than 12,000 V?
Homework Equations
V=IR
P=V^2/R=I^2R
The Attempt at a Solution
I'm having some basic problems conceptualizing what's going on in this problem. The way I understand it, there's a potential difference of 12,000 V from plant to factory. The wires connecting them act as a resistor of 3 ohms. At the factory, 750 kW of power arrives, and I'm supposed to find how much power dissipates in the wires and what the difference is if the potential is 50,000 V instead.
So I decided to find the potential at the factory, since I assume we don't count the factory as V = 0:
P = V^2/R, so V = sqrt(PR) = sqrt(750 kW * 3 ohms) = 1500 V
So there's been a voltage drop of 12,000 - 1500 = 10,500 V, so the power dissipated is 3.6 x 10^7 W.
In the case of 50,000 V being the potential, the potential drop is 48,500 V, so the power dissipated is 7.8 x 10^8 W. When I take the difference of these I get 7.4 x 10^8 W.
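Here's my arithmetic written out as a quick Python check (the variable names are just my own bookkeeping; this only reproduces the numbers above, it's not a method I'm confident in):

import math

P = 750e3          # 750 kW delivered to the factory, in watts
R = 3.0            # total resistance of the wires, in ohms

# my assumption: the potential at the factory comes from P = V^2/R
V_factory = math.sqrt(P * R)       # 1500 V

# voltage drop along the wires for each supply voltage
drop_12k = 12_000 - V_factory      # 10,500 V
drop_50k = 50_000 - V_factory      # 48,500 V

# power dissipated in the wires, using P = V^2/R on the drop
loss_12k = drop_12k**2 / R
loss_50k = drop_50k**2 / R

print(loss_12k, loss_50k, loss_50k - loss_12k)

This just reproduces the numbers I got above, so if something is wrong it's my setup, not the arithmetic.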
Now, I admit that none of this makes sense, because the power lost comes out greater at 50,000 V than at 12,000 V, which it obviously shouldn't be. Also, the correct answer is 11 kW, so this is horribly incorrect. Can someone steer me towards the right way to think about this problem?