Power, Voltage, Current: Ohm's Law & Decreasing Power Loss

Summary: The discussion centers on the relationship between power, voltage, and current as defined by Ohm's law and the power formula. It clarifies that raising the transmission voltage lowers the current for a fixed delivered power, which reduces resistive (I²R) losses along the way. Power companies therefore transmit electricity at high voltage and low current to minimize these losses while maintaining a consistent voltage for consumers. The conversation also touches on how much control power companies have over voltage and current during transmission. Ultimately, the goal is to design systems that minimize energy loss while delivering a fixed voltage to end users.
grandia3
hello, I'm new here =D
dunno if this is the right section to ask this
I'm really confused about power, voltage, and current.

According to Ohm's law, V = I × R,
so if we increase the voltage, the current will increase.

But according to the power formula, P = V × I,
they say that raising the voltage used to deliver electricity to houses will decrease the current, and therefore decrease the power loss along the way,
since the resistance of the wire doesn't change (at least not significantly), so it should still follow Ohm's law.

How can both be true?

thanks
 
From Ohm's law and P = VI, the power lost in the transmission line is P_loss = I²R. Power companies usually transmit electricity at high voltage and low current: for a given line resistance, a low current means a low power loss, and a high current means a high power loss.

It's more practical to transmit at high voltage and low current. A high V doesn't by itself force I to be low; power companies could in principle transmit at high voltage and high current, but that wouldn't be good.
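To see the I²R effect with numbers, here's a minimal Python sketch; the line resistance, delivered power, and voltages are made-up illustrative values, not real grid figures:

```python
# Rough sketch of transmission-line loss, P_loss = I^2 * R.
# All numbers below (line resistance, delivered power, voltages)
# are made up for illustration.

def line_loss_w(delivered_power_w, line_voltage_v, line_resistance_ohm):
    """Power dissipated in the line while delivering a fixed power."""
    current_a = delivered_power_w / line_voltage_v   # I = P / V
    return current_a**2 * line_resistance_ohm        # P_loss = I^2 R

R_LINE = 5.0      # ohms, assumed line resistance
P_LOAD = 1e6      # watts, 1 MW to be delivered

for v in (10e3, 100e3):  # compare 10 kV vs 100 kV transmission
    i = P_LOAD / v
    loss = line_loss_w(P_LOAD, v, R_LINE)
    print(f"{v/1e3:>5.0f} kV -> {i:>5.0f} A, line loss {loss/1e3:.1f} kW")
```

Raising the transmission voltage by a factor of ten cuts the current tenfold and the line loss a hundredfold.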
 
Hmm, does that mean the power company can control both the voltage and the current transferred?

that does make sense =D

thanks
 
I'm not too sure of the extent to which they can control it; I think they can only transmit at high voltage/low current or low voltage/high current. Ignore the high voltage/high current part from my earlier post.

You could also check the formula for the ideal transformer, where

\frac{V_s}{V_p}=\frac{I_p}{I_s}

where I_p = current in the primary circuit,
I_s = current in the secondary circuit,
V_p = voltage in the primary circuit,
and so on.

The ratio V_s/V_p is fixed by the transformer's turns ratio, so if V_s is high then I_s is correspondingly low, and so forth.
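For example, with made-up numbers: a step-up transformer with V_p = 10 kV and V_s = 100 kV carrying I_p = 100 A on the primary side gives I_s = I_p × V_p / V_s = 10 A on the secondary side, while the transmitted power V_p I_p = V_s I_s = 1 MW is (ideally) unchanged.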
 
You're just mixing scenarios, that's all. They are different equations for different situations.

If you have a light and you vary the voltage (assuming you don't burn it out), you can use both equations and you'll find that you get more current and therefore more power as the voltage increases.

But the power company doesn't want to vary the voltage at your house; they want to keep it at 120 V, but design a system that loses less along the way. So the power to be delivered is fixed as a matter of practicality. When they design the transmission lines and can choose whatever voltage they want, they pick a higher voltage to lower the amperage and decrease the I²R losses. But that doesn't mean the power increases: at your house, you still get 120 V.
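Concretely: a 1200 W appliance plugged into a 120 V outlet draws 1200/120 = 10 A regardless of what voltage the upstream transmission lines run at; step-down transformers bring the voltage back to 120 V before it reaches you.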
 
