OK, so the title was pretty vague; I'm not sure how to describe the confusion succinctly. Anyway, I've learned that the power lost in transmission lines is $P = I^2 R$, so by increasing the voltage, since $P = VI$ is held constant, the current is lowered and thus the power lost decreases.
I'm confused about a couple of things. While my physics teacher was specifically talking about $P = I^2 R$, shouldn't $P = V^2/R$ also give the power lost, so that by increasing the voltage we increase the power lost? Isn't this a contradiction?
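To make the question concrete, here's a quick sketch in Python with made-up numbers (the delivered power, the line resistance, and the two voltages are just example values I picked, not from a real power line). Plugging the same transmission voltage into both formulas gives opposite trends:

```python
# A quick numeric sketch of the apparent contradiction, with made-up numbers
# (P, R, and the two voltages are just example values).

P = 1_000_000  # power delivered through the line (W), held constant
R = 10         # resistance of the transmission wire (ohms)

for V in (10_000, 100_000):   # two example transmission voltages (V)
    I = P / V                 # current drawn, from P = VI
    loss_i2r = I ** 2 * R     # power lost according to P = I^2 R
    naive_v2r = V ** 2 / R    # what I get plugging the *same* V into P = V^2/R
    print(f"V = {V:,} V: I = {I:,.0f} A, "
          f"I^2 R = {loss_i2r:,.0f} W, V^2/R = {naive_v2r:,.0f} W")
```

The $I^2 R$ figure drops as I raise the voltage, but the $V^2/R$ figure grows, which is exactly what's bothering me.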
I feel like this could be resolved if the wires in power lines are not ohmic conductors, in which case half the math above would be invalid.
Anyway, any help would be greatly appreciated, thank you!