Calculating heat output from current

In summary: the electrical engineer disagreed with the mechanical engineers, arguing that voltage drop on the grid is negligible and that the transmission system itself is not flawed.
  • #1
timsea81
This is actually a heat transfer question but what I don't understand is the electrical part of it.

Given the current (in amps) and the resistivity (in ohm·meters), how do I figure out how much heat is being generated? I would think you would need to know the voltage as well, but it is not given.
 
  • #2
You don't need to know the voltage but you do need to know the dimensions of the resistor so that you can work out the resistance.

Once you do know the resistance, Power = I²R.
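To go from resistivity to resistance, use R = ρL/A (resistivity times length over cross-sectional area). Here is a minimal sketch of the whole calculation in Python, with made-up numbers (roughly copper) just to show the steps:

Code:
# Heat dissipated in a uniform conductor, given current, resistivity and dimensions.
# The numbers below are illustrative only (approximately copper).

def resistance(resistivity_ohm_m, length_m, area_m2):
    # R = rho * L / A for a uniform conductor
    return resistivity_ohm_m * length_m / area_m2

def heat_power_watts(current_a, resistance_ohm):
    # P = I^2 * R, the rate of Joule heating in watts
    return current_a ** 2 * resistance_ohm

rho = 1.7e-8                                        # ohm*m, approximate for copper
R = resistance(rho, length_m=10.0, area_m2=1e-6)    # 10 m of 1 mm^2 wire -> ~0.17 ohm
print(heat_power_watts(5.0, R))                     # 5 A through it -> ~4.3 W of heat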
 
  • #4
I found the formula for this but was still having trouble grasping it conceptually. Then I just realized something.

A few weeks ago I was talking to some mechanical and electrical engineers about combined heat and power. Combined heat and power, aka co-generation or co-gen, is generating electricity on site, usually by burning natural gas, and using the waste heat to heat your buildings, domestic hot water, or some other process.

All of the mechanical guys were saying what a great system it is, and among the reasons we gave was minimizing voltage drop by generating electricity on site. The electrical guy disagreed, saying that voltage drop is negligible and the grid system is not flawed (although the generation systems certainly are).

Now I realize that the part I wasn't understanding about the inefficiencies of the grid and the part I wasn't understanding about this problem are one and the same. The heat generated by the current, which represents the energy wasted through voltage drop, depends (for a given line) only on the current, not on the transmission voltage. So when power is transmitted at high voltage, the current is low, and so are the heat dissipation and the wasted energy.

Say you needed to transmit 100,000 VA from point A to point B over a given line. If you transmitted it at 100 V, the current would be 1,000 A and the energy wasted as heat would go as the square of that current. If you stepped the voltage up to 100,000 V, the current would be only 1 A and the I²R loss would shrink by a factor of a million. Heat generation in the line depends on the current ONLY, not the voltage.
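Here is a quick numerical check of that example. The line resistance of 0.5 Ω is an arbitrary assumed value; the point is only that, for the same line, the loss scales with the square of the current:

Code:
# Same 100 kVA delivered over the same (assumed) line, so R is fixed and only I changes.
LINE_RESISTANCE = 0.5   # ohms, arbitrary assumed value for illustration

def line_loss_watts(apparent_power_va, voltage_v, r_line=LINE_RESISTANCE):
    current = apparent_power_va / voltage_v
    return current ** 2 * r_line            # P_loss = I^2 * R

low_v  = line_loss_watts(100_000, 100)      # 1000 A -> 500,000 W lost (hopeless)
high_v = line_loss_watts(100_000, 100_000)  # 1 A    -> 0.5 W lost
print(low_v, high_v, low_v / high_v)        # ratio is 1,000,000 = (1000/1)^2

Stepping the voltage up by a factor of a thousand cuts the current by a thousand and the I²R loss by a million.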
 
  • #5
Transmission guys design for the loss they want to tolerate,
let's just say they can afford to lose 3% of the power along a line.
Power is volts times amps.

At high voltage the current is small so you can use small conductors and get 3% voltage drop. In your example you could tolerate a 3 kV drop at 1 amp, which is 3000 ohms.

At low voltage the current is high so you need large conductors to get 3% voltage drop.
In your example that would be 3 volts at 1000 amps which is 0.003 ohms.

Resistance of a wire is proportional to 1/(its cross-sectional area), and area × length is the volume of copper required to make it...
So for a given line length, the allowed resistances differ by 3000/0.003 = 1,000,000, which means a millionfold difference in cross-section and hence in copper.
A thousandfold increase in voltage yields a millionfold decrease in copper required?
Somebody please check my arithmetic and logic... I haven't had my coffee yet.

Textbooks say it's mostly the cost of those huge conductors and the impracticality of towers to support them that makes high voltage economical. Longer insulators are cheap in comparison.
#10 wire is quite close to one milliohm per foot. It's about the diameter of a soda straw.
To get 0.003 ohms per thousand feet would take something like three hundred strands of #10 in parallel, and around seventeen hundred strands to get 0.003 ohms per mile...
but a single strand of it is only 5.2 ohms per mile, more than adequate for your high-voltage transmission.
I think your example drives home what the textbooks say, but again sanity checks are welcome.
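As a further check, here is a short script redoing that arithmetic with the same assumed round numbers (3% drop budget, #10 AWG at about 1 mΩ per foot, 5.2 Ω per mile):

Code:
# Largest line resistance that keeps the voltage drop within the 3% budget.
def max_line_resistance(voltage_v, current_a, drop_fraction=0.03):
    return drop_fraction * voltage_v / current_a

r_high = max_line_resistance(100_000, 1)    # 3000 ohms allowed at 100 kV / 1 A
r_low  = max_line_resistance(100, 1000)     # 0.003 ohms allowed at 100 V / 1000 A
print(r_high / r_low)                       # 1,000,000

# Parallel strands of #10 wire (~5.2 ohm/mile each) needed for the low-voltage case:
print(5.2 / r_low)                          # ~1700 strands per mile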


Your mechanical friends are doubtless looking at the thermodynamic cycle.
Exhaust steam from a turbine is condensed into water so you can pump it back into the boiler, and the heat of vaporization is lost. That amounts to ~1000 BTU/pound which is quite a lot.
If you can lend that exhaust steam to a nearby building where they can condense it to warm themselves and then return the condensate to you, everybody wins, including the environment.
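As a rough back-of-envelope, using the ~1000 BTU/pound figure above and an assumed amount of steam:

Code:
# Heat available from condensing exhaust steam (assumed figures, order-of-magnitude only).
BTU_PER_LB = 1000        # approximate latent heat given up when the steam condenses
BTU_PER_KWH = 3412       # conversion factor

def recovered_heat_kwh(steam_lb):
    return steam_lb * BTU_PER_LB / BTU_PER_KWH

print(recovered_heat_kwh(10_000))   # ~2,900 kWh of heat per 10,000 lb of steam condensed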



old jim
 

FAQ: Calculating heat output from current

1. How do you calculate the heat output from current?

To calculate the heat output from a current, use the formula P = I²R, where P is the heat dissipated per second (power) in watts, I is the current in amperes, and R is the resistance in ohms.

2. What is the relationship between current and heat output?

For a fixed resistance, the heat output is proportional to the square of the current: doubling the current quadruples the heat dissipated.

3. Can you calculate heat output without knowing the current?

Not with P = I²R alone; that formula requires the current. If you instead know the voltage across the resistance, you can use the equivalent form P = V²/R.

4. How does the resistance affect the heat output in this calculation?

For a given current, the heat output is directly proportional to the resistance: doubling R doubles the I²R loss. Only when the voltage is held fixed (P = V²/R) does increasing the resistance reduce the dissipation.

5. Is there a maximum heat output that can be reached with current?

In practice, yes. The usable heat output is limited by how much current and resistance the material can handle before reaching its melting point or other structural limits.
