Interesting argument between friends

  • #1
JustNobody
So recently I got into a heated argument with my friend about heating his apartment (no pun intended). Here is the argument:

My EE friend has a theoretical heater which runs on 1000 watts of power. In his apartment he also has another electronic device (i.e. a computer, a television, etc.) which also requires 1000 watts to run. He argues that, due to conservation of energy, 1000 watts is 1000 watts and the same amount of heat will be produced by his "device" as by his heater (and thus if his theoretical device is always running at 1000 watts he will never need to turn on his heater to heat his house).
Being an EE AND a Physics major, I disagree with him, arguing that the 1000 watts of energy is dissipated in ways OTHER than heat. While a heater's primary purpose is to provide as much resistance as possible (dissipating electric power as heat), another device such as a computer uses its power via other methods (not via "heat"), but I don't know exactly how to explain it to my stubborn friend (who just keeps shouting V=IR at me).
If I am wrong, could someone please explain why. And if I am right, please provide a good explanation of where the "power is going" so that I may show my friend and convince him that heating his house with a 1000 watt device that ISN'T a heater is a bad idea.


Thank you,
 
  • #2
Tell him to take physics again, because he clearly missed the point. Yes, 1000 watts is 1000 watts, but a heater turns (most of) that energy into thermal energy, whereas another device converts some of that energy into meaningful work. But you want an example.

So let's say we have a (massless) elevator (on earth), then the power used to do work ON a person BY the elevator is

[tex]
P = \frac{dW}{dt} = \frac{d}{dt} \int \vec{F} \cdot \vec{dx} = \frac{d}{dt} mgx = mgv
[/tex]

where m is the mass of the person and v is the velocity. Let's say it's a 100 kg person going 1 m/s. Then P = 1000 W (approximating g as 10 m/s^2). No heat transfers at all. Of course this is an ideal situation, but your friend was using an "ideal" heater, so whatever. Obviously the heater would do a better job heating things up.
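As a sanity check, the elevator numbers above can be plugged into a short script (Python, just for the arithmetic; the formula P = mgv is the one derived above):

```python
# The elevator example in numbers: power delivered as mechanical work on
# the person, with g approximated as 10 m/s^2 as in the post above. No
# heat is involved in this ideal case.
def lifting_power(mass_kg: float, speed_m_s: float, g: float = 10.0) -> float:
    """P = dW/dt = d(m*g*x)/dt = m*g*v for a constant-speed lift."""
    return mass_kg * g * speed_m_s

print(lifting_power(100, 1.0))  # 1000.0 W, all of it mechanical work
```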
 
  • #3
Winter is here. Tell your friend that he does not need a heater; he can just keep working on the computer with the TV, stereo, coffee maker, etc. on, and he'll be warm!:devil:

When he is halfway frozen, tell him to reassess his pursuit of an EE major. Wow, this is scary!

Yes, you can use Llello's mechanical-work example or many others. Energy can be expended in many forms other than heat.
 
  • #4
Of course posts #2 and #3 are right in theory, but your friend seems to be the better practical engineer IMO. The amount of "other forms of work" done by a computer or a TV is negligible, unless they are faulty and generating lots of RFI, or you stand them in front of a window so the light energy from the display can escape from the room :smile:

I just put my hand near the back of the PC I'm using right now (which only has a 300W power supply, not 1000) and there's a fan blowing a stream of nice warm air out of the computer case and into the room. That's exactly the same as what a 300W fan heater would do!
 
  • #5
Can you guys provide an example of where the computer is using the energy in a form that is not thermal? I tried the argument that a heater provides large resistance (which dissipates the energy as thermal energy via collisions) and that in a computer this is not the case (at least not nearly as much), but my friend is not convinced.
 
  • #6
Of course you get heat from a computer, but it is disproportionate to the wattage, since a computer is not an efficient heat generator. That's the reason I said turn on the computer, TV, everything...I should have said enough to make up 1000W. You are not going to get the amount of heat of a 1000W heater.

His friend is not arguing about how much heat comes from a computer etc. He is arguing that you get the same heat from 1000W of power used by other equipment as from a 1000W heater. If you get only 200W worth of heat power, you are not getting the same as a 1000W heater.

Case in point: Christmas is coming, and I put a lot of lights and moving ornaments on the tree; it's about 10 strands of 100 light bulbs, over 30 motorized ornaments, and a train set underneath. I don't know how much power it uses, but it is not low. Try standing next to it and see whether you feel even the slightest heat! AND I am not kidding, I take the Christmas tree very, very seriously!:smile: I actually did pay attention to whether the tree gets warm; it is not even slightly warm. You'll get a lot more heat from a single 100W tungsten bulb.
 
Last edited:
  • #7
JustNobody said:
Can you guys provide an example of where the computer is using the energy in a form that is not thermal? I tried the argument that a heater provides large resistance (which dissipates the energy as thermal energy via collisions) and that in a computer this is not the case (at least not nearly as much), but my friend is not convinced.

How about executing a few mega-instructions a second? Power is needed to charge the stray capacitance of the signal lines inside the circuit boards to get the speed up.

Also, get a 100W/channel stereo amp, run it full blast, put on ear protection, sit in front of it and the speakers, and see whether you get warm! I have a 3-channel, 200W-per-channel amp inside a 2-foot-cube confined space. After I have it on for a few hours, the inside might be 5°F or so higher. If I open the door, the heat is gone.
 
  • #8
JustNobody said:
Can you guys provide an example of where the computer is using the energy in a form that is not thermal? I tried the argument that a heater provides large resistance (which dissipates the energy as thermal energy via collisions) and that in a computer this is not the case (at least not nearly as much), but my friend is not convinced.
By not being able to think of an example, you proved your friend right: You can't think of an example because there are none. All but a negligible amount of the energy used by a computer is turned into heat.
 
  • #9
yungman said:
How about executing a few mega-instructions a second? Power is needed to charge the stray capacitance of the signal lines inside the circuit boards to get the speed up.
All of that is dissipated as heat.
Also, get a 100W/channel stereo amp, run it full blast, put on ear protection, sit in front of it and the speakers, and see whether you get warm! I have a 3-channel, 200W-per-channel amp inside a 2-foot-cube confined space. After I have it on for a few hours, the inside might be 5°F or so higher. If I open the door, the heat is gone.
Odds are it doesn't run at anywhere close to 200W, but all of the energy it produces, except what sound energy escapes your house (negligible) is converted to heat.
 
  • #10
russ_watters said:
All of that is dissipated as heat. Odds are, it doesn't run at anywhere close to 200W, but all of the energy it produces, except what sound energy escapes your house (negligible) is converted to heat.

I said crank it up high. That is sound power, not heat power.

As for charging the capacitance of the signals: a capacitor doesn't generate heat, except for the parasitic resistance where current flows, which is real. The signal trace and the input of a gate are not particularly lossy.
 
  • #11
yungman said:
I said crank it up high. That is sound power, not heat power.
Even if you crank it up high, it still doesn't use a steady 200W/channel, and the vast majority of the energy is dissipated as heat out the back, not as sound out the front. And then virtually all of the sound power is absorbed by objects in the room (and the room itself) and turned into heat. Quick google:
Loudspeaker efficiency is defined as the sound power output divided by the electrical power input. Most loudspeakers are inefficient transducers; only about 1% of the electrical energy sent by an amplifier to a typical home loudspeaker is converted to acoustic energy. The remainder is converted to heat, mostly in the voice coil and magnet assembly.
http://en.wikipedia.org/wiki/Loudspeaker
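Using the ~1% efficiency figure from the quote above, here is a rough sketch of where a channel's electrical input goes. The 200 W figure is the per-channel rating mentioned earlier, not a measured draw:

```python
# Split a channel's electrical input into sound and heat, using the ~1%
# loudspeaker efficiency figure quoted from Wikipedia above. Even the
# sound fraction mostly becomes heat once the room absorbs it.
def speaker_split(electrical_w: float, efficiency: float = 0.01):
    sound_w = electrical_w * efficiency
    return sound_w, electrical_w - sound_w

sound, heat = speaker_split(200.0)
print(sound, heat)  # 2.0 W of sound, 198.0 W dissipated directly as heat
```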
 
  • #12
russ_watters said:
Even if you crank it up high, it still doesn't use a steady 200W/channel, and the vast majority of the energy is dissipated as heat out the back, not as sound out the front. And then virtually all of the sound power is absorbed by objects in the room (and the room itself) and turned into heat. Quick google: http://en.wikipedia.org/wiki/Loudspeaker

You can always find a way to run it at high power. Another example is playing guitar and bass with a solid state amp; we crank it up high. You don't get the heat of a few hundred watts. Believe me, we blast them.

And as I added to the last post, charging a capacitance doesn't generate heat in the capacitor unless you have parasitic resistance.

Try taking a few computers that draw the same power as a heater and see whether you get the same amount of heat out.
 
  • #13
Back to electronics: remember the Poynting theorem on EM energy? There is an E×H term that is the power transmitted by the EM wave, and there is a thermal part that is the power lost to resistive loss. Only the resistive loss becomes heat. This is the physics of conservation of energy. Remember, a signal traveling in a circuit is EM wave propagation, so the Poynting theorem is involved.
 
  • #14
yungman said:
Try taking a few computers that draw the same power as a heater and see whether you get the same amount of heat out.
That's one of the things I do for my job as a heating and air conditioning engineer: I figure out how much air conditioning is needed in a lab or data center by measuring, or adding up from nameplate data, how much electricity is going into it. Data centers are the easy ones: since the energy density is so high and the heat inputs and outputs from other sources (like the walls) are so low in comparison, the only piece of information you really need to size the cooling is the wattage of the computer equipment in the data center.
You can always find a way to run it at high power. Another example is playing guitar and bass with a solid state amp; we crank it up high. You don't get the heat of a few hundred watts. Believe me, we blast them.
I believe you've never measured either the electrical input or the heat output of your system.
 
Last edited:
  • #15
yungman said:
Back to electronics: remember the Poynting theorem on EM energy? There is an E×H term that is the power transmitted by the EM wave, and there is a thermal part that is the power lost to resistive loss. Only the resistive loss becomes heat. This is the physics of conservation of energy. Remember, a signal traveling in a circuit is EM wave propagation, so the Poynting theorem is involved.
And where does that EM wave go? Is it continuously being created but not going anywhere? Or is it created and then dissipated as heat?
 
  • #16
russ_watters said:
And where does that EM wave go? Is it continuously being created but not going anywhere? Or is it created and then dissipated as heat?

No, as I explained, the EM wave converts to current to charge up or discharge the capacitance of the signal trace and the input capacitance of the MOSFET, and those are electrical energy, not heat. Remember, an ideal capacitor doesn't dissipate power. The only part that turns to heat is the series resistance of the line or the resistance inside the capacitor, which we describe with the notion of loss tangent. This is the biggest factor in the power required to increase the speed of a computer: the charging and discharging of the line capacitance. If you run the computer at half the speed, the power drawn will be a lot lower.

And besides, how can you justify that all power transforms to heat? Even a heater is not 100% efficient. You need 100% efficiency to transform 1000W of electrical power to heat power. A computer is not going to be optimized to transform power into heat, and neither is other equipment like a motor moving a weight...which is mechanical power, not thermal power.

For your assertion to be true, ALL electrical equipment HAS to have the same efficiency of transforming electrical power to heat...WHICH still cannot be 100%.
 
  • #17
yungman said:
No, as I explained, the EM wave converts to current to charge up or discharge the capacitance of the signal trace and the input capacitance of the MOSFET, and those are electrical energy, not heat. Remember, an ideal capacitor doesn't dissipate power?
If it doesn't dissipate power then it doesn't show up on the electric meter.

And besides, how can you justify that all power transforms to heat? Even a heater is not 100% efficient.
For all intents and purposes, an electric heater is 100% efficient.
You need 100% efficiency to transform 1000W of electrical power to heat power. A computer is not going to be optimized to transform power into heat...
As I asked before: where else does the electrical power go? Above you said it went to things like charging a capacitor, but then said it didn't.
...and neither is other equipment like a motor moving a weight...which is mechanical power, not thermal power.
If the weight doesn't end up somewhere different from where it started, all the energy is dissipated as heat.
For your assertion to be true, ALL electrical equipment HAS to have the same efficiency of transforming electrical power to heat...WHICH still cannot be 100%.
If your system is a closed box and there are no chemical reactions occurring, conservation of energy demands that all of the energy become heat.

But there's no need to continue arguing this. You can pull a spec sheet for a server and see for yourself. Here's one, randomly googled: http://h20000.www2.hp.com/bc/docs/support/SupportManual/c01038153/c01038153.pdf

Page 54:
Electrical input: 764 Watts
Heat output: 2604 BTU/hr

According to my math, they don't quite match: converting the BTU/hr figure gives me 763 watts, but that's close enough to consider it rounding error.

I also have similar (though not as exact) data for a lab centrifuge. Not all manufacturers give both the watts in and BTUs out though, because, frankly, they'll assume that the reader knows they are the same thing.
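The spec-sheet check above can be reproduced with the standard conversion factor (1 BTU/hr ≈ 0.293 W); the function name here is just for illustration:

```python
# Checking the spec-sheet figures quoted above: the rated heat output in
# BTU/hr should reproduce the electrical input in watts.
BTU_PER_HR_TO_W = 0.29307107  # standard conversion: 1 BTU/hr in watts

def btu_hr_to_watts(btu_hr: float) -> float:
    return btu_hr * BTU_PER_HR_TO_W

print(round(btu_hr_to_watts(2604)))  # 763, against the 764 W electrical input
```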
 
Last edited by a moderator:
  • #18
yungman said:
Another way is playing guitar and bass with solid state amp, we crank it up high. You don't get the heat of a few hundred watts. Believe me, we blast them.

yung, i think you know better than this. think: conservation of energy. what sound doesn't make it out of the room, what light from the computer doesn't make it out a window, all of that energy will eventually be converted to heat. the question is where.

i doubt more than a couple of watts will make it out. pretty much all of the power the computer and electronics draw will end up heating the room just as well as a space heater of the same power.
 
  • #19
russ_watters said:
Not all manufacturers give both the watts in and BTUs out though, because, frankly, they'll assume that the reader knows they are the same thing.
Not to mention that most humans measure heating power in watts anyway. Even the Brits have stopped using BTUs :smile:
 
  • #20
So we stray from the point. In defense of my point, I argued that a heater has as much resistance as possible. This would ideally lead to large numbers of inelastic collisions with the lattice, which generate an irreversible thermal process, "heat". When an engineer designs a heater they want the conversion from electrical power to "heat" to be as high as possible.

However, when an engineer designs a computer they want the conversion from electrical power to "heat" to be as small as possible. Ideally, a computer engineer would wish to design a computer which dissipates no heat (which is impossible). I believe I remember from Solid State Physics that for silicon there is a way to calculate how much energy is used to "power" a device and how much is used to heat it. If I recall, it was very simple: something like, if the electron requires 6eV to overcome the band gap and it's given 7eV, then the electron has a final kinetic energy of 1eV which is expended as thermal radiation (the leftover KE creates collisions in the lattice → heat).

Regardless of this specific example, I believe that if you think about it microscopically rather than macroscopically, you see that a lot of the energy is going towards different processes that AREN'T heat. In the long run all energy WILL turn into heat (entropy ftw); however, I'm talking about amounts and time scales on the order of "will this device heat my apartment more than my heater", in which case I don't think it will.
 
  • #21
AlephZero said:
Not to mention that most humans measure heating power in watts anyway. Even the Brits have stopped using BTUs :smile:
True, but for the American market, most will Americanize their catalogs. Ironically, the science is done in SI, but HVAC engineering is still in English units. More importantly, there is no English-unit analogue to electrical power in watts, so my calculation spreadsheets have to include both and flip back and forth.
 
Last edited:
  • #22
JustNobody said:
So we stray from the point. In defense of my point, I argued that a heater has as much resistance as possible. This would ideally lead to large numbers of inelastic collisions with the lattice, which generate an irreversible thermal process, "heat". When an engineer designs a heater they want the conversion from electrical power to "heat" to be as high as possible.

However, when an engineer designs a computer they want the conversion from electrical power to "heat" to be as small as possible. Ideally, a computer engineer would wish to design a computer which dissipates no heat (which is impossible). I believe I remember from Solid State Physics that for silicon there is a way to calculate how much energy is used to "power" a device and how much is used to heat it. If I recall, it was very simple: something like, if the electron requires 6eV to overcome the band gap and it's given 7eV, then the electron has a final kinetic energy of 1eV which is expended as thermal radiation (the leftover KE creates collisions in the lattice → heat).
In case you missed it because of all the posts, here's just the bottom line:
Russ said:
But there's no need to continue arguing this. You can pull a spec sheet for a server and see for yourself. Here's one, randomly googled: http://h20000.www2.hp.com/bc/docs/support/SupportManual/c01038153/c01038153.pdf

Page 54:
Electrical input: 764 Watts
Heat output: 2604 BTU/hr
Heat output = electrical input.
Regardless of this specific example, I believe that if you think about it microscopically rather than macroscopically, you see that a lot of the energy is going towards different processes that AREN'T heat. In the long run all energy WILL turn into heat (entropy ftw); however, I'm talking about amounts and time scales on the order of "will this device heat my apartment more than my heater", in which case I don't think it will.
For a computer, "the long run" is on the order of nanoseconds to microseconds for everything except the fans, which convert their power to heat in a few seconds.

And again: you still haven't thought of such a process, have you?
 
Last edited by a moderator:
  • #23
Sorry, I haven't read all the replies in this thread as I have limited time today, so apologies if I end up repeating things that have already been said, but I'll add my two cents' worth anyway.

The simple answer is that unless you have a device that is either,

1. Converting electrical energy into stored energy (such as chemical or gravitational potential). For example a battery charger.

or

2. Exporting power by design. Say, for example, I had a 2kW motor inside the house exporting its power through a drive-shaft to the yard outside, where it was driving something.

then the vast majority of electrical energy consumed by any appliance (TV, Computer, Vacuum cleaner etc) *will* be converted to heat within the house.

Take the TV example given by the OP. A TV consuming, say, 250 watts will have all but at most a few watts converted to heat. No energy is exported or stored, so it must all be converted to heat + light + sound + RFI within the house (mostly heat). Though the energy in the light and sound will be small, most of it will be absorbed and converted to thermal energy in the walls and furnishings anyway. Typically less than one or two watts would escape the house as light or sound, and a negligible amount would escape as RFI. In summary, 99% of the electrical energy consumed by the TV would normally be converted to heat within the house!
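The accounting in this post can be written as a one-line energy balance. The 2 W "exported" figure below is the illustrative light/sound leakage mentioned above, not a measured value:

```python
# Energy balance for the 250 W TV example: whatever is consumed but not
# exported from the house or stored (e.g. in a battery) must end up as
# heat indoors. The 2 W export is illustrative, not measured.
def heat_in_house(consumed_w: float, exported_w: float = 0.0,
                  stored_w: float = 0.0) -> float:
    return consumed_w - exported_w - stored_w

print(heat_in_house(250.0, exported_w=2.0))  # 248.0 W of heat indoors
```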
 
  • #24
Tell your friend he is a fool; you are exactly right. The power of a heater is designed only to heat a house/room etc. Power from a PC power supply is there to power a PC; heat is just an undesired side effect.

It's like saying a tank made of solid uranium and a race car made of carbon fiber have the same engine, so they have to go the same speed.
That's very wrong.

Tell your friend to put on a coat and stop arguing.
 
  • #25
I think most of you are overestimating the efficiency of the electronics used in our homes. I tend to agree with Russ. Even if I didn't know Russ is an HVAC engineer I would still agree with him.
 
  • #26
Averagesupernova said:
I think most of you are overestimating the efficiency of the electronics used in our homes.
It's not even a question of efficiency. Unless you can identify how the energy is leaving the house, either as light or sound or RFI, then it is being converted to heat within the house.

As I mention above, the only real exceptions to this are devices that either store or export power by design, a battery charger for example.
 
  • #27
uart said:
It's not even a question of efficiency. Unless you can identify how the energy is leaving the house, either as light or sound or RFI, then it is being converted to heat within the house.

As I mention above, the only real exceptions to this are devices that either store or export power by design, a battery charger for example.
Right: So the way I would put it, using your term, is that the energy exporting efficiency of household devices is near zero, which makes the heating efficiency near 100%.
 
  • #28
Averagesupernova said:
I think most of you are overestimating the efficiency of the electronics used in our homes. I tend to agree with Russ. Even if I didn't know Russ is an HVAC engineer I would still agree with him.
:redface: I don't really like to do argument from authority, but in this case someone challenged whether I have ever tested this. The fact of the matter is that I have.

Data centers are easy: what causes debates in my company is fan energy. But not how much heat is added; WHERE it is added (dissipated).
 
  • #29
russ_watters said:
Data centers are easy: what causes debates in my company is fan energy. But not how much heat is added; WHERE it is added (dissipated).

The suspense is killing me,

Ima guess on the fan blades,
 
  • #30
russ_watters said:
I don't really like to do argument from authority [...]
To qualify as an argument from authority, you'd have to base your argument on the fact that you are an authority on the subject matter. E.g., "I'm an HVAC engineer, therefore I'm right." You clearly didn't, so strictly speaking it wasn't an argument from authority. In fact, I think you've made a pretty strong case here without it!
 
  • #31
nitsuj said:
The suspense is killing me,

Ima guess on the fan blades,
That's where the electrical energy is transferred to the air (via the motor and fan shaft, which are better than 90% efficient together), but what I meant is where is it converted to heat. There is no consensus in my office.

All agree that due to the inefficiency of the fan, a decent fraction immediately becomes heat. But what about the rest? My position is that it is converted everywhere you see a pressure loss, in proportion to the loss: across coils and dampers, due to friction in ducts, etc.
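A minimal sketch of the "heat appears at each pressure loss" position, assuming SI units (flow in m³/s, pressure drop in Pa, so the product is watts); the coil and duct numbers are made up for illustration:

```python
# Sketch of the position above: the air-side power a fan imparts shows
# up as heat wherever there is a pressure loss, in proportion to that
# loss. Flow in m^3/s times pressure drop in Pa gives power in watts.
def duct_heat_w(flow_m3_s: float, pressure_drop_pa: float) -> float:
    return flow_m3_s * pressure_drop_pa

coil = duct_heat_w(1.0, 300.0)   # hypothetical coil: 300 Pa drop
ducts = duct_heat_w(1.0, 200.0)  # hypothetical ductwork friction: 200 Pa
print(coil + ducts)  # 500.0 W converted to heat along the air path
```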
 
  • #32
In an ideal case, with a computer actively consuming 1000W and a heater actively consuming 1000W, the heat produced would be fairly close. Most power consumed by a computer is converted to heat. The exceptions I can think of are a minor amount stored in capacitors, the spinning of disc drives/fans, LEDs and speakers, power that leaves the computer for external devices (Ethernet, for example), and stray radiation. In all of those cases the end result will be some additional heat generation (friction from the disc drives, for example), but some is converted to mechanical, stored, or radiated energy.

Where the thought experiment falls short in reality is that a computer with a 1000W power supply will rarely if ever actually draw 1000W.
 
  • #33
I am not a power engineer. Please explain, from my drawing, what power is drawn from the battery. Assume capacitor C is ideal and lossless, and assume the on-channel resistance of the MOSFETs is 0Ω so there is no ohmic loss.

When IN is low, Q1 turns on, and the voltage from the battery is connected to C and charges it up, as shown in Loop1. When IN switches to high, Q1 turns off and Q2 turns on, and C is discharged through Q2 via Loop2.

Q1, Q2 and C are inside an isolated, enclosed space, shown as the dashed box in the attached drawing. The battery and the IN driver are outside of the box.

1) What is the power input to the box when IN is pulsing to charge and discharge the cap continuously?

2) If there is real power input, how is heat generated inside the box, as there is no resistance?

3) If there is no power input, do we expect the battery to last forever in this ideal case?
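For what it's worth, the standard answer to these questions (for any nonzero switch resistance, however small) is that each charge/discharge cycle draws C·V² from the battery: half is dissipated while charging and the stored half while discharging, independent of the resistance value, so the average power is P = C·V²·f. A sketch with hypothetical values:

```python
# Sketch (hypothetical values): dynamic power of the charge/discharge
# loop above. For any nonzero switch resistance, charging C to V from
# the battery dissipates C*V^2/2 in that resistance (independent of its
# value) and stores C*V^2/2 on the cap; discharging dumps the stored
# half as heat too, so each full cycle turns C*V^2 into heat.
def switching_power_w(c_farads: float, v_volts: float, f_hz: float) -> float:
    """Average power: P = C * V^2 * f (one full cycle per period)."""
    return c_farads * v_volts ** 2 * f_hz

# Hypothetical: 10 pF of trace capacitance, 3.3 V supply, 100 MHz toggling
print(switching_power_w(10e-12, 3.3, 100e6))  # ~0.011 W, all of it heat
```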
 

Attachments

  • cap_power_L.png
  • #34
yungman said:
I am not a power engineer. Please explain, from my drawing, what power is drawn from the battery. Assume capacitor C is ideal and lossless, and assume the on-channel resistance of the MOSFETs is 0Ω so there is no ohmic loss.

why not assume it has negative resistance?

the issue is, yung, that when that cap discharges, the energy stored in the cap ends up somewhere. where do you think it goes?

better clip on some heat sinks to the MOSFETs.
 
Last edited:
  • #35
a similar question comes up when you take some big capacitor (say, 1000 uF, rated for 50 volts), charge it up with a DC supply to, say, 40 volts, disconnect the DC supply, and then short out the cap with a screwdriver. what happens? where did the energy come from? and where did it go?
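The stored energy in that example follows from E = ½CV²; a quick check with the numbers given (1000 µF at 40 V):

```python
# Energy stored in the capacitor before the screwdriver short:
# E = C * V^2 / 2. Shorting it dissipates that energy as heat (and a
# spark) in the screwdriver and the cap's internal resistance.
def cap_energy_j(c_farads: float, v_volts: float) -> float:
    return 0.5 * c_farads * v_volts ** 2

print(cap_energy_j(1000e-6, 40.0))  # ~0.8 J released in the short
```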
 
