Asok_Green
This isn’t what you think. I’m not out to ignore or challenge the laws of thermodynamics and I have no free energy fantasies. This is just a thought experiment for anyone willing to indulge a layperson. For those of you thinking, “Go educate yourself. Read a physics book and stop wasting my time,” you’re absolutely right; this is a selfish request. I suspect, though, that there are a few people who might occasionally enjoy explaining things like this to people like me. If that’s you and you’re still with me:
Suppose you’ve got an electrical power source and you use it to create heat. (I’m under the impression that it doesn’t much matter how you create the heat; simple resistive heating should convert essentially all of the electricity, so the efficiency would be the same either way. I don’t know this for a fact, though, so please tell me if I’m misinformed.) Amount of electricity X yields amount of heat Y. This heat could, in turn, be harnessed in some way to create amount of electricity Z, but Z would always be less than X (in practice I’m guessing it would end up being a lot less).
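To make sure I’m stating that right, here’s a toy calculation of the round trip. The temperatures are made-up numbers, and I’m assuming resistive heating plus an engine capped at the Carnot efficiency (a formula I looked up, so correct me if I’m misusing it):

```python
# Back-of-envelope check of the baseline round trip, assuming
# resistive (Joule) heating and a heat engine capped at the
# Carnot efficiency. All numbers are illustrative.

X = 100.0          # electrical energy in (joules, made up)
Y = X              # resistive heating turns ~all of it into heat

T_hot = 400.0      # engine's hot-side temperature (K, assumed)
T_cold = 300.0     # engine's cold-side temperature (K, assumed)

eta_carnot = 1.0 - T_cold / T_hot   # best possible conversion efficiency
Z = eta_carnot * Y                  # electricity recovered

print(f"X = {X:.0f} J in, Z = {Z:.0f} J back (eta = {eta_carnot:.2f})")
# -> Z = 25 J, well below X; real engines do worse than Carnot.
```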
But how about if the power source is connected to a thermocouple instead, so that it runs as a heat pump (the Peltier effect, I believe)? It would create a temperature difference between its two sides, and the heat, averaged across both sides, should be about equal to Y, correct? (I’m really asking here; I don’t know.) But measuring the hot side alone should show an amount of heat greater than Y (I’m more confident about this). If this greater amount of heat were harnessed to create electricity, the result should be greater than Z. Now here’s the question: could it (even theoretically) be greater than X?
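Here’s a rough sketch of why I expect the hot side to show more heat than X went in. I’m treating the powered thermocouple as an ideal heat pump and using the textbook Carnot bound on its coefficient of performance; the temperatures are again made up:

```python
# Sketch of the "hot side shows more heat than X" intuition,
# treating the powered thermocouple as an ideal Peltier heat pump.
# COP bound and temperatures are textbook/assumed values.

X = 100.0          # electrical energy driving the junction (J)
T_hot = 320.0      # hot-side temperature (K, assumed)
T_cold = 300.0     # cold-side temperature (K, assumed)

# Ideal (Carnot) coefficient of performance for heating:
cop_max = T_hot / (T_hot - T_cold)

Q_hot = cop_max * X            # heat arriving at the hot side
Q_cold = Q_hot - X             # heat drawn out of the cold side

print(f"COP <= {cop_max:.1f}: up to {Q_hot:.0f} J at the hot side")
print(f"of which {Q_cold:.0f} J was pumped from the cold side")
# So the hot side can show more heat than X went in -- but the
# excess is moved from the cold side, not created from nothing.
```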
My instinct is no, it isn’t even theoretically possible, but I don’t have any knowledge of thermodynamics (my critical thinking isn’t so sharp, either), so that instinct is of little consolation to me. Obviously, if the concept would break conservation of energy, there’s not much point in giving it any further thought. But part of me (and again, what do I know?) thinks that it needn’t break conservation of energy, because the extra thermal energy isn’t coming out of nowhere; it’s coming from the cold side of the thermocouple. Now, as soon as the cold side had transferred most of its thermal energy to the hot side, the whole thing would have to stop working. But if you attached a large heat sink to the cold side (possibly buried in the ground, depending on the scale of the design), perhaps it could passively gather enough ambient thermal energy to replace what’s being pumped to the hot side? It would end up being a sort of low-temperature geothermal power plant, assuming the concept isn’t hopelessly flawed.
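And here’s my own crude attempt to check the whole loop in the ideal limit, assuming (perhaps wrongly) that the heat pump and the engine see the same two temperatures:

```python
# Chaining the two ideal limits: pump heat with the best possible
# COP, then convert it back with the best possible (Carnot) engine
# working across the same two temperatures. Assumed numbers only.

X = 100.0
T_hot, T_cold = 320.0, 300.0   # same reservoirs for pump and engine (assumed)

cop_max = T_hot / (T_hot - T_cold)       # ideal heat-pump COP
eta_max = 1.0 - T_cold / T_hot           # ideal engine efficiency

Q_hot = cop_max * X                      # heat delivered to the hot side
Z = eta_max * Q_hot                      # electricity recovered from it

print(f"COP * eta = {cop_max * eta_max:.3f}")   # exactly 1.0
print(f"Z = {Z:.0f} J from X = {X:.0f} J")      # break even at best
# The two ideal limits are exact inverses, so Z can never exceed X;
# any real pump or engine falls short, and Z < X.
```

If I’ve plugged things in right, the two ideal limits cancel exactly, so even a perfect version only breaks even, and any real one loses. But maybe I’ve set it up wrong, which brings me to: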
So, who’s up for explaining how the concept is hopelessly flawed?