What is the thermodynamic temperature scale?

In summary, the thermodynamic temperature scale assigns a real, positive number that characterizes the energy content of a system in thermodynamics. The ideal gas law can be used to determine temperature in both theory and practice, but it is not the only method. The thermodynamic temperature scale is independent of any particular physical substance; it is based on the efficiency of a Carnot engine and fixed by the triple point of water. Negative temperatures can also exist in certain systems, such as a paramagnetic solid in a magnetic field.
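Concretely, the construction runs through the Carnot result that, for a reversible engine operating between two reservoirs, the ratio of the heats exchanged depends only on the reservoirs themselves. Using the convention that fixes the triple point of water at 273.16 K, this yields the definition
[tex]\frac{Q_H}{Q_C}=\frac{T_H}{T_C},\qquad T = 273.16\ \mathrm{K}\times\frac{Q}{Q_{\mathrm{tp}}}[/tex]
where [itex]Q[/itex] and [itex]Q_{\mathrm{tp}}[/itex] are the heats a reversible engine exchanges with the system of interest and with a triple-point reservoir.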
  • #71
I justify adiabatic because there is no flow of heat from the universe to the Earth-Moon system (or vice versa). I justify equilibrium because the Earth-Moon system is observed to be stable and unchanging over a long period of time. I justify closed because there is no mass flow into or out of the Earth-Moon system. Any of those assumptions can be relaxed very easily in the context of thermodynamics, and if you wish to solve that problem, I encourage you to do so - you may learn something useful.
 
  • #72
You are missing out on a whole lot of applications of thermodynamics if, in your view, heat is only the simplest version of the concept. Of course, if you define Q=0 for any process where you don't have a more abstract idea of heat, then thermodynamics is trivial but useless.

However, scientists can extend thermodynamics in an abstract way. A first small step is to examine bouncing balls:
http://www.aip.org/png/html/maxwell.html
They don't really care about the actual temperature of the sand, right? They extend the concept of heat and entropy. You, however, would say: "The sand does not exchange real heat and the energy is constant. What's the deal?"
But you can also do calculations with a deck of cards or just about any system imaginable. For cards, of course, no one wants to speak about the actual temperature that you would feel with your fingers.
So it requires some more thought to come up with a useful definition of entropy and heat for arbitrary systems.
It's easy to do with [itex]S=\ln\Omega[/itex], but you have to think about what a useful form of [itex]\Omega[/itex] is.
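As a minimal sketch of that last point (a toy example added here, not taken from the linked article): for a row of red and black marbles, one natural choice of [itex]\Omega[/itex] is the number of distinguishable arrangements, and [itex]S=\ln\Omega[/itex] then peaks for an even mix:

[code]
from math import comb, log

def entropy(n_total, n_red):
    # S = ln(Omega), with Omega the number of distinguishable
    # arrangements of n_red red and (n_total - n_red) black marbles
    return log(comb(n_total, n_red))

for n_red in (0, 10, 25, 50):
    print(n_red, round(entropy(100, n_red), 2))
# prints 0 0.0, 10 30.48, 25 53.85, 50 66.78 -- maximal at the even mix
[/code]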
 
  • #73
Gerenuk said:
You are missing out on a whole lot of applications of thermodynamics if, in your view, heat is only the simplest version of the concept. Of course, if you define Q=0 for any process where you don't have a more abstract idea of heat, then thermodynamics is trivial but useless.

However, scientists can extend thermodynamics in an abstract way. A first small step is to examine bouncing balls:
http://www.aip.org/png/html/maxwell.html
They don't really care about the actual temperature of the sand, right? They extend the concept of heat and entropy. You, however, would say: "The sand does not exchange real heat and the energy is constant. What's the deal?"

I honestly have no idea what you are trying to say. Eggers didn't "extend" anything ... he used a medium (sand ... actually 1 mm spherical plastic beads) which gives a reasonable approximation to the hard-sphere collisions of a gas (in fact his system is called a "granular gas"), in order to visualize the partitioning of energy between translational and internal degrees of freedom. The sand on one side has more translational energy but is internally cooler, whereas on the other side, inelastic collisions have caused some of the translational energy to be transferred to the internal degrees of freedom of the sand grains ... thus those sand grains have absorbed energy in the form of heat, and have less translational energy. Once this process starts, it will tend to continue, because the collection of slower grains on one side is more effective at randomizing the translational energy of a faster grain that may come through the hole.

Energy *certainly* is not constant in this case ... they are always shaking the box during the experiments, which means energy is transferred into the system (box and sand grains) from the surroundings. Everything in Eggers' experiment and simulation is completely consistent with the normal definitions of heat, temperature and entropy according to macroscopic thermodynamics, and the observed phenomena are consistent with the second law.
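To make that runaway mechanism explicit, here is a deliberately crude toy model (a sketch of my own, not Eggers' actual flux function): assume the rate F(n) at which grains leave a compartment eventually decreases with the occupancy n, because a crowded compartment is internally colder. Any small imbalance then grows until one side holds nearly everything:

[code]
from math import exp

def flux(n, b=0.1):
    # Toy escape rate: assumed to fall off once b*n > 1,
    # mimicking a crowded, internally "cold" compartment
    return n * exp(-b * n)

n1, n2, dt = 51.0, 49.0, 0.01    # slight initial imbalance
for _ in range(20000):
    dn = (flux(n2) - flux(n1)) * dt   # net flow into compartment 1
    n1, n2 = n1 + dn, n2 - dn

print(round(n1, 2), round(n2, 2))    # nearly all grains on one side
[/code]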

Gerenuk said:
But you can also do calculations with a deck of cards or just about any system imaginable. For cards, of course, no one wants to speak about the actual temperature that you would feel with your fingers.
So it requires some more thought to come up with a useful definition of entropy and heat for arbitrary systems.
It's easy to do with [itex]S=\ln\Omega[/itex], but you have to think about what a useful form of [itex]\Omega[/itex] is.

The definitions of heat and entropy are consistently applied across all of thermodynamics; there is no need to "define" them for each system. The statistical models describing the partitioning of internal energy among various degrees of freedom, which you seem to be so fond of, are useful for understanding *why* the observed laws of thermodynamics are correct for a given system. However, contrary to what you have been claiming, one can certainly apply those observational laws successfully without knowing the detailed microscopic behavior of a given system. That is the beauty of statistical thermodynamics: it converges smoothly to macroscopic thermodynamics in the limit of large systems. Thus the statistical second law (i.e. the fluctuation theorem) is completely analogous to the thermodynamic second law for macroscopic systems.

Finally, despite your claims to the contrary, I do not believe that any credible scientific sources have demonstrated second-law violations in macroscopic systems. The examples you have mentioned so far involve spontaneous decreases of entropy for isolated systems *of small size*. These systems are therefore subject to the entropy-decreasing fluctuations predicted by the fluctuation theorem; however, the second law in such contexts has a more general form. It simply says that the time-integrated probability of entropy-increasing fluctuations is always greater than the time-integrated probability of entropy-decreasing fluctuations. Thus, even if all the gas molecules in your earlier example congregated momentarily in the corner of the box, in the next instant they would disperse again, and the time-integrated entropy would increase or remain constant (if it had already reached its maximum value).
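For reference, the statement being paraphrased can be put compactly. With [itex]\bar{\sigma}_t[/itex] the entropy production rate averaged over an interval [itex]t[/itex] (in units of [itex]k_B[/itex]), the fluctuation theorem reads
[tex]\frac{P(\bar{\sigma}_t = A)}{P(\bar{\sigma}_t = -A)} = e^{A t}[/tex]
so entropy-decreasing trajectories are exponentially suppressed relative to entropy-increasing ones, and the average entropy production is non-negative.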
 
  • #74
SpectraCat said:
I honestly have no idea what you are trying to say. Eggers didn't "extend" anything ... he used a medium
I wasn't trying to discuss that particular effect. The point is that thermodynamics can be applied to anything if you know how. Do you know how to apply thermodynamics to a set of red and black marbles? You are contradicting yourself if you don't know that but still claim that all of the universe obeys the second law.

SpectraCat said:
The definitions of heat and entropy are consistently applied across all of thermodynamics; there is no need to "define" them for each system.
You are probably talking about your narrow notion of thermodynamics, where it's about heat that can be measured by a mercury thermometer. Do you know how to deal with the marbles then?

SpectraCat said:
Finally, despite your claims to the contrary, I do not believe that there are any credible scientific sources who have demonstrated thermodynamic second law violations in macroscopic systems.
Did you understand the thread link I posted above?
Of course there is no scientific work that shows that TD is violated for tram schedules, decks of cards, and so on. That's because no one has claimed that the second law applies strictly to anything beyond the realm of "common sense heat" measurable by mercury thermometers. Likewise, there is no scientific work proving that strawberries cannot be used as rocket fuel.

Here is a question to you:
Do you know how to derive
[tex]S=-\sum_i p_i\ln p_i\,?[/tex]
If so, then you can check the assumptions made in that proof. These assumptions hardly apply to anything in the real world.

SpectraCat said:
Thus, even if all the gas molecules in your earlier example congregated momentarily in the corner of the box, in the next instant they would disperse again, and the time-integrated entropy would increase or remain constant (if it had already reached its maximum value).
How long do you want to time-integrate? If the particles gather in one corner over and over again, what's the point of saying they don't?
 
  • #75
Gerenuk said:
I wasn't trying to discuss that particular effect.

Then why did you bring up that example and link to the article?

Gerenuk said:
The point is that thermodynamics can be applied to anything if you know how. Do you know how to apply thermodynamics to a set of red and black marbles?

I assume here that you are talking about statistics rather than thermodynamics; they are not quite the same thing, but you don't seem to realize that. I certainly understand how to apply statistics to an ensemble of red and black marbles.

Gerenuk said:
You are contradicting yourself if you don't know that but still claim that all of the universe obeys the second law.

What on Earth are you talking about?

Gerenuk said:
You are probably talking about your narrow notion of thermodynamics, where it's about heat that can be measured by a mercury thermometer. Do you know how to deal with the marbles then?

Oh goody .. more insults from you ... and another incorrect statement. Heat is not what thermometers measure.

Gerenuk said:
Did you understand the thread link I posted above?

Yes, I understand it just fine .. I have referred to it several times, and explained in some detail why it does not have the significance you are ascribing to it.

Gerenuk said:
Of course there is no scientific work that shows that TD is violated for tram schedules, decks of cards, and so on. That's because no one has claimed that the second law applies strictly to anything beyond the realm of "common sense heat" measurable by mercury thermometers.

Again you say you think thermometers are used to measure heat ... hmmm. Oh, and people have certainly claimed that the second law applies to everything in the universe ... it is widely held to be one of the most fundamental physical laws, and the least likely to be broken. The fact that you don't realize this is rather strange ... if you could find or construct a system that reliably violated the second law, you could make yourself very rich providing free energy to the world.

Gerenuk said:
Likewise, there is no scientific work proving that strawberries cannot be used as rocket fuel.

:eek:

Gerenuk said:
Here is a question to you:
Do you know how to derive
[tex]S=-\sum_i p_i\ln p_i\,?[/tex]
If so, then you can check the assumptions made in that proof. These assumptions hardly apply to anything in the real world.

The above is an equation from information theory, and provides the definition of information entropy. As written, it has little to do with physics in particular, just the statistics of abstract systems, which I guess is what you are trying to say. However, if you put the Boltzmann constant out front, and sum over the energy states of a physical system, with the [itex]p_i[/itex] as the occupation probabilities of those states, then you have the Gibbs entropy from statistical mechanics. I agree that the proof is purely mathematical, but most proofs are, so I don't understand your point. What gives these equations their significance is the correlation between the variables in the equation and "real entities" that have meaning in the physical world. Yes, the same statistics work for a bag containing marbles of two colors and a vial containing two atomic gases (assuming the temperature is low enough that only one electronic energy state is populated). So what?
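To underline the "so what": the formula really is indifferent to what the [itex]p_i[/itex] label, which a few lines make plain (a sketch of my own; multiply by [itex]k_B[/itex] to get the physical Gibbs entropy):

[code]
from math import log

def shannon_entropy(probs):
    # S = -sum_i p_i ln(p_i), skipping p = 0 terms (0 ln 0 -> 0)
    return -sum(p * log(p) for p in probs if p > 0)

# Identical statistics, two physical readings: a bag of 30% red /
# 70% black marbles, or a binary mixture with those mole fractions
print(shannon_entropy([0.3, 0.7]))   # ~0.611 per marble or molecule
[/code]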

Gerenuk said:
How long do you want to time-integrate? If the particles gather in one corner over and over again, what's the point of saying they don't?

The time interval needs to be long enough so that the event in question has a reasonable chance of occurring. For your gas molecule example, if you are talking about a macroscopic sample (say 10^23 molecules), the probability that they will collect spontaneously in the corner of a box is infinitesimal, so you will need to integrate for an awfully long time, much longer than the lifetime of the universe, in order to have any chance of observing it once, let alone multiple times. Again, this whole exercise is an example of applying the fluctuation theorem in a situation where it doesn't really apply, since we are talking about a macroscopic sample. Furthermore, if you are using the FT, then you should use the version of the second law that is consistent with the FT, as I already explained. So there is no contradiction, and the second law is never violated.
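A quick estimate shows just how hopeless the wait is: even granting the molecules the whole left half of the box rather than a corner, if each of [itex]N = 10^{23}[/itex] molecules independently has probability 1/2 of being there, the chance of finding all of them on that side at a given instant is
[tex]\left(\tfrac{1}{2}\right)^{10^{23}} \approx 10^{-3\times 10^{22}}[/tex]
which no integration time remotely comparable to the age of the universe can overcome.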
 
  • #76
SpectraCat said:
I assume here that you are talking about statistics rather than thermodynamics; they are not quite the same thing, but you don't seem to realize that. I certainly understand how to apply statistics to an ensemble of red and black marbles.
That's what I was saying all along. You have no clue that thermodynamics can be applied to a completely arbitrary abstract system. Consider an approach similar to
http://arxiv.org/abs/math-ph/0003028
where the theory leaves completely open whether it is applied to a gas or to a set of marbles.
But as I do not get the impression that you are ready to acquire new knowledge, this case is closed for me.
 
  • #77
Gerenuk said:
That's what I was saying all along. You have no clue that thermodynamics can be applied to a completely arbitrary abstract system.

Nothing I have written is consistent with the statement you make above. I am perfectly aware that this is the case. Read my posts carefully, and you will see that this is true.

Gerenuk said:
Consider an approach similar to
http://arxiv.org/abs/math-ph/0003028
where the theory leaves completely open whether it is applied to a gas or to a set of marbles.
But as I do not get the impression that you are ready to acquire new knowledge, this case is closed for me.

I have chosen science as a career precisely because I want to acquire new knowledge. I started debating here with you (and put up with your steady stream of arrogant, insulting statements) because I thought you might have something interesting to say, even if you seemed a bit confused at times. In fact, I do appreciate the link above; that paper is very interesting, and it seems that I will certainly learn something new from it once I have time to analyze it in detail. However, I do want to call your attention to a sentence from the second page:
No exception has ever been found to the second law of thermodynamics—not even a tiny one. Like conservation of energy (the “first” law) the existence of a law so precise and so independent of details of models must have a logical foundation that is independent of the fact that matter is composed of interacting particles.

That passage is basically what I, and Andy Resnick, and DrDu, and perhaps others, have tried to tell you. The second law holds in all cases, even though we may not understand all of the details of why it holds.

You seem to have the bizarre point of view that the standard notions of classical and statistical thermodynamics learned from books are somehow useless, or at least less valuable than the "new" approaches you have been advocating here. On the contrary, the reason I can read and appreciate the article you posted is that I have a decently thorough understanding of classical and statistical thermodynamics as it is taught in textbooks.
 
  • #78
Why do cosmologists say that the universe was hotter in the past? How do they know?
What about the entropy of the universe?
 
  • #79
We have to watch out for circular reasoning - that's why the laws of thermodynamics are ordered the way they are. You can't define temperature in terms of entropy unless entropy is operationally defined (i.e. a measurement process is specified) in a way that is independent of the concept of temperature - and that's not likely.

Some things have to be taken as given a priori. We have to assume that we know the mechanical parameters that fully describe the state of a system - P, V, and n in the case of a gas. We have to assume that we can thermally, mechanically, and substantially isolate a system, and that we can thermally, mechanically, and substantially connect two or more systems. Let's ignore the transfer of substance and just deal with thermal and mechanical aspects (i.e. assume all systems are substantially isolated). We have to assume that we know when a totally isolated system is in equilibrium: i.e. when its mechanical parameters stop changing after a long period of time. We have to assume we can make state transitions reversibly, i.e. very slowly.

The zeroth law says that you can label every set of systems in thermal equilibrium with each other with a unique label: if the labels match, they are in thermal equilibrium; if not, they are not. These labels can be totally arbitrary - they don't even have to be numbers - as long as they are unique for each set of systems in equilibrium.
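As an aside, that labeling is exactly the construction of equivalence classes, and it is easy to mechanize (a toy sketch with invented system names, nothing more):

[code]
# Pairwise "found to be in thermal equilibrium" observations
# partition systems into classes that share a single label
observations = [("A", "B"), ("B", "C"), ("D", "E")]  # made-up data

parent = {}

def find(x):
    # Follow parent links to the class representative (the "label")
    parent.setdefault(x, x)
    while parent[x] != x:
        parent[x] = parent[parent[x]]   # path compression
        x = parent[x]
    return x

def union(x, y):
    parent[find(x)] = find(y)

for a, b in observations:
    union(a, b)

systems = sorted({s for pair in observations for s in pair})
print({s: find(s) for s in systems})
# A, B, C share one label; D, E share another
[/code]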

The first law states that if you mechanically connect a system to something that does work (-PdV) reversibly on the system, but thermally isolate it, then you can assign an energy change (dU) to the system, and the total energy is conserved. You can use the mechanical parameters to come up with an energy equation of state U(P,V), with dU = -PdV along such adiabatic paths in the case of a gas. If you now thermally connect the system, you define heat as the difference between the change in internal energy as measured by U(P,V) and the (reversible) work done on the system. So now heat [itex]\delta Q[/itex] is defined: [itex]\delta Q = dU + P\,dV[/itex]
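A numerical caricature of that operational definition (all numbers invented purely for illustration):

[code]
# Step 1: an adiabatic path between states 1 and 2 fixes the
# internal-energy change, since no heat can flow
W_on_adiabatic = 150.0     # J of reversible work, thermally isolated
dU = W_on_adiabatic        # first law with Q = 0: dU = W_on

# Step 2: the same change of state, now with thermal contact;
# whatever energy the work does not supply must have entered as heat
W_on_diathermal = 90.0     # J of work measured this time
Q = dU - W_on_diathermal
print(Q)                   # 60.0 J of heat flowed in
[/code]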

The second law is then used to define absolute thermodynamic temperature by using, e.g., a Carnot cycle. A Carnot cycle can be described without the use of temperature or entropy, needing only the concepts of work, heat, and energy from the first law. It needs no ideal gas; it works for any system. The absolute thermodynamic temperature is then defined as a function of the labels provided by the zeroth law, and its properties are demonstrated using the second law as applied to the Carnot cycle. Now, having defined temperature, you can get to work on the definition of entropy. You can also experimentally observe that, in the limit of low density, gases obey the law that PV/n takes on the same value for all gases at a given temperature, and that value is proportional to the thermodynamic temperature, thus defining an ideal gas. Now you can use the ideal gas as a thermometer to measure temperature and calibrate other thermometers.
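A sketch of that final calibration step (the readings below are invented; the zero-pressure extrapolation is the real procedure):

[code]
import numpy as np

# pV/n (J/mol) measured for a gas bulb held at one fixed temperature,
# at successively lower filling pressures p (Pa) -- illustrative data
p   = np.array([100e3, 50e3, 25e3, 10e3])
pVn = np.array([2478.0, 2477.0, 2476.4, 2476.1])

# Real gases approach ideal behaviour roughly linearly in p,
# so extrapolate pV/n to the zero-pressure limit
slope, intercept = np.polyfit(p, pVn, 1)

R = 8.314  # J/(mol K)
print(intercept / R)   # thermodynamic temperature, ~298 K
[/code]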
 
  • #80
Gerenuk said:
How do you justify adiabatic and equilibrium? For that you need a notion of entropy. So how do you define entropy here?

So this thread is open again?
A system is adiabatically isolated if it is not influenced by its surroundings (other than by work done on it). So it doesn't matter whether the system is brought to the equator or to the North Pole. This definition doesn't presuppose any notion of entropy.
The system is in equilibrium if the macro variables which describe it don't change in time.
I agree that it is a difficult question to decide what the thermodynamically relevant variables of a system are. There are also systems where one eventually has to modify the definitions, e.g. the thermodynamics of a star (a big, self-gravitating object), or even non-equilibrium phenomena.
 
  • #81
Rap said:
The first law states that if you mechanically connect a system to something that does work (-PdV) reversibly on the system, but thermally isolate it, then you can assign an energy change (dU) to the system, and the total energy is conserved. You can use the mechanical parameters to come up with an energy equation of state U(P,V), with dU = -PdV along such adiabatic paths in the case of a gas. If you now thermally connect the system, you define heat as the difference between the change in internal energy as measured by U(P,V) and the (reversible) work done on the system. So now heat [itex]\delta Q[/itex] is defined: [itex]\delta Q = dU + P\,dV[/itex]

For the definition of internal energy it is decisive that the process, while adiabatic, does not need to be reversible. For the same reason I would also generalize the definition of heat to [tex]Q=\Delta U-W,[/tex] which holds even if the process is irreversible, and not only for infinitesimally neighbouring equilibrium states.
The definition of U via adiabatic work is also not worth a theorem today: since the internal energy of a system equals [itex]mc^2[/itex], we can in principle measure it directly using a balance.
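"In principle" is carrying a lot of weight there, of course: heating 1 kg of water by 10 K changes its mass by only
[tex]\Delta m = \frac{\Delta U}{c^2} \approx \frac{4.2\times 10^{4}\ \mathrm{J}}{(3\times 10^{8}\ \mathrm{m/s})^2} \approx 5\times 10^{-13}\ \mathrm{kg},[/tex]
far below what any real balance resolves.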
 
  • #82
I was just working on thermodynamics and decided to respond without checking the date. Anyway, I agree with everything you said, but I think that, to begin with, you have to be able to specify the state without using the quantities defined in the laws. That means heat, internal energy, temperature, entropy, etc. are off-limits; only the mechanical variables, the ones used to define work, may appear - P and V in the case of a gas.
 
  • #83
Rap said:
I was just working on thermodynamics and decided to respond without checking the date. Anyway, I agree with everything you said, but I think that, to begin with, you have to be able to specify the state without using the quantities defined in the laws.
I agree on that.
Rap said:
That means heat, internal energy, temperature, entropy, etc. are off-limits; only the mechanical variables, the ones used to define work, may appear - P and V in the case of a gas.
You could in principle also use, e.g., the refractive index as one of the coordinates, which is not directly a mechanical coordinate. So there is still considerable freedom.
 