Gibbs paradox: an urban legend in statistical physics

In summary, the conversation discusses the misconception surrounding the mixing of distinguishable particles and the supposed paradox in classical statistical physics. The speaker found that there is no real paradox, a conclusion supported by various references and papers. The correction factor of ##1/N!## in the entropy calculation is not a paradox but a logical necessity: it is imposed by combinatorial logic, not introduced merely to obtain an extensive entropy. This is different from Gibbs' definition of entropy in terms of phase-space volume. Within classical mechanics there is no logical reason to include this factor, whereas in quantum theory it results from indistinguishability. The correction factor persists in the classical limit and is a natural consequence of Boltzmann's counting.
  • #141
vanhees71 said:
As I said several times the inclusion of the correct factor ##1/N!## is due to quantum-mechanical indistinguishability of identical constituents (atoms/molecules, etc).

What about colloids, granular systems, or a computer program that solves Hamiltonian dynamics? You cannot appeal to quantum mechanics to include the ##1/N!## in the entropy of these systems.

vanhees71 said:
If you have a system consisting of multiple distinguishable constituents you have to count accordingly, as shown in the example with the separated volume in the Gibbs-paradox setup.

I do not follow. What do you mean by "you have to count accordingly"?
 
  • #142
For a computer program that solves Hamiltonian dynamics I don't need to evaluate the entropy, I just solve the equations.

If particles are distinguishable you count them as distinguishable. Where is the problem?
 
  • #143
vanhees71 said:
For a computer program that solves Hamiltonian dynamics I don't need to evaluate the entropy, I just solve the equations.
Would you say that the fact that you don't need to evaluate the entropy means that you can't evaluate the entropy?

vanhees71 said:
If particles are distinguishable you count them as distinguishable. Where is the problem?

My problem is that I would like you to clarify whether this way of counting them as distinguishable is consistent with the way Ehrenfest counted in 1920, van Kampen in 1984, and Swendsen in 2009.
 
  • #144
vanhees71 said:
We discuss in circles. Just once more: For distinguishable particles there IS mixing entropy!
Just to be sure:
You agree that the buckyballs of HPt are distinguishable quantum particles?
Do you agree that in the setting proposed by HPt, where each pair among a large number of particles is distinguishable, there will be no mixing entropy?
Don't you think that your statement can be misleading? Especially considering that your post was number 37, while post 35 was
HPt said:
No, QT is not needed. In section 4 of my paper I show that there is no entropy increase when mixing distinguishable identical classical particles.

I would say that I take issue with you (apparently) contradicting HPt's conclusion without addressing his arguments.
 
  • #145
autoUFC said:
Would you say that the fact that you don't need to evaluate the entropy means that you can't evaluate the entropy?
My problem is that I would like you to clarify whether this way of counting them as distinguishable is consistent with the way Ehrenfest counted in 1920, van Kampen in 1984, and Swendsen in 2009.
It is at least very difficult to calculate the entropy within a (quantum) molecular dynamics simulation.

I don't know how all these people counted. Can you tell me the specific problem you have with counting for multicomponent systems? I've not understood where you think there is a problem.
 
  • #146
vanhees71 said:
It is at least very difficult to calculate the entropy within a (quantum) molecular dynamics simulation.

I don't know how all these people counted. Can you tell me the specific problem you have with counting for multicomponent systems? I've not understood where you think there is a problem.

I see no problem. The arguments of those authors show that there is no entropy of mixing in the classical model of particles.

They also show that you do not need to appeal to quantum mechanics to justify the inclusion of the ##1/N!## in the definition of entropy for this model. A clever extension of this same argument was employed by HPt to show that the problem of distinguishable particles persists in quantum mechanics. Of course, in the setup proposed by HPt, the ##1/N!## is there not because of the symmetries of identical quantum particles (HPt's particles are not identical). As in the cases of Ehrenfest, van Kampen, and Swendsen, the ##1/N!## is there because the particles are distinguishable.

The arguments of Swendsen are presented in Wikipedia. The arguments of van Kampen are discussed in HPt's paper.

I was ignorant of this until recently. Maybe most readers of this post are not surprised by it. As I said, I was shocked. I am not aware of any statistical mechanics textbook that explains this precisely. In fact, as van Kampen mentions, several books say that you NEED to appeal to quantum mechanics. That is not true, as explained.
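To give a flavor of the combinatorics behind such arguments (this is a sketch only, not the actual derivation of Ehrenfest, van Kampen, or Swendsen): for ##N## distinguishable classical particles free to occupy either half of a box, the number of ways to have ##n_1## of them on the left is the binomial coefficient, and it is this permutation counting, not quantum mechanics, that produces the factorials.

```python
from math import lgamma, log

def ln_binom(n, k):
    """ln C(n, k) via log-gamma, stable for large n."""
    return lgamma(n + 1) - lgamma(k + 1) - lgamma(n - k + 1)

# N distinguishable classical particles, each independently in the left or
# right half of a box:  ln P(n1 on the left) = ln C(N, n1) - N ln 2.
# The C(N, n1) term is exactly the permutation counting that the 1/N!
# bookkeeping accounts for.
N = 10**4
lnP_even = ln_binom(N, N // 2) - N * log(2)  # most probable (even) split
lnP_all_left = -N * log(2)                   # all N particles on one side

print(lnP_even)      # order -ln(N): tiny on the scale of N
print(lnP_all_left)  # order -N: overwhelmingly improbable
```

The even split is overwhelmingly dominant, which is why the uncertainty about particle numbers contributes so little to the entropy for large ##N##.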
 
  • #147
autoUFC said:
However, I do not understand why you say that there is a paradox. Wouldn't you say that equations 24 and 31 of your paper show that there is no paradox? As you said in other post, there is no entropy of mixing for the model of a gas of classical identical particles. Also you show that there is no entropy of mixing in the gas of pairwise distinguishable quantum particles. What paradox remains?

A paradox, as I understand it, is a seeming contradiction between two statements that are both believed to be true. In section 2 of my paper I frame the Gibbs paradox in such a way that it reveals a contradiction between the conventional entropy calculation and the second law. I resolve this contradiction by showing where the counting of microstates in the conventional calculation goes wrong.
By contrast, most textbooks "resolve" this contradiction by denying its prerequisite, that is, they claim that identical particles must be indistinguishable. This statement is true in QM, but it resolves the Gibbs paradox neither for classical particles (which may be conceived as identical but still distinguishable) nor for pairwise non-identical quantum particles, which can be treated like identical distinguishable particles if they are sufficiently similar (such as the buckyballs in my paper). Other authors, such as Jaynes, "resolve" the contradiction by modifying the entropy expression, i.e. they don't adhere to ##S=k \ln \Omega##.

hutchphd said:
Can you give the gist of the argument here?

Here's an ELI5 attempt:
Suppose you have 4 different buckyballs a,b,c and d in a volume ##V##. Now you insert a partition in ##V##, thereby dividing ##V## into two subvolumes ##V_1## and ##V_2##. For simplicity, let's ignore all entropy contributions that stem from the uncertainty about location and momentum. Let's also ignore the entropy contribution that stems from the uncertainty about the particle numbers in ##V_1## and ##V_2## by assuming that each subvolume contains 2 buckyballs. With these simplifications, the only remaining entropy contribution stems from the uncertainty about the particle composition. So, what is the entropy of ##V_1##? Well, you have to count the microstates and apply ##S=k \ln \Omega## . The microstates are
  1. a and b are in ##V_1## ("ab" for short)
  2. ac
  3. ad
  4. bc
  5. bd
  6. cd
Hence, the entropy of ##V_1## is ##k \ln 6##.
A similar counting yields the same entropy for ##V_2##.
Now, what is the entropy of ##V##? Let's count the microstates again:
  1. a and b are in ##V_1##, c and d are in ##V_2## ("ab|cd" for short)
  2. ac|bd
  3. ad|bc
  4. bc|ad
  5. bd|ac
  6. cd|ab
So, the entropy of ##V## is also ##k \ln 6##.
Obviously, the entropy of ##V## is not the sum of the entropies of ##V_1## and ##V_2##. Why is that? Because ##V_1## and ##V_2## are correlated: whenever, e.g., a is in ##V_1## it cannot be in ##V_2##, that is, a microstate such as ab|ad is not possible.
This simple example shows that the entropy of systems of distinguishable particles is not additive. In my paper I show that this non-additivity nicely compensates the non-extensivity that afflicts systems of distinguishable particles, and resolves the seeming contradiction known as the Gibbs paradox.
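The counting above can be checked mechanically. A minimal Python sketch (labels a–d as in the post, entropies in units of ##k##) that enumerates the microstates of ##V_1## and of ##V##:

```python
from itertools import combinations
from math import log, isclose

balls = ("a", "b", "c", "d")

# Microstates of V1 alone: which 2 of the 4 buckyballs it contains.
v1_states = list(combinations(balls, 2))      # ab, ac, ad, bc, bd, cd

# Microstates of the whole volume V: choosing V1's pair fixes V2's pair,
# so correlated states like "ab|ad" never appear.
v_states = [(s, tuple(b for b in balls if b not in s)) for s in v1_states]

S_v1 = log(len(v1_states))  # entropy of V1 in units of k  ->  ln 6
S_v = log(len(v_states))    # entropy of V  in units of k  ->  ln 6

print(len(v1_states), len(v_states))  # 6 6
print(isclose(S_v, S_v1))             # True
print(S_v < 2 * S_v1)                 # True: entropy is not additive
```

By symmetry ##V_2## also has 6 microstates, so ##S(V_1)+S(V_2)=2k\ln 6 > k\ln 6 = S(V)##, which is the non-additivity the post describes.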
 
  • #148
HPt said:
Let's also ignore the entropy contribution that stems from the uncertainty about the particle numbers in V1 and V2 by assuming that each subvolume contains 2 buckyballs.
Let's not. What do you get?
 
  • #149
hutchphd said:
Let's not. What do you get?
For large particle numbers, the entropy contribution stemming from the uncertainty about how many particles are located in each subsystem is negligible. For that reason, it isn't relevant in the present context of the Gibbs paradox. Nevertheless, as an aside, this uncertainty explains why even the entropy of an ideal gas of indistinguishable particles is not strictly extensive: you only get the extensive Sackur-Tetrode equation if you apply Stirling's formula to the ##\ln (1/N!)## term and ignore all its subleading corrections.
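The size of those subleading corrections is easy to see numerically. A sketch comparing the exact ##\ln N!## (via the log-gamma function) with the leading Stirling term ##N\ln N - N## that yields the extensive Sackur-Tetrode form:

```python
from math import lgamma, log, pi

def ln_factorial(n):
    return lgamma(n + 1)   # exact ln(N!)

def stirling_leading(n):
    return n * log(n) - n  # keeping only this part makes S extensive

for n in (10**3, 10**6, 10**9):
    dropped = ln_factorial(n) - stirling_leading(n)
    # The dropped piece is ~ (1/2) ln(2 pi n): it grows with n, but only
    # logarithmically, so it is negligible next to terms of order n.
    print(n, dropped, 0.5 * log(2 * pi * n))
```

The dropped term grows like ##\tfrac12\ln(2\pi N)##, so strict extensivity fails, but by an amount that vanishes relative to ##N## in the thermodynamic limit.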
 
  • #150
HPt said:
This simple example shows that the entropy of systems of distinguishable particles is not additive.
That is not what this shows at all. We can, I think, agree that this demonstration would not be interesting for identical particles.
But for distinguishable particles the lack of additivity comes from the correlations between the "subsystems": they are simply not statistically independent. Incidentally, if you partition it 1 ball | 3 balls the answer is also the same. Also not interesting.
 
  • #151
autoUFC said:
So we have two possibilities.
1) You start already in equilibrium when you remove the partition. Then you have no entropy increase. You may put the partition back and remain in equilibrium. At every moment you have extensivity.

2) You start out of equilibrium when you remove the partition. Then you have the partial pressures of each kind of gas to define the macroscopic state of each partition. Again, you have extensivity.

In the mixing scenario that has been under discussion throughout this entire thread, we are talking about 2). In your 1) above, the partial pressure of both gases must be the same on both sides of the partition, and that is not the initial condition we have been discussing. Of course if the gases are already mixed, you can insert and remove the partition as much as you like and it won't change anything--not entropy, and not the partial pressures of either gas. But that is not the scenario we have been discussing.

In the scenario we have been discussing, i.e., 2) above, the partial pressure of each gas is zero on one side of the partition, and positive on the other side. So yes, this partial pressure differential can be viewed as driving the mixing process. And, as I have said before, if you include a chemical potential term in your analysis (which here will include the partial pressure of each gas), you can keep track of the mixing process with it.

However, note carefully that, in case 2) as I have just described it, there is still mixing entropy. The process of mixing is still not reversible. And Jaynes' general point about all such cases still holds: in such cases it doesn't even make sense to talk about extensivity of entropy, because macroscopic variables are changing.
 
  • #152
autoUFC said:
You said that Jaynes writes that entropy of mixing and vapor pressure are instances of non-extensivity. Can you tell us what in Jaynes text gave you this impression?

Jaynes' discussion in sections 5, 6, and 7 of his paper shows, to begin with, that assigning any entropy at all to a system depends on how we choose to partition its microstates into macrostates. His discussion throughout the paper also shows that extensivity of entropy is not derived from first principles in most treatments, but simply assumed without argument. Probably the best general statement he gives for what you come up with when you try to actually derive conclusions about extensivity of entropy from first principles is item (b) at the bottom of p. 12 of the paper.

The discussion of mixing in section 5 of Jaynes' paper also makes the important point that, to assign entropy to a system at all, we have to make assumptions about how the microscopic state space of the system is partitioned in terms of macroscopic variables. Which means that assignments of entropy depend on our technological capabilities, since those determine what we consider to be the relevant macroscopic variables. So claims like "entropy is extensive", even in cases where they are true, must be understood as claims about our current knowledge, not inherent properties of the system. Jaynes' discussion of how one experimenter with technical capabilities that a second experimenter lacks can create effects that look to the second experimenter like second law violations is instructive.
 
  • #153
autoUFC said:
Most books and all my professors suggest that an extensive entropy could not be defined for distinguishable particles.

Btw, to take the discussion back to your original post that started this thread, Jaynes agrees with you that it is possible to define an extensive entropy for distinguishable particles. He shows how to do it.
 
  • #154
HPt said:
Hi, I'm the author of the paper Eur. J. Phys. 35 (2014) 015023 (also freely available at arXiv). As I explicitly demonstrate in my paper, there is a paradox that manifests as a contradiction to the second law
PeterDonis said:
In the scenario we have been discussing, i.e., 2) above, the partial pressure of each gas is zero on one side of the partition, and positive on the other side. So yes, this partial pressure differential can be viewed as driving the mixing process. And, as I have said before, if you include a chemical potential term in your analysis (which here will include the partial pressure of each gas), you can keep track of the mixing process with it.

However, note carefully that, in case 2) as I have just described it, there is still mixing entropy. The process of mixing is still not reversible. And Jaynes' general point about all such cases still holds: in such cases it doesn't even make sense to talk about extensivity of entropy, because macroscopic variables are changing.

In the problem of thermal contact, macroscopic variables are also changing. However, you said that exchange of heat is not a violation of extensivity. Why would exchange of particles be?
 
  • #155
PeterDonis said:
The discussion of mixing in section 5 of Jaynes' paper also makes the important point that, to assign entropy to a system at all, we have to make assumptions about how the microscopic state space of the system is partitioned in terms of macroscopic variables. Which means that assignments of entropy depend on our technological capabilities, since those determine what we consider to be the relevant macroscopic variables. So claims like "entropy is extensive", even in cases where they are true, must be understood as claims about our current knowledge, not inherent properties of the system. Jaynes' discussion of how one experimenter with technical capabilities that a second experimenter lacks can create effects that look to the second experimenter like second law violations is instructive.

Yes. Jaynes is instructive. However, I do not completely agree with the way he states his arguments. Maybe I misunderstand, but reading this discussion I have the impression that he argues that the increase in entropy due to mixing depends on having the capability of segregating the mixed gases.
That gets us to the uncomfortable situation where the entropy of a system depends on our current capabilities. There is the question of a large number of pairwise distinguishable particles. We have agreed that in this case entropy may or may not increase in mixing. As you said, if one ignores where each of the particles was before mixing, there is no entropy increase due to mixing.

I say additionally that entropy increases only if you have a simple rule describing how the particles are separated before mixing. My particular way of thinking is that our capabilities only show how to convert the free energy before mixing into work. But entropy in a mixing process will increase as long as there is a simple way to describe the initial state.

Maybe these two ways of understanding this question are like interpretations of quantum mechanics. That is, they lead to the same predictions.

If you have a simple way to determine how particles are segregated before mixing, you could have semi-permeable membranes that only let through particles that were initially in one subsystem. With these semi-permeable membranes you can convert free energy into work.

If the only way to determine how particles are segregated before mixing is with a list of the initial positions of each particle, a semi-permeable membrane would be a Maxwell demon.

Jaynes' interpretation is based on the experimenter's capabilities. The interpretation I am most comfortable with is based on the experimenter's information.
 
  • #156
PeterDonis said:
Btw, to take the discussion back to your original post that started this thread, Jaynes agrees with you that it is possible to define an extensive entropy for distinguishable particles. He shows how to do it.
Yes. We know that including the ##1/N!## leads to an extensive entropy, and Jaynes (following Pauli) tells us that. However, he does not present the arguments based on the inclusion of the permutation term that accounts for the ignorance of which particles are in which subsystem after the barrier is removed.

As he writes
"Note that the Pauli analysis has not demonstrated from the principles of physics that entropy
actually should be extensive; it has only indicated the form our equations must take if it is. "
 
  • #157
autoUFC said:
I see no problem. The arguments of those authors show that there is no entropy of mixing in the classical model of particles.
But there must be mixing entropy for non-identical gases which were separated by the divider wall and then diffuse over the entire volume after taking out the wall.

As I said before, I cannot find a real-world experimental measurement to demonstrate this.

I also still don't see a convincing argument for a strictly classical derivation of the correct extensive entropy without invoking quantum-mechanical arguments (absolute phase-space element scale being ##h=2 \pi \hbar## and indistinguishability of identical particles).
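For reference, the ideal-gas bookkeeping that both positions appeal to takes only a few lines. A sketch (not taken from any of the cited papers) using ##S/k = N\ln V - \ln N!##, with all terms that cancel in the comparisons omitted:

```python
from math import lgamma, log

def S(N, V):
    """Ideal-gas entropy in units of k, up to terms that cancel below:
    S/k = N ln V - ln N!  (the 1/N! counting factor included)."""
    return N * log(V) - lgamma(N + 1)

N, V = 10**6, 1.0

# Two NON-identical gases, N particles each: first each confined to its own
# volume V, then both spread over the joint volume 2V after mixing.
dS_distinct = (S(N, 2 * V) + S(N, 2 * V)) - (S(N, V) + S(N, V))
print(dS_distinct / (2 * N))  # -> ln 2 per particle: the mixing entropy

# The SAME gas on both sides: after removing the wall it is one gas of
# 2N particles in 2V, and the factorials nearly cancel the volume term.
dS_same = S(2 * N, 2 * V) - 2 * S(N, V)
print(dS_same)  # ~ (1/2) ln(pi N): negligible next to N
```

So with the ##1/N!## included, the standard result is ##\Delta S = 2Nk\ln 2## for non-identical gases and only a subextensive remainder for identical ones; the thread's disagreement is about the distinguishable-particle case, not about this arithmetic.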
 
  • #158
vanhees71 said:
Of course, if you have a gas consisting of two kinds of non-identical particles in a large volume you don't separate them by simply putting in a divider adiabatically. You'd just have two parts of the mixed gas in thermal equilibrium and no entropy change in putting in the divider (that's a tautology, because adiabatic means it's without change of entropy). To separate the non-identical particles in two distinct parts of the volume you need to do work and the entropy of the gas sorted into the two parts is lowered. Of course, you need to increase the entropy elsewhere, and the answer to that puzzle is due to Landauer and Szilard using the information-theoretical approach to entropy in connection to the famous "Maxwell demon". Here you have a demon sorting non-identical particles, while in the original Maxwell-demon setup the demon sorts particles by energy. The principle is of course the same.

That this information-theoretical approach is the correct one has been empirically shown recently also in the context with quantum statistics. The quantum Maxwell demon precisely worked as predicted using the information-theoretical approach.

Although I believed in a Landauer-type Maxwell-demon explanation of the absence of mixing entropy before, and even expressed this earlier in this thread, I no longer think it is relevant to explaining the mixing entropy of distinguishable particles. The example of mixing DNA showed that you can have a fishing rod for distinguishable particles and use it to make the mixing reversible. While the fishing rod contains information on the particle, it can be re-used, so that a thermodynamic cycle can in principle be run arbitrarily often. Furthermore, the generation of the fishing rod and the DNA may itself take place in a random fashion, so that one does not necessarily have to record the identity of every single particle. It also sheds light on the question of extensivity. With a finite-size label, you cannot label an infinite number of particles. This also holds true for isotopic labelling: obviously, there is only a finite number of ways to isotopically label a molecule, though this number may be huge.
 
  • #159
vanhees71 said:
But there must be mixing entropy for non-identical gases which were separated by the divider wall and then diffuse over the entire volume after taking out the wall.

As I said before, I cannot find a real-world experimental measurement to demonstrate this.

I also still don't see a convincing argument for a strictly classical derivation of the correct extensive entropy without invoking quantum-mechanical arguments (absolute phase-space element scale being ##h=2 \pi \hbar## and indistinguishability of identical particles).

The question of classical versus quantum becomes irrelevant considering the setup proposed by HPt, I think.

In the thought experiment he proposed, with a large number of distinguishable quantum particles, there is no mixing entropy. Would you agree with that?
 
  • #160
autoUFC said:
In the thought experiment he proposed, with a large number of distinguishable quantum particles, there is no mixing entropy. Would you agree with that?
Please point to a specific reference. Semantics do not provide specificity.
 
  • #161
In classical mechanics particles are not indistinguishable because, even if they look identical, they can be labeled by a set of coordinates in space and time (unique for each particle!). In this way you can distinguish between "identical" particles by following their path instant by instant. So if you embrace a fully classical point of view, particles can always be distinguished just because they occupy a specific position in space and time, and the Gibbs paradox is not solvable (at least in my opinion) because there is no a priori reason to introduce the ##\frac 1 {N!}## factor. It is different, however, if you decide to disregard this difference by saying: well, even if I could distinguish particles by their positions, it should not really change anything because they are identical in all their properties, so why not just ignore this and divide by ##N!##? This is how you resolve the Gibbs paradox from a classical point of view, but to me it is not rigorous. Only QM tells you a priori that particles are indistinguishable. I really don't see a way out: if you somehow manage to introduce that ##\frac 1 {N!}##, you are also implicitly negating the possibility of labeling particles by their space-time coordinates.
 
  • #162
autoUFC said:
The question of classical versus quantum becomes irrelevant considering the setup proposed by HPt, I think.

In the thought experiment he proposed, with a large number of distinguishable quantum particles, there is no mixing entropy. Would you agree with that?
No, why? Whenever I put first distinguishable particles in separated parts of a box and then take (adiabatically) out the dividing walls, I get an irreversible diffusion mixing all the particles in the whole volume, no matter how many sorts of particles I use for the gedanken experiment.
 
  • #163
hutchphd said:
Please point to a specific reference. Semantics do not provide specificity.
HPt said:
Hi, I'm the author of the paper Eur. J. Phys. 35 (2014) 015023 (also freely available at arXiv). As I explicitly demonstrate in my paper, there is a paradox that manifests as a contradiction to the second law of thermodynamics. This contradiction does not only arise for classical distinguishable particles but, as demonstrated in my paper, also within the quantum realm. The reason for this is that quantum particles may also be distinguishable: as an example, think of a gas of buckyball molecules where each buckyball is made up of two distinct carbon isotopes in such a way that no two buckyballs are identical. Therefore, the claim that the Gibbs paradox exposes an inconsistency of classical statistical mechanics, or that the Gibbs paradox is resolved by (the indistinguishability of identical particles in) QM, is false.
 
  • #164
vanhees71 said:
No, why? Whenever I put first distinguishable particles in separated parts of a box and then take (adiabatically) out the dividing walls, I get an irreversible diffusion mixing all the particles in the whole volume, no matter how many sorts of particles I use for the gedanken experiment.

I was under the impression that we had agreed on this point.
Would you point out what in HPt's paper is at fault? Could you tell us what mistake he made that led him to the conclusion (which you disagree with) that there will be no increase in entropy?
 
  • #165
Well, already the statement in the abstract, that the mixing entropy in the Gibbs-paradox setup with non-identical gases is spurious, is wrong. In fact the mixing entropy is real, and you can understand it qualitatively and intuitively for ideal gases from the fact that the initial state (two non-identical gases in a box separated by a wall, one gas in each part) and the final state (wall taken out, the gases completely mixed over the entire volume) are macroscopically distinct from each other, and that the mixing, i.e., the diffusion of the gases toward the new equilibrium state after the wall is taken out, is an irreversible process. You cannot sort the distinct types of gas molecules back into their parts by simply putting in a dividing wall adiabatically; you'd need a "demon", similar to Maxwell's demon, for that.
 
  • #166
Motore said:
Just to be clear: the correct mixing entropy is not only extensive entropy, but you also have to include additive entropy, right? Which means saying mixing entropy is evidence of non-extensivity is correct.

Friends,
I will be away from the thread for a while. Personal things will keep me busy. I apologize if I do not answer immediately any question regarding one of my posts.

That was my first time posting in PF, and it has been interesting. I confess that I was frustrated at times. But now I have to say that I may have been unfair. The discussion has provided lots of food for thought.

At this moment I wish to state a few things that I am still convinced of, even after reading some of you argue to the contrary.

1) There is no paradox in the question of mixing classical particles of the same kind. Swendsen has an argument that is as simple as it gets (see my first post for a specific reference).

The same argument (or something similar) was employed by HPt in his setup of distinguishable quantum particles.

2) Entropy of mixing is no violation of extensivity.

I stand by this statement only in the classical model.

I initially suggested that this idea was confused. However, the argument that you cannot place the particles in a given position after removing the barrier has a lot of power in the quantum setting. Quantum field theory is outside my comfort zone, but I am aware that quantum theory has some strange effects connected to boundary conditions (the Casimir effect comes to mind). BRB
 
  • #168
autoUFC said:
you said that exchange of heat is not a violation of extensivity

No, I said, following Jaynes, that in cases like exchange of heat, where a macroscopic variable is changing, it does not even make sense to talk about extensivity of entropy.
 
  • #169
autoUFC said:
he argues that increase in entropy due to mixing depends on having the capability of segregating the mixed gases.
That gets us to the uncomfortable situation where the entropy of a system depends on our current capabilities.

Yes, exactly. That is Jaynes' point in section 5 of his paper. (He makes similar arguments in his book Probability: The Logic of Science.)

autoUFC said:
entropy in a mixing process will increase as long as there is a simple way to describe the initial state.

Only if you know about the simple way.

For example, consider two experimenters: experimenter A does not know the difference between gas A and gas B, while experimenter B does. Experimenter A will assign zero entropy change to the process of mixing gas A and gas B, because he doesn't know there are two gases present; he thinks he is just mixing two containers of the same gas. (Note that it does not matter whether he considers individual particles of the gas to be distinguishable or not.)

Experimenter B, who knows there are two gases, can use that information to extract useful work from the mixing process. And to experimenter A, it will look like experimenter B can violate the second law: he is extracting useful work from a process that is reversible (no entropy change), without any other external input or output. This is the kind of thing Jaynes is describing in Section 5 of his paper.

Of course you could say that, once we tell experimenter A about the difference between gas A and gas B, he will realize that he was simply using an incomplete model, and that experimenter B's model is the correct one. But scientific models are always tentative; it is always possible that we could make further discoveries in the future that would force us to change our model. For example, experimenter C could come along tomorrow, knowing that there are actually two subtypes of gas A and gas B, and could extract useful work from what looks to experimenter B like an even mixture of the two gases. We can't rule out such a possibility, so we have to treat even our best current knowledge about how to assign entropy to various systems as tentative, a reflection of that best current knowledge, not an objective property of the system independent of our knowledge.

autoUFC said:
Jaynes interpretation is based on the experimenter capacity. The interpretation I am most comfortable is based on the experimenter information.

There's no difference; what Jaynes means by "capacity" is the same thing as what you mean by "information". Jaynes is simply focusing on how the information is used to extract useful work from a process. If the information can't be used to extract useful work, it makes no difference to the entropy. That is the correct way of capturing what you mean by "a simple way to describe the state".
 
  • #170
Then I don't know what you mean by extensivity. Where does Jaynes state this? In his paper "The Gibbs Paradox", which we discussed above, he agrees with what I say, i.e., that there is mixing entropy and extractable work if the two non-identical gases are well separated first and then mixed. He uses pistons for his arguments, but that's equivalent to the argument with the separating wall. Of course, if you cannot distinguish the gases you cannot put them separately into the two parts of the volume, and then you cannot extract the work because there is no entropy difference (p. 8 of the paper). I also fully agree with the information-theoretical meaning of entropy, because there are real-world experiments demonstrating just this theoretical argument for "quantum Maxwell demons", as established in the classic paper by Szilard (and also one by Landauer), but that's another (though related) story.

Let's argue purely within phenomenological thermodynamics. Then for an ideal gas you have
$$\mathrm{d} S=\frac{1}{T} (\mathrm{d} U + p \mathrm{d} V-\mu \mathrm{d} N).$$
So the natural independent variables for the entropy are the extensive variables ##U##, ##V##, and ##N##. Then extensivity means that entropy is a homogeneous function of degree 1, i.e.,
$$S(\alpha U,\alpha V,\alpha N)=\alpha S(U,V,N).$$
For an ideal gas the Sackur-Tetrode formula is extensive in this sense (see the formulae in my posting #122). Note that in this formula the thermal wavelength ##\lambda## is intensive. In the here used independent variables its definition is
$$\lambda=\sqrt{\frac{2 \pi \hbar^2}{m k T}}=\sqrt{\frac{3 \pi \hbar^2 N}{mU}}.$$

Is Jaynes saying that this formula is wrong? If yes, where? That would be confusing, because it's the result of the classical limit of the Bose-Einstein or Fermi-Dirac results for the entropy of ideal gases, which I thought was undisputed in the literature as correct.
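As a side remark, the extensivity claim is easy to check numerically. Below is a minimal Python sketch (my own illustration, not from the thread) in units with ##\hbar = k_{\text{B}} = m = 1##, using the Sackur-Tetrode form ##S/k = N[\ln(V/(N\lambda^3)) + 5/2]## with the thermal wavelength ##\lambda = \sqrt{3\pi N/U}## from above:

```python
import math

# Numeric check that the Sackur-Tetrode entropy is a homogeneous function
# of degree 1 in the extensive variables (U, V, N).
# Units chosen so that hbar = k_B = m = 1.
def sackur_tetrode(U, V, N):
    lam = math.sqrt(3.0 * math.pi * N / U)  # thermal wavelength (intensive: depends only on N/U)
    return N * (math.log(V / (N * lam**3)) + 5.0 / 2.0)  # S in units of k_B

U, V, N = 100.0, 50.0, 10.0
alpha = 3.0
s_scaled = sackur_tetrode(alpha * U, alpha * V, alpha * N)
s_linear = alpha * sackur_tetrode(U, V, N)
print(abs(s_scaled - s_linear) < 1e-9)  # extensivity: S(aU, aV, aN) = a S(U, V, N)
```

The key point the check illustrates is that ##\lambda## and ##V/(N\lambda^3)## are both intensive, so the prefactor ##N## carries the entire degree-1 scaling.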
 
Last edited:
  • #171
vanhees71 said:
No, why? Whenever I put first distinguishable particles in separated parts of a box and then take (adiabatically) out the dividing walls, I get an irreversible diffusion
I believe your misconception is that you think of putting one kind of gas of distinguishable particles in one part and another kind of gas of distinguishable particles in the other part (such that you have knowledge of what particle is in which part). But that's not the setup. Instead think of putting a wall in a box containing pairwise distinguishable particles. Now, if you take out the dividing wall again, I hope you will agree that you don't get an irreversible diffusion.
vanhees71 said:
Well, already the statement in the abstract that mixing entropy for the Gibbs paradox with non-identical gases was false, is wrong.
This statement only echoes how the Gibbs paradox is introduced in most textbooks (e.g. Huang). It doesn't contain any original insight. I invite you to read a little beyond the first sentence of the abstract.
 
  • #172
Of course, if the gases were already mixed before the wall is taken out, then there's no mixing entropy. The mixing entropy is defined as the entropy gain due to the irreversible process of diffusion when the wall is taken out with gas ##A## in one part and gas ##B## in the other. That's clearly a different state than having a mixture of both gases in each part. In that case, of course, taking out the wall doesn't change anything, and no entropy change occurs.

In your example at the beginning of the paper, with the ##\text{C}_{60}## molecules consisting of 30 ##^{12}\text{C}## and 30 ##^{13}\text{C}## isotopes, you would have to put distinguishable mixtures of isomers in one part and the other. Then, taking out the wall, you'd get mixing entropy. If, however, you cannot distinguish the isomers, you have indistinguishable mixtures of all isomers in both parts from the very beginning, and you'd of course not get any increase in entropy when taking out the wall.
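For concreteness, the mixing entropy being discussed can be written down for two ideal gases at equal temperature and pressure: removing the wall gives ##\Delta S/k = N_A \ln\frac{V_A+V_B}{V_A} + N_B \ln\frac{V_A+V_B}{V_B}## when the gases are distinguishable, and zero when each side already held the same mixture. A short Python sketch (my own illustration, not from the thread):

```python
import math

# Mixing entropy for two distinguishable ideal gases at equal T and p:
# gas A (N_A particles in V_A) and gas B (N_B particles in V_B).
# After the wall is removed, each gas expands into the full volume V_A + V_B.
def mixing_entropy(N_A, V_A, N_B, V_B):
    V = V_A + V_B
    return N_A * math.log(V / V_A) + N_B * math.log(V / V_B)  # Delta S in units of k_B

# Equal amounts at equal density: Delta S / k_B = 2 N ln 2.
N = 100
dS = mixing_entropy(N, 1.0, N, 1.0)
print(dS)  # approximately 2 * 100 * ln(2) ≈ 138.63
```

If the two gases turn out to be indistinguishable, the "expansion" of each into the full volume is not a physical change of state, and the correct result is ##\Delta S = 0##, not the formula above; that discontinuous drop is exactly the point under discussion.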
 
  • #173
If I may be so presumptuous as to summarize this (very interesting and enlightening) discussion. The world seems to run on QM. In that framework the term "distinguishable identical particle" is a meaningless phrase. Of course Josiah Gibbs and company could not know that.

I think the rest is semantics and not so interesting.
 
Last edited:
  • Like
Likes vanhees71
  • #174
vanhees71 said:
Then of course taking out the walls doesn't change anything and no entropy change would occur.
Exactly. However, if you calculate the entropy change with statistical mechanics in the conventional way (see most textbooks, such as Huang, or section 2 of my paper) you do get a non-zero entropy change. This is the "false increase in entropy" I'm referring to in the abstract of my paper.
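The "false increase" can be made explicit by contrasting the naive counting (no ##1/N!##) with the corrected one for the thought experiment above: a wall divides a box of a single gas into two equal halves, and is then removed. A minimal Python sketch (my own illustration, in units with ##\hbar = k_{\text{B}} = m = 1##; the thermal wavelength ##\lambda## is held fixed since ##T## is unchanged):

```python
import math

# Entropy per k_B of an ideal gas, with and without the 1/N! correction.
def S_naive(N, V, lam):
    # No 1/N! factor: S/k = N [ln(V/lam^3) + 3/2]  (not extensive)
    return N * (math.log(V / lam**3) + 3.0 / 2.0)

def S_corrected(N, V, lam):
    # With 1/N! via Stirling (ln N! ~ N ln N - N): Sackur-Tetrode form
    return N * (math.log(V / (N * lam**3)) + 5.0 / 2.0)

N, V, lam = 1000.0, 10.0, 0.1

# Remove a wall between two identical halves: (N, V) + (N, V) -> (2N, 2V).
dS_naive = S_naive(2 * N, 2 * V, lam) - 2 * S_naive(N, V, lam)
dS_corrected = S_corrected(2 * N, 2 * V, lam) - 2 * S_corrected(N, V, lam)
print(dS_naive)      # 2 N ln 2 > 0: the "false increase in entropy"
print(dS_corrected)  # 0: no physical change of state, no entropy change
```

The naive count treats the interchange of particles between the two halves as producing new microstates, which is what generates the spurious ##2Nk\ln 2##.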
 
Last edited:
  • Like
Likes vanhees71
  • #175
Ok, then we completely agree. I never thought that this was what is discussed as a "Gibbs paradox".
 
