Gibbs paradox: an urban legend in statistical physics

In summary, the conversation discusses the misconception surrounding the mixing of distinguishable particles and the supposed existence of a paradox in classical statistical physics. The speaker discovered that there is no real paradox, and this is supported by various references and papers. The correction factor of 1/N! in the entropy calculation is not a paradox but a logical necessity: it is imposed by combinatorial logic, not merely to obtain an extensive entropy. This is different from Gibbs' definition of entropy in terms of volume of phase space. Within classical mechanics, there is no logical reason to include this factor, but in quantum theory it is a result of indistinguishability. This correction factor persists in the classical limit and is a natural result of Boltzmann's counting of microstates.
  • #106
autoUFC said:
I believe my argument addresses the case where N changes, as it deals with two systems that exchange particles.

It's not enough for two systems to exchange particles. The number of particles assigned to a system has to change. If you have two halves of a container of gas, each half containing ##N## particles, with no barrier between them, the two systems (two halves of the container) can exchange particles, but ##N## is still not changing; you still have ##N## particles in each half.

For ##N## to change, you would have to have a barrier between the halves and introduce some kind of process, like an osmotic pressure gradient (with the barrier a semi-permeable membrane), that would move particles one way across the barrier but not the other. And then you would have to add a chemical potential term to your equations, as Jaynes describes.
 
  • #107
PeterDonis said:
It's not enough for two systems to exchange particles. The number of particles assigned to a system has to change. If you have two halves of a container of gas, each half containing ##N## particles, with no barrier between them, the two systems (two halves of the container) can exchange particles, but ##N## is still not changing; you still have ##N## particles in each half.

For ##N## to change, you would have to have a barrier between the halves and introduce some kind of process, like an osmotic pressure gradient (with the barrier a semi-permeable membrane), that would move particles one way across the barrier but not the other. And then you would have to add a chemical potential term to your equations, as Jaynes describes.

Why?

In any book on thermodynamics or statistical mechanics one sees several examples of thermodynamic processes in isolated systems that are composed of subsystems. In fact, statements such as the second law of thermodynamics are made with regard to isolated systems.

In any case, how would a semi-permeable membrane or osmotic gradient change the number of particles? In an isolated system N stays constant no matter what, unless one considers something like chemical reactions. It seems to me that these are unnecessary complications, since the question at hand, the inclusion of ##1/N!##, is explained by a simple isolated composite system, as Swendsen and others have demonstrated.

Can you be more clear about what your objection is?
 
  • #108
autoUFC said:
how would a semi-permeable membrane or osmotic gradient change the number of particles?

Because particles would be able to pass through the membrane one way, but not the other, so the number of particles in both halves would change each time a particle passed through the membrane.

Other cases involving variation of ##N## include phase changes (Jaynes mentions Gibbs' analysis of vapor pressures) and chemical reactions (which is where the term "chemical potential" as a name for the coefficient of the ##dN## term in the entropy originally came from).
 
  • #109
autoUFC said:
Can you be more clear about what your objection is?

I'm not sure what "objection" you are talking about. In the quote of mine that you referenced in your post #105, I wasn't even responding to you.

If you intend your argument as a general proof that entropy must be extensive in classical mechanics with distinguishable particles, then your argument must have a flaw somewhere, since entropy is not always extensive in classical mechanics with distinguishable particles (Jaynes in his paper gives examples of cases where it isn't).
 
  • #110
PeterDonis said:
I'm not sure what "objection" you are talking about. In the quote of mine that you referenced in your post #105, I wasn't even responding to you.

If you intend your argument as a general proof that entropy must be extensive in classical mechanics with distinguishable particles, then your argument must have a flaw somewhere, since entropy is not always extensive in classical mechanics with distinguishable particles (Jaynes in his paper gives examples of cases where it isn't).
My argument is that by combinatorial logic the entropy of a system of N permutable elements is ##S=k \ln(\Omega(N)/N!)##. Identical classical particles are an example of permutable elements, as one assumes that swapping two of them counts as a different state.

Extensivity is a consequence of this. The only situation I can see where extensivity would not hold is the case where statistical independence does not hold. To be clear, I would say that two systems are statistically independent if the number of accessible states of one system does not depend on the microstate the other system is in. This is a usual requirement for a consistent extensive entropy. Stars in a galaxy are an example of a system where statistical independence does not hold, because gravitational interactions are long-ranged. Can you say whether the examples Jaynes mentions are of this kind, where there is no independence?

Maybe you are considering that the entropy of mixing of two different kinds of gases would be an example of non-extensivity? That is not the case. Entropy of mixing is not a violation of extensivity.
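As an illustrative sketch (not from the thread), the extensivity claim can be checked numerically with a toy state count ##\Omega(N) = (cV)^N##, where each of the ##N## particles independently occupies one of ##\sim cV## one-particle states; the constant ##c## and the particle numbers are assumptions of this sketch only:

```python
import math

def entropy_over_k(N, V, c):
    """S/k = ln(Omega(N)/N!) for a toy state count Omega(N) = (c*V)**N:
    each of the N permutable particles independently occupies one of
    ~c*V one-particle states (c is an arbitrary constant of this sketch)."""
    return N * math.log(c * V) - math.lgamma(N + 1)  # lgamma(N+1) = ln N!

c, N, V = 1.0e7, 10**6, 1.0
s1 = entropy_over_k(N, V, c)
s2 = entropy_over_k(2 * N, 2 * V, c)  # double the system at fixed density N/V

# With the 1/N! factor, S doubles when N and V double, up to sub-extensive
# O(ln N) corrections from Stirling's formula:
print(s2 / (2 * s1))  # ~1.0
# Without 1/N!, S/k = N*ln(c*V) would gain an extra 2N*ln(2) on doubling,
# so the entropy would not be extensive.
```

The extra ##2N\ln 2## that appears without the ##1/N!## is exactly the spurious "self-mixing" term of the classical counting.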
 
  • #111
autoUFC said:
Entropy of mixing is not a violation of extensivity.

Certainly it is. You have two systems, with ##N_1## and ##N_2## particles and entropies ##S_1## and ##S_2##. You mix the two systems together to form a total system with ##N = N_1 + N_2## particles, but the new system's entropy is not ##S_1 + S_2##; it is ##S = S_1 + S_2 + S_\text{mixing}##.
 
  • #112
PeterDonis said:
Certainly it is. You have two systems, with ##N_1## and ##N_2## particles and entropies ##S_1## and ##S_2##. You mix the two systems together to form a total system with ##N = N_1 + N_2## particles, but the new system's entropy is not ##S_1 + S_2##; it is ##S = S_1 + S_2 + S_\text{mixing}##.
Ok. That does not make entropy non-extensive. Before you mix, the entropy is ##S = S_1 + S_2##. After you mix, ##S = S_1 + S_2 + S_\text{mixing}= S'_1 + S'_2##, where ##S'_1## and ##S'_2## are the entropies of each subsystem after mixing; the total is larger because mixing is an out-of-equilibrium process. However, both before and after mixing, the total entropy is the sum of the entropies of the subsystems: mixing is certainly not the same as non-extensivity.
 
  • #113
Of course, the Gibbs paradox only occurs when the particles are indistinguishable, and there are two separate issues to be solved:

(a) The non-extensivity of entropy when using the classical counting, where all identical particles have to be considered as distinguishable. The classical counting contradicts the phenomenological entropy; Boltzmann and Gibbs solved this problem by introducing the factor ##1/N!##, leading to the Sackur-Tetrode formula in the case of ideal gases, which is valid within the classical realm (i.e., where Bose-Einstein or Fermi-Dirac quantum features are negligible, i.e., the gas is "non-degenerate").

(b) Having accepted this adaptation of the counting of states, borrowing the quantum indistinguishability of identical particles for classical statistics, there is still mixing entropy. It is well justified if you have non-identical ideal gases, first separated in two partial volumes but at the same pressure and temperature (i.e., the particle numbers fulfill ##N_1/N_2=V_1/V_2##): adiabatically take out the dividing wall and let the two gases diffuse into each other until the mixture is in equilibrium at the same temperature and pressure. The entropy increases, and that's the mixing entropy.

The apparent paradox is that the only thing you need to assume is that the gas molecules are not identical, and this can be an apparently small difference (like different isotopes of the same atoms, or even the same atoms in different intrinsic states like ortho- and para-helium). The point, however, is that the particles of the two gases are in some way distinguishable, and then you have to count such that you get the mixing entropy, which is always the same, no matter how small the distinguishing feature of the two sorts of gases might be: ##S_{\text{mix}}=k [(N_1+N_2) \ln(N_1+N_2)-N_1 \ln(N_1)-N_2 \ln (N_2)]=k[N_1 \ln (N/N_1) + N_2 \ln(N/N_2)]>0##. The moment you have identical gases, the mixing entropy must vanish.
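As a quick numerical check (the particle numbers below are illustrative assumptions, not from the thread), the two quoted forms of ##S_{\text{mix}}## agree and the result is strictly positive:

```python
import math

def s_mix_over_k(N1, N2):
    # S_mix/k = (N1+N2)*ln(N1+N2) - N1*ln(N1) - N2*ln(N2)
    N = N1 + N2
    return N * math.log(N) - N1 * math.log(N1) - N2 * math.log(N2)

def s_mix_alt_over_k(N1, N2):
    # equivalent form: N1*ln(N/N1) + N2*ln(N/N2)
    N = N1 + N2
    return N1 * math.log(N / N1) + N2 * math.log(N / N2)

N1, N2 = 3.0e5, 7.0e5  # arbitrary illustrative particle numbers
assert math.isclose(s_mix_over_k(N1, N2), s_mix_alt_over_k(N1, N2))
print(s_mix_over_k(N1, N2) > 0)  # True: mixing entropy is positive
```

Note that the result depends only on ##N_1## and ##N_2##, never on how small the distinguishing feature of the two gases is, which is exactly the point of the apparent paradox.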

Of course, once one has accepted the indistinguishability of identical particles in the counting of microstates, borrowed from quantum theory and applied in otherwise classical statistics, and accepted the information-theoretical meaning of entropy, there's no more Gibbs paradox. If you distribute identical particles over (equilibrium) microstates, it doesn't matter whether you keep the dividing wall or not when counting the microstates given the equilibrium conditions (the same temperature and pressure imply, for ideal gases, simply ##N_1/N_2=V_1/V_2##, and you just throw indistinguishable particles into the entire volume ##V_1+V_2##, whether or not the divider is in place). There's no measurable difference about the gases in the one or the other part of the total volume whether the divider is there or not, and thus there's no increase of entropy when the gases diffuse after adiabatically taking out the wall.

As Pauli rightly says in his book: there's no smooth transition between the case of non-identical and identical gases in the two partial volumes, and thus there's no paradox in having the same finite mixing entropy for non-identical gases vs. zero mixing entropy for identical gases in the setup of the Gibbs paradox.

But that is, of course, also a generic quantum feature, i.e., (a) identical particles are really indistinguishable: in contradistinction to macroscopic classical "particles", quantum particles are really indistinguishable. E.g., each electron has precisely the same intrinsic quantum numbers without the slightest deviation, i.e., the same mass, spin 1/2, electric charge, weak isospin, and baryon number. And (b) there's no way to follow individual identical particles unless they are strictly separated by spatial constraints, and thus the full Hamiltonian of the many-body system commutes with all permutation operators for identical particles. Together with some topological arguments (C. DeWitt-Morette et al.) this implies that for identical quantum particles the many-body states are either totally symmetric (bosons) or totally antisymmetric (fermions). Within local relativistic QFT (particularly from the microcausality condition and the realization of the proper orthochronous Poincare group by local unitary transformations of the corresponding field operators), the relationship between "spin and statistics" also follows, i.e., half-integer-spin particles must necessarily be fermions and integer-spin particles must necessarily be bosons.

M. G. G. Laidlaw and C. M. DeWitt, Feynman Functional Integrals for Systems of Indistinguishable Particles, Phys. Rev. D 3, 1375 (1971), https://link.aps.org/abstract/PRD/v3/i6/p1375

S. Weinberg, The Quantum Theory of Fields, vol. 1, Cambridge University Press (1995).

All these profound findings are not understandable within classical (statistical) physics!
 
  • #114
vanhees71 said:
The point, however, is that the particles of the two gases are in some way distinguishable, and then you have to count such that you get the mixing entropy, which is always the same, no matter how small the distinguishing feature of the two sorts of gases might be: ##S_{\text{mix}}=k [(N_1+N_2) \ln(N_1+N_2)-N_1 \ln(N_1)-N_2 \ln (N_2)]=k[N_1 \ln (N/N_1) + N_2 \ln(N/N_2)]>0##. The moment you have identical gases, the mixing entropy must vanish.

That is not totally precise. You are right that distinguishable particles are a necessary condition for entropy of mixing. However, they are not a sufficient condition.
HPt wrote in his post that

"[There is mixing entropy only] if you know which particle is in which partial volume."

So you have this as the sufficient condition: you need to have some SIMPLE way to know which particle is in which subsystem in the beginning to have entropy of mixing.

For instance, in the case of buckyballs, one may start with the buckyballs of higher molecular mass in one subsystem and those of lower molecular mass in the other. In this case you get an increase in entropy by mixing. If you start already in a scrambled state, and cannot determine which buckyball is in which subsystem, you do not have an increase in entropy due to mixing.
((Not relevant to the point I am trying to convey here, but I should mention that I am only now appreciating HPt's point that buckyballs are small enough to be treated as quantum particles.))

The point that entropy of mixing depends on knowing where each distinguishable element is, is also a feature of my example of the macroscopic balls with numbers written on them. There will be an entropy increase due to mixing if there is a SIMPLE way to know where each ball is; for instance, if the balls with even numbers are in one subsystem and the ones with odd numbers in the other. In this case there is entropy of mixing.

A curious thing is that you may have a complete list of all the particles that determines, for each one, in which subsystem it was in the beginning. You can know precisely the starting point of each particle, but not in a SIMPLE way. Considering that the initial state is an equilibrium state, with the right number of particles in each subsystem, the entropy of the list of initial conditions is the same as the entropy of the system after mixing. In this case there is no entropy of mixing.

I guess that this could be regarded as a paradox. In my understanding this is a good illustration of the connection between entropy in physics and entropy in information theory.
 
  • #115
autoUFC said:
That is not totally precise. You are right that distinguishable particles is a necessary condition for entropy of mixing. However, it is not a sufficient condition.
HPt wrote in his post that

"[There is mixing entropy only] if you know which particle is in which partial volume."
Of course, if the gases are already mixed in both parts of the volume under equilibrium conditions, then nothing changes when taking out the wall. In the Gibbs paradox, one discusses what happens when the non-identical gases are separated in the two parts of the volume and then the dividing wall is adiabatically taken out.
 
  • #116
vanhees71 said:
Of course, if the gases are already mixed in both parts of the volume under equilibrium conditions, then nothing changes when taking out the wall. In the Gibbs paradox, one discusses what happens when the non-identical gases are separated in the two parts of the volume and then the dividing wall is adiabatically taken out.
If the particles are distinguishable, as in the case of the many buckyballs with distinct isotope compositions, the gases are non-identical. But when you remove the partition there is no increase in entropy. Entropy of mixing depends on having a SIMPLE way to determine the initial condition. So, if no such way exists, removing the partition between two systems of a macroscopically large number of distinguishable particles does not increase entropy.
 
  • #117
Exactly what does SIMPLE mean?
How identical do the particles need to be?
Quantum Mechanics gives you an unequivocal answer: same quantum numbers. Otherwise you need to wave your hands, which I guess is why this is #117
 
  • #118
autoUFC said:
before and after mixing the total entropy is the sum of the entropies of each subsystem

No, it isn't after mixing. There is no way to assign ##S_\text{mixing}## to either subsystem individually; you can only assign it to the whole system. So there is no way to express the total entropy after mixing as the sum of subsystem entropies.

This is an example of the point Jaynes makes in his paper, that when you have multiple systems interacting with each other, the only meaningful entropy is the entropy of the whole system that contains all of them.

autoUFC said:
mixing certainly is not the same as non-extensivity

You're contradicting yourself; you just said mixing does make entropy non-extensive, but now you're denying it.

I think you have not fully thought through this subject.
 
  • #121
PeterDonis said:
No, it isn't after mixing. There is no way to assign ##S_\text{mixing}## to either subsystem individually; you can only assign it to the whole system. So there is no way to express the total entropy after mixing as the sum of subsystem entropies.

Suppose you have a collection of a large number of pairwise distinguishable particles; that is, any two particles are distinguishable. These may be quantum particles, as in the buckyballs with different isotope compositions proposed by HPt. One can place these distinguishable particles in a chamber with a barrier that divides the chamber in two. This barrier may be opened and closed adiabatically.

If one waits for the system to reach equilibrium with the barrier open, and then closes the barrier, the system is separated into two subsystems. The sum of the entropies of the subsystems should be the same as before placing the barrier, or there will be a second-law violation.
For those who are confused about this point: that is extensivity.

If you then remove the barrier, there should be no entropy increase. Notice that this is a mixing of distinguishable particles (pairwise distinguishable), yet there is no entropy increase due to mixing.

You may even place the barrier back. Again the entropies of the subsystems would sum to the same value as before placing the barrier, and that would be a "way to express the total entropy after mixing as the sum of subsystem entropies."

You may think of another thought experiment: placing initially the buckyballs with larger molecular mass in one subsystem, and those with lower mass in the other. In this case there will be an increase in entropy due to mixing. To use this setup to provide work, one would need a semi-permeable membrane that could separate buckyballs by molecular mass.

The difference between these two cases is that in the latter you have a simple way to determine in which subsystem each buckyball starts.

One may have a complete list of each different buckyball, indicating where each particle is before mixing. In this case also, mixing should not increase entropy. There are two ways to explain this. One is with the idea of information entropy: the information entropy of the list is as large as the entropy of the system after mixing. The other is to note that a semi-permeable membrane that selects buckyballs based on the list would be a Maxwell demon. So there is no feasible way to produce work from this mixing process.

PeterDonis said:
You're contradicting yourself; you just said mixing does make entropy non-extensive, but now you're denying it.

I think you have not fully thought through this subject.

I think YOU have not fully thought through this subject.
Can you tell us what you believe non-extensivity means? You seem to be confused about this.
 
  • #122
Entropy is extensive (at least in the standard cases where there are no long-ranged interactions between the particles/atoms/molecules).

Let's discuss once more the standard Gibbs paradox setting, now using its resolution via the indistinguishability of identical particles from quantum theory and the full Sackur-Tetrode formula for the entropy.

Case (a): Two non-identical gases

You have a box of total volume ##V## divided by a wall into two partial volumes ##V_1## and ##V_2## (implying ##V=V_1+V_2##). Now you fill part ##V_1## with He gas and ##V_2## with Ar gas. We treat the gases as ideal gases (for simplicity) and assume global thermal equilibrium, which implies that ##p## and ##T## are the same in both parts and thus, due to ##p V=N k T##, you have ##N_1/V_1=N_2/V_2##.

[EDIT: I corrected the formula: the first term in the bracket in the following equation must be ##5/2##, not ##3/2##; I corrected this typo in the subsequent formulae too. The final conclusion is unchanged, because this term cancels in the mixing entropy anyway.]

The entropy for a (monatomic) ideal gas (expressed in terms of ##N##, ##V##, and ##T##) is
$$S=k N \left [\frac{5}{2} + \ln \left (\frac{V}{N \lambda^3} \right) \right],$$
where ##\lambda=\sqrt{\frac{2 \pi \hbar^2}{m k T}}## is the thermal de Broglie wavelength.

Since the entropy is extensive for our case we have
$$S_{\text{before mixing}}=S_1+S_2=k N_1 \left [\frac{5}{2} + \ln \left (\frac{V_1}{N_1 \lambda^3} \right) \right] + k N_2 \left [\frac{5}{2} + \ln \left (\frac{V_2}{N_2 \lambda^3} \right) \right].$$
Now adiabatically take out the dividing wall and wait until the two gases have diffused completely into each other. Then each gas occupies the entire volume, and you get
$$S_{\text{after mixing}} =k N_1 \left [\frac{5}{2} + \ln \left (\frac{V}{N_1 \lambda^3} \right) \right] + k N_2 \left [\frac{5}{2} + \ln \left (\frac{V}{N_2 \lambda^3} \right) \right].$$
and the gain of entropy from this mixing is
$$S_{\text{after mixing}}-S_{\text{before mixing}}=k N_1 \ln \left (\frac{V}{V_1} \right) + k N_2 \ln \left (\frac{V}{V_2} \right)=k N_1 \ln \left (\frac{N}{N_1} \right) + k N_2 \ln \left (\frac{N}{N_2} \right) = k N \ln N -k N_1 \ln N_1-k N_2 \ln N_2>0,$$
as stated repeatedly in this thread.

Case (b): Two identical gases

Taking again into account the indistinguishability of the gas particles of the identical gases, there's no difference in the entropy between the case with or without the wall dividing the volume, because you cannot distinguish which individual atom is in either part of the volume. All you know is that you have identical gases in thermal equilibrium at the same temperature and pressure and thus nothing happens when you adiabatically take out the dividing wall and thus there's no mixing entropy.

So, using the arguments from quantum theory to establish the phase-space measure ##\hbar## per independent pair of configuration and canonically conjugate momentum ##(q,p)##, and the correct counting of microstates of indistinguishable particles when calculating the entropy via the Boltzmann-Planck formula ##S=k \ln \Omega##, leads to an extensive entropy: you have mixing entropy for non-identical gases but no mixing entropy for identical gases filled into the box initially divided by a wall, which is then taken out adiabatically.
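Both cases above can be sketched numerically. The values of ##N_i##, ##V_i##, and ##\lambda## below are illustrative assumptions (chosen with equal densities ##N_1/V_1=N_2/V_2## and, for simplicity, a common ##\lambda##, which cancels from the mixing entropy anyway):

```python
import math

def sackur_tetrode_over_k(N, V, lam):
    """S/k = N*(5/2 + ln(V/(N*lam**3))) for a monatomic ideal gas."""
    return N * (2.5 + math.log(V / (N * lam**3)))

# Illustrative numbers (assumptions, not from the thread):
lam = 1.0e-10                 # common thermal de Broglie wavelength, m
N1, V1 = 1.0e22, 1.0e-3
N2, V2 = 2.0e22, 2.0e-3       # same density as gas 1
N, V = N1 + N2, V1 + V2

before = sackur_tetrode_over_k(N1, V1, lam) + sackur_tetrode_over_k(N2, V2, lam)

# Case (a), non-identical gases: each gas expands into the full volume V.
after_a = sackur_tetrode_over_k(N1, V, lam) + sackur_tetrode_over_k(N2, V, lam)
mixing = after_a - before
assert math.isclose(mixing, N1 * math.log(V / V1) + N2 * math.log(V / V2))
print(mixing > 0)  # True: positive mixing entropy

# Case (b), identical gases: after removing the wall there is one gas of
# N particles in V at the same density, so the entropy is unchanged.
after_b = sackur_tetrode_over_k(N, V, lam)
assert math.isclose(after_b, before)  # no mixing entropy
```

The assertion in case (b) only holds because the densities on both sides are equal; with unequal densities there is a genuine entropy increase even for identical gases, from ordinary expansion rather than mixing.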
 
  • #123
hutchphd said:
Exactly what does SIMPLE mean?

As I explained in my previous post, there are two ways to understand what simple means.
If the information entropy of the rule that segregates the particles is as large as the entropy of the system, that rule is not simple.
If a semi-permeable membrane that selects particles based on a rule would be a Maxwell demon, then that rule is also not simple.

hutchphd said:
How identical do the particles need to be?
Quantum Mechanics gives you an unequivocal answer: same quantum numbers. Otherwise you need to wave your hands, which I guess is why this is #117

You are right. Truly identical particles are a feature of quantum mechanics. However, there are systems of large numbers of distinguishable particles; the classical particle model is an instance of such a system. To answer your question: particles need not be identical or even similar. They need to be distinguishable. This point was stressed by van Lith:

"It is a remarkable fact that, in Van Kampen's construction, it is not the indistinguishability of classical particles that gives rise to the factor 1/N!. Rather, it is the distinguishability of classical particles!"

Janneke van Lith, The Gibbs Paradox: Lessons from Thermodynamics, Entropy 2018.

I would like to point out that if one insists that the ONLY reason to include the ##1/N!## in the entropy of classical gases is to make it extensive, or to make it agree with quantum gases in the limit of low occupation of states, one is propagating a logical error.

I already mentioned the Edwards entropy of grains as an instance where researchers initially did not include the ##1/N!##. They most likely thought they should not, since grains are not identical quantum particles treated as classical. But there is a logical reason to include this term for systems of distinguishable elements, as already explained several times.

The cases proposed by HPt also show the problem with rejecting any explanation for ##1/N!## other than agreement with the predictions for gases of identical quantum particles. In HPt's thought experiment one has a quantum gas of distinguishable particles. If you believe that in this instance ##1/N!## should not be included, you will find violations of the second law.

Because the logical explanation for including this term is not widely known, you see researchers reaching problematic conclusions.
 
  • #124
vanhees71 said:
Entropy is extensive (at least in the standard cases where there are no long-ranged interactions between the particles/atoms/molecules).

Agree with you here.

vanhees71 said:
So using the arguments from quantum theory to establish the phase-space measure ##\hbar## per independent pair of configuration and canonically conjugated momenta ##(q,p)## and the correct counting of microstates of indistinguishable particles when calculating the entropy using the Boltzmann-Planck formula ##S=k \ln \Omega##, leading to an extensive entropy, you have mixing entropy for non-identical but no mixing entropy for identical gases filled into the box initially divided by a wall, which then it taken out adiabatically.

Agree with you here.

Could you address the case proposed by HPt?
He proposed a quantum gas composed of a large number of quantum particles, such that if you take any two of them you see that they are distinguishable. Suppose that this system is in equilibrium when you place a barrier splitting it in two. How would you treat this case? My understanding is that this is not the case of two identical gases, nor the case of two non-identical gases, since in your description the particles of one gas are distinguishable from the particles of the other gas, but identical to the particles within the same gas. In HPt's case the particles are always pairwise distinguishable. What is your understanding of this case? Would you agree that placing this barrier does not result in a reduction of the entropy?
 
  • #125
HPt said:
Hi, I'm the author of the paper Eur. J. Phys. 35 (2014) 015023 (also freely available at arXiv). As I explicitly demonstrate in my paper, there is a paradox that manifests as a contradiction of the second law of thermodynamics. This contradiction does not only arise for classical distinguishable particles but, as demonstrated in my paper, also within the quantum realm. The reason is that quantum particles may also be distinguishable: as an example, think of a gas of buckyball molecules where each buckyball is made up of two distinct carbon isotopes in such a way that no two buckyballs are identical. Therefore, the claim that the Gibbs Paradox exposes an inconsistency of classical statistical mechanics, or that the Gibbs Paradox is resolved by (the indistinguishability of identical particles in) QM, is false.

Am I missing something? In your paper you cleverly devise a way to have a gas of pairwise different quantum particles. That demonstrates that the problem of a gas of distinguishable particles exists also in quantum mechanics. I suppose that's why you say that "the claim that the Gibbs Paradox exposes an inconsistency of classical statistical mechanics or that the Gibbs Paradox is resolved by (the indistinguishability of identical particles in) QM is false."

However, I do not understand why you say that there is a paradox. Wouldn't you say that equations 24 and 31 of your paper show that there is no paradox? As you said in another post, there is no entropy of mixing for the model of a gas of classical identical particles. You also show that there is no entropy of mixing in the gas of pairwise distinguishable quantum particles. What paradox remains?
 
  • #126
autoUFC said:
If one waits for the system to reach equilibrium with the barrier open, and then closes the barrier, the system is separated into two subsystems. The sum of the entropies of the subsystems should be the same as before placing the barrier, or there will be a second-law violation.
For those who are confused about this point: that is extensivity.
For those that are confused about this point, that is extensivity.

The entropy is extensive because, even though the individual particles are distinguishable in principle, you are not distinguishing them when you define the macrostates of the system and the subsystems. So as far as the macrostates are concerned, the particles could just as well be indistinguishable. In effect, you are treating the system as being composed of a large number of particles all of the same type. And yes, for a system of particles all of the same type (or for which you are ignoring any differences between the individual particles), the entropy is extensive. Nobody has disputed this.

autoUFC said:
If you then remove the barrier, there should be no entropy increase. Notice that this is a mixing of distinguishable particles (pairwise distinguishable), yet there is no entropy increase due to mixing.

Only because, as above, you are ignoring the fact that the particles are distinguishable; you are not using any of that information to define the macrostates of the system.

autoUFC said:
Placing initially the buckyballs with larger molecular mass in one subsystem, and those with lower mass in the other. In this case there will be an increase in entropy due to mixing.

Yes, and in this case, the entropy will not be extensive; the mixing entropy is an additional entropy of the system after mixing, over and above the entropies of the two subsystems before mixing. After mixing, it is meaningless even to ask what the entropy of each subsystem is, because the subsystems that existed before mixing no longer exist after mixing, and there is no way to recover them reversibly.

autoUFC said:
I think YOU have not fully thought through this subject.

I think you are simply refusing to read what other people are posting in this thread, and just continuing to repeat the same invalid claims even after others have refuted them. That is why I closed the thread for moderation before. If things keep going the way they are going, I am going to close it again; it is pointless to waste everyone's time going around and around in circles.

autoUFC said:
Can you tell us what you believe non-extensivity means? You seem to be confused about this.

I am not confused at all; you are either confused, or, as above, you are simply refusing to read what others are posting. I already explained, before the thread was closed for moderation, how extensivity of entropy fails when there is entropy of mixing. I just explained it again above. Read it.
 
  • Like
Likes vanhees71 and weirdoguy
  • #127
PeterDonis said:
The entropy is extensive because, even though the individual particles are distinguishable in principle, you are not distinguishing them when you define the macrostates of the system and the subsystems. …

Maybe I misunderstood what you were saying in the other posts. Reading your latest post, I agree with most of it. I agree that when you ignore the information about which particles are on which side you do not have an increase in entropy. However, I would not say that this makes the particles indistinguishable. But certainly, I mentioned that having some simple way to know where each particle is before mixing is a condition for having an entropy increase due to mixing. Also, having a semipermeable membrane that segregates particles is necessary to convert this into work.

However, I still do not understand why an increase in entropy due to mixing would be a violation of extensivity. Let me explain my doubt. Suppose that two subsystems initially at different temperatures are put into contact to exchange heat. Let us say that this happens quasistatically. At any moment you can compute the entropy of each subsystem. When equilibrium is reached you have an increase in entropy. My understanding is that the same happens in the case of two systems with different species of gas, when you put the systems in contact and allow exchange of particles. You may insert a permeable membrane to ensure that the mixing process is quasistatic. You can use the membrane to define the two separated subsystems. Since the process is quasistatic, each subsystem is in equilibrium. You can determine the entropy on each side at all moments. Unlike the exchange of heat, entropy increases on both sides. However, at any moment during the quasistatic process the sum of the entropies of the two subsystems is the entropy of the out-of-equilibrium composite system. When equilibrium is reached the same is true.
 
  • #128
autoUFC said:
I would not say that does make the particles indistinguishable.

I didn't say it made the particles indistinguishable. I said it means ignoring the differences between the particles and therefore treating them as if they were indistinguishable.
 
  • #129
autoUFC said:
I still do not understand why increase entropy due to mixing would be a violation of extensivity.

I have already explained this. Twice.

autoUFC said:
At any moment you can compute the entropy of each subsystem.

That's because the subsystems are only exchanging heat, not particles.

You could treat a case where subsystems exchange particles similarly, by adding a chemical potential term to the entropy (as I mentioned in a previous post), but this only works if the exchange of particles between the subsystems is controlled and each subsystem is separately trackable. See below.

autoUFC said:
My understanding is that the same happens in the case of two systems with different species of gas, when you put the systems in contact and allow exchange of particles.

That depends on how you do the mixing. See below.

autoUFC said:
You may put a permeable membrane to assure that the mixing process is quasistatic.

Doing the mixing this way means you retain the ability to track each subsystem separately, and you control how particles are exchanged between the subsystems. And, as above, to correctly track the entropy of each subsystem, you will need to add a chemical potential term to account for the effects of particles moving from one subsystem to the other.

This is a different case from the case of having two species of gas with a barrier between them, and then just removing the barrier and letting them mix in a single container. That was the case we were discussing in this thread up until now. For that case, as soon as you remove the barrier and allow the two species of gas to mix in a single container, you lose the ability to track the subsystems altogether. You now have just one system, whose entropy is more than the sum of the entropies of the two subsystems before mixing; and it is now meaningless to ask what the entropy of each subsystem is after mixing, because the subsystems no longer exist in any meaningful sense.
 
  • #130
PeterDonis said:
I have already explained this. Twice. … Doing the mixing this way means you retain the ability to track each subsystem separately, and you control how particles are exchanged between the subsystems. …

Actually I am not quite sure I see your point. Pardon me if I respond to the wrong argument; I am not trying to build a strawman.

The two processes are different. Only in the process I described in my last post can one say that there is an equilibrium state in each partition. However, that still does not indicate that an increase in entropy due to mixing is a violation of extensivity. Let me explain.

Going back to the case of systems in thermal contact. When you allow heat transfer, the temperature does not change uniformly across the whole system (except in the quasistatic case). So you cannot define an equilibrium in each subsystem.
You can, however, partition the system into small pieces. If these pieces are small enough, each piece can be (approximately) seen as an equilibrium system at a given temperature. The entropy of the whole out-of-equilibrium system is the sum of the entropies of the pieces.

The same happens for particle exchange.

Since I am still not sure what your point is, I have to respond to your argument in another way. Again, sorry if I misunderstand. There is a more subtle point that you may be trying to convey. You may be saying that in the out-of-equilibrium process where particles mix, one cannot say where each particle is, so there is no meaning in splitting the system into pieces, since you cannot say what is where. In quantum mechanics that is probably the case, but I will come back to that.

In classical mechanics I do not agree. In a Hamiltonian model you can track the position of each particle. If you are dealing with the colloids of Swendsen, the grains of Edwards, or if you are investigating a Hamiltonian system that you solve on your computer, you should be able to tell where each particle is at any moment.
It was mentioned that systems with long-range forces are an example of non-extensivity. However, if anyone could ever devise a useful entropy for such systems as globular clusters or galaxies, I am sure it would account for the fact that the position of each star can be determined at any time. And I am sure it would include the ##1/N!##, as this term is logically needed in the entropy of distinguishable elements.

Regarding the quantum mechanics case: I do not believe that the fact that one cannot precisely determine the positions of the particles should lead to non-extensivity. I am not comfortable talking about quantum field theory, but I believe that if that were the case it would wreak havoc on all of QFT at T>0. Any thoughts on that?
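A quick numerical sketch of the point about the ##1/N!## (illustrative only, using just the configurational part ##\ln(V^N/N!)## of the classical ideal-gas entropy; `config_entropy` is a made-up helper name, and the momentum part scales the same way):

```python
import math

def config_entropy(N, V, include_permutation=True):
    """Configurational entropy, in units of k, of N classical particles
    in volume V: ln(V^N / N!) with the permutation term, ln(V^N) without."""
    S = N * math.log(V)
    if include_permutation:
        S -= math.lgamma(N + 1)  # ln N!
    return S

N, V = 10**5, 1.0

# Doubling the system: with the 1/N! term the defect is only O(ln N),
# so the entropy is extensive to leading order...
with_term = config_entropy(2 * N, 2 * V) - 2 * config_entropy(N, V)

# ...without it, the defect is the spurious 2N ln 2 "self-mixing" term.
without_term = (config_entropy(2 * N, 2 * V, include_permutation=False)
                - 2 * config_entropy(N, V, include_permutation=False))

print(with_term, without_term)
```

The same cancellation is the combinatorial point attributed in this thread to Ehrenfest, van Kampen, and Swendsen.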
 
  • #131
autoUFC said:
in the process I described in my last post one can say that there is an equilibrium state in each partition

If you mean the semi-permeable membrane process, no, you cannot say that, because you have left out a key macroscopic variable, the osmotic pressure. The reason particles move from one side of the membrane to the other is that the osmotic pressures are unequal; and while that is happening, you cannot say the osmotic pressure in either partition is uniform throughout the partition. You can only say osmotic pressure is uniform in each partition when particles have stopped moving from one side to the other and the whole system has come to equilibrium. In other words, this case works the same as the heat transfer case; the only difference is which macroscopic variable is driving the process (temperature or osmotic pressure).

Furthermore, the fact that a macroscopic variable is changing in these processes is the key difference between them and the mixing process that gives rise to the Gibbs paradox. The whole point of the paradox is that no macroscopic variable changes at all during the mixing; there is no macroscopic variable, like temperature or osmotic pressure, which is driving the process by being different from one partition to the other. The only thing that changes when the barrier is removed between the two partitions in the mixing case is that mixing is now allowed--all the particles can range over the entire container, instead of some particles being confined to the left half and some being confined to the right half.

In other words, in the mixing case, you start out with ##N = N_1 + N_2## particles at temperature ##T##, pressure ##P##, etc., etc., and you end up with ##N## particles at the same temperature ##T##, pressure ##P##, etc., etc.--but you do not have the same entropy at the end as you did at the start, because of the mixing entropy. That is what makes the mixing entropy a violation of extensivity of entropy: because nothing else changed except the mixing, yet the entropy changed.

By contrast, in the other cases you describe, something else changed; some macroscopic variable changed in each partition. So you can't even evaluate "extensivity of entropy" at all without first allowing for those changes in macroscopic variables. In other words, those other cases require a different analysis from the mixing case.
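The mixing case can be made concrete with a small sketch (illustrative only; ideal gases in equal initial volumes at the same ##T## and ##P##, and `mixing_entropy` is a hypothetical helper): each species doubles its accessible volume while every other macroscopic variable stays fixed, so ##\Delta S = (N_1+N_2)k\ln 2##.

```python
import math

k = 1.380649e-23  # Boltzmann constant, J/K

def mixing_entropy(N_A, N_B):
    """Entropy increase (J/K) when N_A particles of gas A and N_B of gas B,
    initially in separate equal volumes at the same T and P, mix after the
    barrier is removed: each species expands into twice its volume."""
    return k * (N_A + N_B) * math.log(2)

N = 6.022e23  # one mole of A and one mole of B
dS = mixing_entropy(N, N)
print(dS / (2 * N * k))  # entropy gained per particle, in units of k: ln 2
```

If the two halves hold the same gas instead, nothing macroscopic changes on removing the barrier, and with the ##1/N!## counting the computed ##\Delta S## is zero.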

autoUFC said:
You may be saying that in the out of equilibrium process where particles mix, one can not say where each particle is, therefore there is no meaning in spliting the system in pieces, since you can not say what is where.

This is one of those cases where you really, really need to be precise in your language. "One can not say where each particle is" is very vague.

The correct statement is the one I made above about the mixing case: before the barrier is removed, you have a set of ##N_1## particles confined to one half of the container, and a set of ##N_2## particles confined to the other half of the container. After the barrier is removed, you have a set of ##N## particles confined to the entire container. Whether or not you could, in principle, track the locations of each individual particle is irrelevant; the fact is that you aren't, you are defining your macrostates by the region of space that each particular set of particles is confined to. If you were tracking the locations of each individual particle, you would have a much larger and more complicated set of macrostates and you would be doing a very different analysis.

The same care needs to be taken when talking about whether or not the particles are "distinguishable". That word, by itself, is vague. In the case where there is entropy of mixing, the relevant criterion is not whether we can, in principle, distinguish each individual particle from every other; the relevant criterion is what attributes of the particles we are distinguishing. In the simple scenario usually used to discuss the Gibbs paradox, it is assumed that we have two "types" of particles, which really means that we have just two possible values of some parameter that we are using to identify the particles. For example, if we are mixing two different gases, A and B, the parameter is "what kind of gas is it", and the two possible values are A and B. Whether we are using classical underlying physics, in which each individual particle of gas A could in principle be distinguished from every other particle of gas A, or whether we are using quantum underlying physics, where (if we leave out internal excited states) every particle of gas A is indistinguishable from every other, is irrelevant to how we are identifying the macroscopic states we are keeping track of. The only macroscopic states we are keeping track of are "particles of gas A" and "particles of gas B".

So the reason there is mixing entropy in this case is simply that we start with the macroscopic state "particles of gas A confined to one side of the container, and particles of gas B confined to the other side of the container", and we end up with the macroscopic state "particles of gas A and gas B confined to the entire container". And it is not possible to reversibly take the latter macroscopic state back to the former macroscopic state, because that would require expending energy (and entropy) to pick out the particles of gas A and confine them to one side of the container, and pick out particles of gas B and confine them to the other side of the container. Whereas, if all the particles are of gas A to start with, the two macroscopic states can be reversibly converted between each other just by removing or replacing the barrier.

All of this is clearly explained in the Jaynes paper.

autoUFC said:
It was mentioned that systems with long range forces are an exemple of non-extensivity.

Jaynes mentions others: vapor pressures, for example.

Jaynes doesn't just cherry pick specific examples, however. He gives a general discussion of when we should expect entropy to be extensive and when we should expect it not to be. I suggest re-reading his paper with that in mind.
 
  • #132
PeterDonis said:
If you mean the semi-permeable membrane process, no, you cannot say that, because you have left out a key macroscopic variable, the osmotic pressure. …

In the cases where each partition has the same kind of gas, there is no macroscopic variable being changed. As we already agree, in this case there is no increase in entropy due to mixing. In the cases where there is an increase in entropy due to mixing, you DO have macroscopic variables changing: the partial pressure of each kind of gas.

So we have two possibilities.
1) You start already in equilibrium when you remove the partition. Then there is no entropy increase. You may put the partition back and remain in equilibrium. At every moment you have extensivity.

2) You start out of equilibrium when you remove the partition. Then you have the partial pressures of each kind of gas to define the macroscopic state of each partition. Again, you have extensivity.

You said that Jaynes writes that entropy of mixing and vapor pressure are instances of non-extensivity. Can you tell us what in Jaynes' text gave you this impression?
 
  • #133
Of course, if you have a gas consisting of two kinds of non-identical particles in a large volume you don't separate them by simply putting in a divider adiabatically. You'd just have two parts of the mixed gas in thermal equilibrium and no entropy change in putting in the divider (that's a tautology, because adiabatic means it's without change of entropy). To separate the non-identical particles in two distinct parts of the volume you need to do work and the entropy of the gas sorted into the two parts is lowered. Of course, you need to increase the entropy elsewhere, and the answer to that puzzle is due to Landauer and Szilard using the information-theoretical approach to entropy in connection to the famous "Maxwell demon". Here you have a demon sorting non-identical particles, while in the original Maxwell-demon setup the demon sorts particles by energy. The principle is of course the same.

That this information-theoretical approach is the correct one has recently been shown empirically, also in the context of quantum statistics: a quantum Maxwell demon worked precisely as predicted by the information-theoretical approach.
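The Landauer side of this argument can be sketched quantitatively (a rough sketch, assuming the demon must erase one bit, "A or B?", per particle; `landauer_work` is an illustrative name):

```python
import math

k = 1.380649e-23  # Boltzmann constant, J/K

def landauer_work(N, T):
    """Minimum work (J) to re-sort an equal A/B mixture of N particles
    back into separate halves at temperature T: erasing one bit per
    particle costs at least k*T*ln(2), exactly balancing T times the
    mixing entropy N*k*ln(2)."""
    return N * k * T * math.log(2)

# Re-sorting a mole of mixed gas at room temperature:
print(landauer_work(6.022e23, 300.0))  # ≈ 1.7e3 J
```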
 
  • #134
vanhees71 said:
Of course, if you have a gas consisting of two kinds of non-identical particles in a large volume you don't separate them by simply putting in a divider adiabatically. …

Very interesting insight. I suppose you agree that an increase in entropy due to mixing is not evidence of non-extensivity. Do you agree with that?
 
  • #135
Yes, as explicitly shown in my previous posting, you get the correct mixing entropy with the correct additive and extensive entropy.
 
  • #136
Just to be clear: to get the correct mixing entropy you need not only an extensive entropy but also an additive one, right? Which would mean that saying mixing entropy is evidence of non-extensivity is correct.
 
  • #137
vanhees71 said:
Yes, as explicitly shown in my previous posting, you get the correct mixing entropy with the correct additive and extensive

So we are in agreement regarding this. However, we have yet to agree on the origin of the ##1/N!##.

vanhees71 said:
The correct answer is you have to take out the ##N!##. This nobody doubts. We have only discussed whether this is justified within a strictly classical model (my opinion is, no) or whether you need to argue either with the extensivity condition for entropy (which was how Boltzmann, Gibbs et al argued at the time), i.e., by a phenomenological argument (i.e., you introduce another phenomenological principle to classical statistical physics), or with quantum mechanics and the indistinguishability of identical particles, which is part of the foundations of quantum theory.
vanhees71 said:
It's only a claim that this is an urban myth, but it's not convincing to just claim this. I think you need QT to argue for this factor ##1/N!##.

Also there should be mixing entropy when the constituents of the systems in the two compartments of the Gibbs-paradox setup are distinguishable (and be it only by a seemingly "small" difference, e.g., if you have the same gas but with atoms of different isotopes). I don't know, whether there are experiments which measure such mixing entropies though. In any case I'd be interested in corresponding papers.

I've no clue what the Poisson distribution has to do with all this.

How would you square these statements with the inclusion of the ##1/N!## in systems with a large number of pairwise distinguishable elements? These elements could be colloidal particles, as in milk. They could be grains. They could be stars. You could solve a Hamiltonian model with many particles. In all these instances you have to include the ##1/N!## to obtain a consistent definition of entropy.
(With the caveat that for systems of gravitating stars there is no formally defined entropy, due to the intrinsic non-extensivity, as you correctly pointed out.)

I suppose you agree that the symmetries of quantum mechanics are not present in these systems, and cannot be used to justify the inclusion of the ##1/N!## term.

You also write that, absent the explanation with identical quantum particles, "you need to argue either with the extensivity condition for entropy (which was how Boltzmann, Gibbs et al argued at the time), i.e., by a phenomenological argument (...)"

My understanding is that statistical mechanics is a mathematical method. With this method you derive, from the microscopic model, all the phenomenological predictions of thermodynamics. Therefore, there should be a logical reason for the inclusion of the ##1/N!## in these systems. In fact, there is. The logical reason to include this term was presented by Ehrenfest in 1920, by van Kampen in 1984, and by Swendsen in 2009, among others.

Regarding the question of the Poisson distribution: you can derive the Poisson distribution mathematically from entropic principles. Of course, to do that you need to follow proper combinatorial logic and include the permutation term argued for by the authors I mention in the previous paragraph. This is a purely mathematical problem. One cannot say that this term is included to agree with QM or to make entropy extensive. If you are still curious you may check

https://math.stackexchange.com/questions/2241655/maximum-entropy-principle-for-poisson-distribution
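A numerical illustration of that derivation (a sketch, not a proof): the Poisson distribution maximizes the functional ##S=-\sum_n p_n \ln(p_n\, n!)##, i.e. the entropy with the permutation term included, among distributions of fixed mean. Comparing it against another distribution with the same mean:

```python
import math

def poisson(lam, nmax):
    """Truncated Poisson probabilities p_n = e^(-lam) lam^n / n!."""
    return [math.exp(-lam) * lam**n / math.factorial(n) for n in range(nmax + 1)]

def entropy_with_permutation(p):
    """S = -sum_n p_n ln(p_n * n!): the entropy functional that
    includes the 1/n! counting for distinguishable particles."""
    return -sum(pn * (math.log(pn) + math.lgamma(n + 1))
                for n, pn in enumerate(p) if pn > 0)

lam, nmax = 4.0, 60
p_poisson = poisson(lam, nmax)

# Competitor with the same mean (0.5*3 + 0.5*5 = 4): an equal mixture
# of Poisson(3) and Poisson(5). If Poisson(4) maximizes S, the mixture
# must score strictly lower.
p_mix = [0.5 * a + 0.5 * b for a, b in zip(poisson(3.0, nmax), poisson(5.0, nmax))]

print(entropy_with_permutation(p_poisson) > entropy_with_permutation(p_mix))  # True
```

For a Poisson distribution this functional evaluates to ##\lambda - \lambda\ln\lambda##, with no appeal to quantum mechanics or to extensivity anywhere in the calculation.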
 
  • Like
Likes dextercioby
  • #138
PeterDonis said:
I didn't say it made the particles indistinguishable. I said it means ignoring the differences between the particles and therefore treating them as if they were indistinguishable.
If you check the approach of Ehrenfest in 1920, van Kampen in 1984, or Swendsen in 2009, among others, you will see that they do not treat distinguishable particles as if they were indistinguishable. They include the permutation term that leads to no entropy of mixing for pairwise distinguishable elements precisely to account for the distinguishability. Including this term, they obtain that there is no increase in entropy, and no paradox.
 
  • #139
As I said several times the inclusion of the correct factor ##1/N!## is due to quantum-mechanical indistinguishability of identical constituents (atoms/molecules, etc). If you have a system consisting of multiple distinguishable constituents you have to count accordingly, as shown in the example with the separated volume in the Gibbs-paradox setup.
 
  • #140
Motore said:
Just to be clear: the correct mixing entropy is not only extensive entropy but you also have to include additive entropy right? Which means saying mixing entropy is evidence of non-extensivitiy, is correct.
I don't know what you mean here. The Sackur-Tetrode entropy for an ideal gas is extensive and additive, and using it leads to the correct mixing entropy. So how is this evidence for "non-extensivity"?
 
  • Like
Likes Lord Jestocost
