Meaning of equiprobability principle in statistical mechanics

In summary, the conversation discusses the concept of equilibrium ensembles in statistical mechanics and the confusion surrounding their name. The canonical ensemble assigns a small but nonzero probability, even at equilibrium, to all the molecules of an ideal gas gathering in one corner of the box. The ergodic theorem is mentioned as a possible explanation for why statistical mechanics works, but some deem it too weak. The conversation also mentions dissenters from the widely accepted theory and their proposed solutions. The definition of equilibrium is discussed, and the point is made that the "equilibrium distributions" of statistical mechanics describe more than just equilibrium states.
  • #1
Zacku
Hi,

This question is probably a dumb one, but I admit that I am quite perturbed by this issue.
Indeed, I don't understand why ensembles like the microcanonical one or
the canonical one are called "equilibrium ensembles".
I do agree that they correspond to stationary probability measures, but why they should refer only to
equilibrium is a mystery to me.

The reason for my misunderstanding is that microstates corresponding to out-of-equilibrium macrostates are allowed in these ensembles. For example, microstates with non-uniform density are allowed in the microcanonical ensemble (for ideal gases, say), where all microstates on the hypersurface of constant energy (within a given uncertainty) are allowed. Yet I thought that, at equilibrium, the density of a homogeneous fluid was uniform...

I am even more dubious when, searching for answers about the foundations of statistical mechanics, I read papers and courses where Boltzmann's H-theorem, or more generally the second law, is (it seems) correctly understood by assuming that all microstates on the constant-energy surface are "visited" with the same probability (the microcanonical ensemble appears again, in a more general context this time), and that the equilibrium macrostate corresponds to an overwhelming number of microstates compared to non-equilibrium ones.

Thinking a lot about it leads me to the conclusion, perhaps the wrong one, that the canonical ensemble distributions refer to systems observed over an infinite time, which leads to the equiprobability "principle" (somewhat explained by the historical idea of ergodicity) and to the time independence of the distributions, but not especially to the equilibrium (macro)states of these systems.

What is your opinion about that? Thank you for any comments that could help me!
 
  • #2
The canonical ensemble asserts that an ideal gas, even at equilibrium, has a minute probability that all the molecules will gather in one corner of the box. This probability is so low that we have never seen it happen.
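That probability is easy to estimate. A minimal sketch, assuming the ideal-gas picture of independent, uniformly distributed molecule positions (the function name and parameters are illustrative):

```python
import math

def prob_all_in_fraction(n_molecules: int, volume_fraction: float) -> float:
    """Probability that every molecule of an ideal gas is found, at one
    instant, inside a sub-volume that is `volume_fraction` of the box
    (positions independent and uniform -- the ideal-gas assumption)."""
    return volume_fraction ** n_molecules

# Even for a tiny system the probability is already negligible:
print(prob_all_in_fraction(100, 0.5))   # (1/2)**100 ~ 7.9e-31
# log10 of the probability for a macroscopic sample (N ~ 6e23):
print(6e23 * math.log10(0.5))           # ~ -1.8e23
```

So "we have never seen it" is an understatement by a factor of roughly 10^(10^23).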

Most books say the ergodic theorem is useless in practice (I haven't checked this myself), because it requires longer than the lifetime of the universe for a system to attain equilibrium.

The most widely accepted explanation of why statistical mechanics "works", even though all fundamental physical laws are time-reversible (there's some funny stuff in a small part of particle physics, but we don't have to worry about that), is that it has to do with the large number of particles.
Mehran Kardar discusses this in Lecture 9 at:
http://ocw.mit.edu/OcwWeb/Physics/8-333Fall-2005/LectureNotes/index.htm
Also try:
http://arxiv.org/abs/math-ph/0010018

There have been prominent dissenters to the above view, like Prigogine who wanted to add a term to Schroedinger's equation.
 
  • #3
atyy said:
The canonical ensemble asserts that the ideal gas, even at equilibrium, has a minute probability that all the molecules will gather in one corner of the box. This probability is very low, and so we have never seen it.
Precisely, and that's the reason why I don't understand the well-known name "equilibrium ensemble" (it seems that we don't have the same definition of equilibrium, actually, and I would like to know your definition if you don't mind).

Most books say the ergodic theorem is useless in practice (I haven't checked this myself), because it requires longer than the lifetime of the universe for a system to attain equilibrium.
That is not the principal reason, according to what I have read. Actually, the KAM theorem and the existence of chaotic dynamical systems for which the ergodic theorem does not hold seem to be the principal ones.

This is the most widely accepted theory as to why statistical mechanics "works"
I do agree with that, but as I said, I don't understand the qualifier "equilibrium distribution", because it seems that the only reason it works is the overwhelming number of microstates corresponding to the equilibrium macrostate.

Mehran Kardar's discusses this in Lecture 9 at:
http://ocw.mit.edu/OcwWeb/Physics/8-333Fall-2005/LectureNotes/index.htm
Also try:
http://arxiv.org/abs/math-ph/0010018

There have been prominent dissenters to the above view, like Prigogine who wanted to add a term to Schroedinger's equation.
I will try to look at these refs, thanks.
 
  • #4
Zacku said:
Precisely, and that's the reason why I don't understand the well-known name "equilibrium ensemble" (it seems that we don't have the same definition of equilibrium, actually, and I would like to know your definition if you don't mind).

The time after which the macroscopic variables, such as pressure and temperature, no longer change.

Zacku said:
That is not the principal reason, according to what I have read. Actually, the KAM theorem and the existence of chaotic dynamical systems for which the ergodic theorem does not hold seem to be the principal ones.

Yes, the ergodic theorem holds for most systems we study. I just meant that it is too weak to explain why the systems we study come to equilibrium much more rapidly than the lower limit calculated from the ergodic theorem.
 
  • #5
atyy said:
The time after which the macroscopic variables, such as pressure and temperature, no longer change.

Yes, but to me this definition is a little "wrong". Just consider the answer to Zermelo's objection to Boltzmann's H-theorem, which can be phrased as: "yes, the system will eventually enter a microstate corresponding to a macrostate out of equilibrium, but you will have to wait a little longer than the age of the universe to see it, dude!".
OK, but that's not an answer for me! That sentence explains why it works, i.e. why we can describe equilibrium systems with statistical distributions containing microstates that correspond to non-equilibrium states, and that's all (which is quite good, after all). But what I want to underline is that these "equilibrium distributions" are a little more than just equilibrium distributions. Do you see my point?
 
  • #6
Zacku said:
I do agree with that, but as I said, I don't understand the qualifier "equilibrium distribution", because it seems that the only reason it works is the overwhelming number of microstates corresponding to the equilibrium macrostate.

The canonical ensemble is given the qualifier "equilibrium" because it is only able to predict the results of classical equilibrium thermodynamics. Yes, statistical mechanics is only an effective theory; the microcanonical and canonical ensembles are conceptually completely different ensembles, yet they are close enough for most practical purposes.
 
  • #7
Zacku said:
Yes, but to me this definition is a little "wrong". Just consider the answer to Zermelo's objection to Boltzmann's H-theorem, which can be phrased as: "yes, the system will eventually enter a microstate corresponding to a macrostate out of equilibrium, but you will have to wait a little longer than the age of the universe to see it, dude!".
OK, but that's not an answer for me! That sentence explains why it works, i.e. why we can describe equilibrium systems with statistical distributions containing microstates that correspond to non-equilibrium states, and that's all (which is quite good, after all). But what I want to underline is that these "equilibrium distributions" are a little more than just equilibrium distributions. Do you see my point?

The definition I gave of equilibrium is just an experimental one. Of course, it may be that experimentally we never find systems exactly in equilibrium, in which case the theory cannot be applied.

Anyway, I agree with your point about the age of the universe; it's exactly why I said the ergodic theorem is useless in practice. All of statistical mechanics is based on a prayer (which appears to be answered affirmatively very often).

Edit: whoops, no I don't see your point. If an equilibrium system takes longer than the age of the universe to become non-equilibrium, then it's as good as being at equilibrium for us.
 
  • #8
There's an interesting comment in Kardar's first lecture:

"A system under study is said to be in equilibrium when its properties do not change appreciably with time over the intervals of interest (observation times). The dependence on the observation time makes the concept of equilibrium subjective. For example, window glass is in equilibrium as a solid over many decades, but flows like a fluid over time scales of millennia. At the other extreme, it is perfectly legitimate to consider the equilibrium between matter and radiation in the early universe during the first minutes of the big bang."

Is it subjectivity of equilibrium that is worrying you?
 
  • #9
atyy said:
Edit: whoops, no I don't see your point.
The fact is that you can make predictions about the probability of occurrence of a non-equilibrium state using a so-called "equilibrium distribution" (indeed, Boltzmann always uses the microcanonical ensemble, or the equiprobability principle for all microstates, at least implicitly, in his arguments against Loschmidt and Zermelo, for example).
That single fact seems very odd to me, and that's all.

If an equilibrium system takes longer than the age of the universe to become non-equilibrium, then it's as good as being at equilibrium for us.
You can say that if the system contains on the order of Avogadro's number of particles, but what if you are trying to understand the foundations of statistical mechanics for arbitrary numbers of particles?

What I actually want to discuss is that the fact that "all that stuff about ensembles works", and the fact that they can be called "equilibrium ensembles" (two different things for me), looks like a big coincidence to me, hidden behind the law of large numbers.
 
  • #10
atyy said:
There's an interesting comment in Kardar's first lecture:

"A system under study is said to be in equilibrium when its properties do not change appreciably with time over the intervals of interest (observation times). The dependence on the observation time makes the concept of equilibrium subjective. For example, window glass is in equilibrium as a solid over many decades, but flows like a fluid over time scales of millennia. At the other extreme, it is perfectly legitimate to consider the equilibrium between matter and radiation in the early universe during the first minutes of the big bang."

Is it subjectivity of equilibrium that is worrying you?

That's part of it, yes. I don't know whether that is what he means, or whether the window-glass example is inappropriate in this case. Indeed, it is well known from the statistical physics of disordered systems that glass states are not "real" equilibrium states but metastable ones. That could be interesting, though, because it might push physicists to find a more "rigorous" (less subjective) definition of equilibrium that would command a consensus.
 
  • #11
...microstates corresponding to states of non equilibrium

Thermal equilibrium doesn't exist at the level of microstates. You can say that some particular member of the microcanonical ensemble of a gas consisting of N molecules has all its molecules in a small volume of the total available volume. But then you can write down many observables that only have a particular outcome for a small fraction of all the members of the ensemble.

You can't strike them all out, because for any given member of the ensemble there is an observable that singles it out as special: the observable that measures if a state is the same as the given member.

What you can say is that the ensemble contains a subset of states that is identical to an equilibrium situation of lower entropy. E.g., if the gas was first confined to a smaller volume and then underwent free expansion into a larger volume, then the original set of microstates is contained in the final set of microstates. This fact, combined with the equal a priori probabilities, forms the argument for why entropy always increases. So there is nothing strange about that.
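The free-expansion example can be made quantitative. A minimal sketch, assuming a classical ideal gas (only the volume term of the entropy matters here, since the temperature is unchanged in free expansion); `n`, `v0` and `v` are illustrative parameter names:

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K

def free_expansion_entropy_increase(n: int, v0: float, v: float) -> float:
    """Entropy increase Delta S = N k_B ln(V/V0) when an ideal gas of N
    molecules expands freely from volume V0 into volume V."""
    return n * k_B * math.log(v / v0)

def fraction_of_states_still_confined(n: int, v0: float, v: float) -> float:
    """The fraction of the final-volume microstates that 'look like' the
    initial confinement is (V0/V)**N -- the same vanishing number that
    governs spontaneous re-confinement."""
    return (v0 / v) ** n

print(free_expansion_entropy_increase(6 * 10**23, 1.0, 2.0))  # ~ 5.7 J/K
print(fraction_of_states_still_confined(100, 1.0, 2.0))       # ~ 7.9e-31
```

The two functions are two faces of the same count: the entropy increase is k_B times the log of the factor by which the accessible microstates multiplied.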


If you keep track of the microstates, then you have precise information about the gas anyway, and the entropy is zero. It always remains zero. E.g., if you let the volume increase, the state it evolves into is simply related to the original state by a unitary transformation, so it ends up in some unique state. If you don't know which state the gas was in, then the number of states in which it can end up is equal to the original number of states (because they are related to each other by a unitary transformation).

But for all practical purposes we can replace this smaller set of states by the set of all states that can exist in the larger volume, despite the fact that most of these states could not have evolved out of the smaller volume. What matters for statistical mechanics is that we can't see the difference between the real probability distribution and the equilibrium probability distribution on the macro scale.

The entropy has then increased because we cannot tell the microstates it can be in apart from the microstates it can't be in. So we may just as well consider all the microstates equally likely. Or you can say that the information about which microstates it can be in has been scrambled by the (intractable) time evolution.

This means that there are correlations in the state of the gas that betray that it evolved from a low-entropy state, but these are assumed to be invisible at the macro level. If we could reverse the direction of time, these correlations would become relevant and the gas would evolve back to the low-entropy initial conditions. So the H-theorem would fail, because its assumption that the state is randomly chosen is not true. It is never true, of course, but in this case the correlations are relevant.
 
  • #12
Count Iblis said:
Thermal equilibrium doesn't exist at the level of microstates.

That's not what I'm saying, at least I think.
I know that equilibrium is about macrovariables. I don't see the problem if I am trying to build equivalence classes of microstates corresponding to given values of the macrovariables.

You can say that some particular member of the microcanonical ensemble of a gas consisting of N molecules has all its molecules in a small volume of the total available volume. But then you can write down many observables that only have a particular outcome for a small fraction of all the members of the ensemble. You can't strike them all out, because for any given member of the ensemble there is an observable that singles it out as special: the observable that measures if a state is the same as the given member.

I'm not sure I agree. In an ideal gas, for instance, you must have uniform temperature, pressure and density at thermodynamic equilibrium. While the density is easy to compute everywhere given a partition of phase space, temperature and pressure seem more complicated to evaluate from a single microstate. This often leads to the conclusion that temperature and pressure have only an ensemble-average meaning, so that a microstate cannot be associated with particular configurations of the temperature and pressure fields. In my opinion that is actually not correct. Indeed, following Boltzmann's idea, one can partition the one-particle phase space into cells and, by counting the fraction of particles in each cell, evaluate in principle the one-particle distribution function f(x,p) for a single given microstate. The validity of this procedure is ensured by the large number of particles in the system; otherwise it does not hold, but that is not a problem, since temperature and pressure then have no meaning either.
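This single-microstate coarse-graining can be illustrated numerically. A minimal sketch, in made-up units with k_B = m = 1: draw one microstate of n velocity components at a known temperature, then recover "the temperature of that one microstate" from the empirical mean of v^2 via equipartition; the estimate is sharp only when n is large, just as the post says:

```python
import math
import random

def sample_microstate(n: int, temperature: float):
    """One microstate: n velocity components drawn from the 1-D
    Maxwell-Boltzmann distribution, sigma^2 = k_B T / m (k_B = m = 1)."""
    sigma = math.sqrt(temperature)
    return [random.gauss(0.0, sigma) for _ in range(n)]

def temperature_estimate(vs):
    """Coarse-grained temperature of a single microstate, via equipartition:
    (1/2) m <v^2> = (1/2) k_B T, with <v^2> replaced by the empirical mean."""
    return sum(v * v for v in vs) / len(vs)

random.seed(0)
for n in (100, 10_000, 1_000_000):
    print(n, temperature_estimate(sample_microstate(n, temperature=2.0)))
# the estimate converges to 2.0 as n grows (law of large numbers)
```

The relative fluctuation of the estimate scales like 1/sqrt(n), which is exactly why the procedure is meaningful for N ~ 10^23 and meaningless for a handful of particles.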

What you can say is that the members of the ensemble contains a subset of states that is identical to an equilibrium situation of lower entropy. E.g., if the gas was first confined to a smaller volume and then underwent free expansion into a larger volume, then the original set of microstates is contained in the final set of microstates. But this fact combined with the equal a priori probabilities forms the argument why entropy always increases. So, there is nothing strange about that.

I see your point, but I don't understand why it is obvious that the equilibrium distribution of an ideal gas in a volume V > V0 must contain the microstates of an ideal gas at equilibrium in V0. You see, my point consists in asking "why would you use the equal a priori probabilities principle, and what does it mean?". I argue that the equal a priori probabilities principle is sufficient to describe a system at equilibrium (thanks to the law of large numbers) but not necessary. I seem to be the only one to think that way (as far as I know), and I don't understand why.

If you keep track of the microstates, then you have precise information about the gas anyway and the entropy is zero.
Thanks for the reminder, but I don't think that's what I am doing.

But for all practical purposes we can replace this smaller set of states by the set of all states that can exist in the larger volume, despite the fact that most of these states could not have evolved out of the smaller volume. What matters for statistical mechanics is that we can't see the difference between the real probability distribution and the equilibrium probability distribution on the macro scale.

This is probably the reason why everybody seems not to understand what my problem is... I know quite well how to do statistical mechanics in practical cases, but my question arises when I wonder why it works. After reading a lot of papers and books on the subject, I could not properly understand the transition between an invariant probability measure and an equilibrium probability measure (the transition lies, as you both said, in not distinguishing, for practical purposes, between equilibrium states and states one would have to wait an infinite time to observe).
To be more precise, it seems to me that the canonical distributions we use to describe equilibrium contain almost the same information as the "real" equilibrium distributions (which remain unknown in practice), but also somewhat less information than those "real" equilibrium distributions.

My only "problem" is actually that I am not able to express my way of thinking clearly, and that apparently I am the only one on Earth "perturbed" by this subject.
 
  • #13
Zacku said:
My only "problem" is actually that I am not able to express my way of thinking clearly, and that apparently I am the only one on Earth "perturbed" by this subject.

I'm sure you're not. After explaining the ergodic theorem, my lecturer gave a smirk and said something like "and beyond that, all statistical mechanics is based on a prayer". (N. Berker; I'm not sure I'm quoting him correctly, so if that doesn't sound right, it's my error :smile:)
 
  • #14
I've followed the thread with great interest, although I hadn't posted. I find statistical mechanics a subject that physicists use without caring too much about its foundations; for example, in this thread only one person is replying to the author, something that never happens in topics about the principles of quantum mechanics. So I understand why Zacku feels "alone", but his question is perfectly logical. Rather than insert myself in the dialogue with atyy, I prefer to answer the title question, "Meaning of equiprobability principle in statistical mechanics".

In the beginning, SM was born to "reduce" thermodynamics to mechanics. The whole point is that a dynamical system, under some statistical hypotheses, exhibits behavior characteristic of thermodynamic systems, such as relaxing to equilibrium. So the first way to justify the use of the ensembles is to (try to) derive those hypotheses directly from the dynamics; this path leads to the ergodic problem, and there are no big physical results. Another way a physicist can imagine is simply to promote the use of ensembles to a physical principle. Even if that is a (valid) route to the foundations of thermodynamics, it loses meaning when SM is applied to any other dynamical system, and there are important predictions of SM that we cannot neglect.
But there is another way, completely different, that starts outside of physics. If I roll a die and ask you which number will come up, you can only assign equal probability to all the faces; in other words, you assign equal probability to every possible state when you can do no better. If I tell you that my die is loaded and only even numbers come up, you can make a better prediction. The more information you have, the more accurate your prediction will be. This is how SM works: it makes predictions about the state of a system from the partial knowledge you have of it. The amount of missing information is called "entropy". You find the distribution that maximizes your missing information compatibly with your a priori information: with no information at all you obtain the uniform density (as for the die); if your only information is the energy, you obtain the canonical density.
So SM doesn't "guarantee" that the prediction works, but it provides a criterion for constructing an estimator from partial knowledge. You have to view the ensembles in that spirit and never look for a dynamical meaning. Moreover, thermodynamic systems need only a few bits of information (temperature, pressure and volume) to make excellent predictions. That is very peculiar, and it comes from the law of large numbers: a vast number of microstates share the same macroscopic behavior.

Ll.
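The maximum-entropy recipe in the post above can be made concrete with a toy calculation. A minimal sketch, assuming a discrete set of energy levels and units with k_B = 1: the entropy-maximizing distribution under a mean-energy constraint has the Boltzmann form p_i proportional to exp(-beta * E_i), and the code below just fixes beta numerically by bisection so that the constraint is met:

```python
import math

def maxent_distribution(energies, mean_energy, beta_lo=-50.0, beta_hi=50.0):
    """Maximum-entropy distribution over discrete energy levels subject to a
    fixed mean energy: p_i ~ exp(-beta * E_i), with beta found by bisection.
    With no constraint at all the answer would be uniform, as for the die."""
    def avg_energy(beta):
        weights = [math.exp(-beta * e) for e in energies]
        z = sum(weights)  # partition function
        return sum(w * e for w, e in zip(weights, energies)) / z

    for _ in range(200):  # <E>(beta) is monotonically decreasing in beta
        mid = 0.5 * (beta_lo + beta_hi)
        if avg_energy(mid) > mean_energy:
            beta_lo = mid
        else:
            beta_hi = mid
    beta = 0.5 * (beta_lo + beta_hi)
    weights = [math.exp(-beta * e) for e in energies]
    z = sum(weights)
    return beta, [w / z for w in weights]

# Three levels, constrained mean energy below the uniform average:
beta, probs = maxent_distribution([0.0, 1.0, 2.0], mean_energy=0.5)
print(beta, probs)  # positive beta: the lowest level is the most probable
```

With the constraint removed (mean energy equal to the uniform average) beta goes to zero and the distribution becomes uniform, which is exactly the die example.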
 
  • #15
Llewlyn said:
for example, in this thread only one person is replying to the author, something that never happens in topics about the principles of quantum mechanics. So I understand why Zacku feels "alone", but his question is perfectly logical. Rather than insert myself in the dialogue with atyy

Just to point out that Count Iblis made an excellent post too (OK, that's 2 people, and now 3 :smile:).
 
  • #16
Llewlyn said:
But there is another way, completely different, that starts outside of physics. [...] You find the distribution that maximizes your missing information compatibly with your a priori information: with no information at all you obtain the uniform density (as for the die); if your only information is the energy, you obtain the canonical density.

What I don't understand about this approach is: first I know only the energy, then I calculate. Now, if I make a measurement of the pressure, shouldn't I recalculate my entropy on the assumption of fixed energy and pressure? Then the entropy would decrease each time I made a measurement?
 
  • #17
Llewlyn said:
So the first way to justify the use of the ensembles is to (try to) derive those hypotheses directly from the dynamics; this path leads to the ergodic problem [...] But there is another way, completely different, that starts outside of physics. [...] So SM doesn't "guarantee" that the prediction works, but it provides a criterion for constructing an estimator from partial knowledge.

I know all these approaches, and actually my favorite is the last one. Better than that, I think the last procedure you mentioned is the correct way to apply probability theory to classical or quantum mechanics (in the context of the foundations of statistical mechanics, I recall).
Precisely, it is because of this procedure that I asked the question about the meaning of the equiprobability principle. As a matter of fact, consider a Hamiltonian system for which the energy is a constant of motion (known within a given uncertainty). If this system is composed of N particles confined in a volume V, you find that the best a priori probability distribution is equiprobability of the microstates on the constant-energy hypersurface. OK. Now, why does everyone then say that "the equilibrium ensemble distribution for this system is the microcanonical one"?
How can we say that, since we only know E, V and N? We don't even know whether the system is at equilibrium or out of equilibrium. Indeed, as we don't know at all what the microstate of the system is, all states corresponding to the values E, V, N are allowed with the same probability: equilibrium ones as well as out-of-equilibrium ones.
By chance, if N is large, it seems that the microcanonical averages coincide with the equilibrium values of other macrovariables (for instance intensive ones), but that's all.
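That "by chance, if N is large" can be quantified. A minimal sketch, modelling only which half of the box each of n independent, uniformly placed molecules occupies (so the left-half count is binomial), computing the fraction of equiprobable microstates whose density is visibly non-uniform:

```python
import math

def prob_density_deviation(n: int, tol: float) -> float:
    """For n independently, uniformly placed molecules, the probability that
    the fraction found in the left half of the box deviates from 1/2 by more
    than tol.  Exact binomial sum, done in log-space to survive large n."""
    log_half_n = n * math.log(0.5)  # log of (1/2)**n
    total = 0.0
    for k in range(n + 1):
        if abs(k / n - 0.5) > tol:
            # log of binomial coefficient C(n, k), via log-gamma
            log_binom = (math.lgamma(n + 1) - math.lgamma(k + 1)
                         - math.lgamma(n - k + 1))
            total += math.exp(log_binom + log_half_n)
    return total

for n in (100, 1_000, 10_000):
    print(n, prob_density_deviation(n, 0.05))
# the 'non-uniform' fraction is bounded by ~ 2*exp(-2*n*tol**2) (Hoeffding),
# so for N ~ 10**23 it is utterly negligible
```

This is the whole content of the "coincidence": under equiprobability, uniform-density microstates do not merely dominate, they dominate by a factor that grows exponentially in N.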
 
  • #18
atyy said:
What I don't understand about this approach is - first I know only the energy, then I calculate. Now if I make a measurement of the pressure, shouldn't I recalculate that my entropy on assumption of fixed energy and pressure? Then entropy would decrease each time I made a measurement?

Indeed, yes: your entropy decreases if you know more about your system. In fact, if you knew all the particle positions and velocities of a classical gas, your entropy would be zero. You have to look at this quantity not as something you could measure with an "entrometer" but as the expectation value that the theory predicts. The decrease of entropy should not be regarded as a contradiction of the second law; in fact, your system is and stays at equilibrium, and its entropy has a fixed value; you are only adjusting your prediction based on your knowledge.

Zacku said:
How can we say that, since we only know E, V and N? We don't even know whether the system is at equilibrium or out of equilibrium. Indeed, as we don't know at all what the microstate of the system is, all states corresponding to the values E, V, N are allowed with the same probability: equilibrium ones as well as out-of-equilibrium ones.

When you compute ensemble averages, you are observing your system on a timescale over which you assume the average properties you are interested in don't change. For example, it doesn't make sense to read off the predictions of SM for a gas on a microscopic timescale, because that isn't the correct timescale. Besides, it may happen that you try to apply SM to a gas evolving at the macroscale, for example in an adiabatic expansion. SM allows you to derive the collective properties of your system that depend on THAT state. So, if your system is evolving, you of course cannot take the predictions seriously, because the state, in terms of macroscopic variables, is not the same. Trying to describe macroscopic evolution in terms of a density operator is non-equilibrium SM, a fertile research field (of which I know nothing).

Ll.
 
  • #19
Llewlyn said:
When you compute ensemble averages, you are observing your system on a timescale over which you assume the average properties you are interested in don't change.
I'm not sure about that. Ensemble averages also exist in non-equilibrium statistical mechanics, and correspond to macrovariables at a given time t0. Then, at a time t0 + dt, you can have a different probability distribution, which leads to different values of the macrovariables of interest... but I'm not sure I understood what you meant...

For example, it doesn't make sense to read off the predictions of SM for a gas at the microscopic scale, because that isn't the correct timescale.
Of course, because then all the information about the system is known, and probability can't come from nowhere (it comes from a lack of information).

Besides, it may happen that you try to apply SM to a gas evolving at the macroscale, for example an adiabatic expansion. SM allows you to derive the collective properties of your system that depend on THAT state.
Which state?

So if your system is evolving you of course cannot take the predictions seriously, because the state, in terms of macroscopic variables, is not the same. Trying to describe macroscopic evolution with a density operator is non-equilibrium SM, a fertile research field (about which I know nothing).
Ll.
It can actually be done; you can see some papers by Roger Balian on this subject here:
http://arxiv.org/abs/cond-mat/9907015v1
 
  • #20
You're right, some of my statements fail.
The paper is very interesting; I'm going to read it.

What is now the exact question of the thread?

Ll.
 
  • #21
What is now the exact question of the thread?

There is an ensemble of questions. :approve:
 
  • #22
Count Iblis said:
There is an ensemble of questions. :approve:

That's quite right :biggrin:.

Llewlyn said:
Which is now the exact question of the thread?

One of the issues I want to stress can be expressed by a sentence I wrote above:

Then, if this system is composed of N particles and is confined in a volume V, you find that the best a priori probability distribution is the equiprobability of microstates on the hypersurface at constant energy. OK. Now, why does everyone then say that "the equilibrium ensemble distribution for this system is the microcanonical one"?
How can we say that? Since we only know E, V and N, we don't even know if the system is at equilibrium or out of equilibrium.

We all know here a part of the answer, i.e. why it is correct for practical purposes, but the fact is that I have never read any comment about that point in the papers or textbooks I could study on the subject; the microcanonical ensemble is always directly considered as an equilibrium ensemble.
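For reference, the equiprobability postulate under discussion is just the statement that, given only (E, V, N), the microcanonical density is uniform on the (thickened) energy shell; this is a standard definition, not an argument for why it describes equilibrium:

```latex
\rho_{\mathrm{mc}}(q,p) =
\begin{cases}
\dfrac{1}{\Omega(E,V,N)} & \text{if } E \le H(q,p) \le E + \Delta E, \\[4pt]
0 & \text{otherwise,}
\end{cases}
\qquad
S(E,V,N) = k_B \ln \Omega(E,V,N).
```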
 
  • #23
Llewlyn said:
Indeed yes, your entropy decreases if you know more about your system. In fact, if you knew all the positions and velocities of the particles of a classical gas, your entropy would be zero. You have to look at this quantity not as something you could measure with an "entrometer" but as the expectation value that the theory predicts. The decrease of entropy should not be regarded as a contradiction of the second principle; in fact your system is and stays at equilibrium and the value of its entropy is fixed, you are only adjusting your prediction based on your knowledge.

I wasn't thinking so much about measuring the microscopic properties of the system, like positions and velocities. I was thinking about a macroscopic property like pressure. In the "information only" approach, we should in principle throw out of our microcanonical ensemble all the microstates not consistent with the measured pressure (now I understand why Zacku said that :smile:). So now we would have a different ensemble, yet it seems that measuring a macroscopic property should not require that we stop using the microcanonical ensemble.

Also, the "information only" approach should contain physics. Although we do not know the initial microstate, we do know the microscopic Hamiltonian dynamical equations for the system, and that should count as prior knowledge. In fact, the Hamiltonian dynamics is the reason we know in advance that the energy is constant.

Going back to the idea that we should throw out microstates if we measure the pressure, I agree with Zacku that the reason we are saved is that it doesn't matter too much whether we use the microcanonical, canonical, etc. ensembles. They all miraculously give us the same information. And they also miraculously describe equilibrium for systems with completely different Hamiltonians.

As Zacku pointed out, the central limit theorem is another case where we also miraculously get the same result regardless of the detailed composition of the system.
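The "miraculous" insensitivity to microscopic details that the central limit theorem provides can be shown with a toy numerical sketch (my own illustration, not from the thread): standardized sums drawn from two very different "microscopic" distributions end up looking like the same Gaussian.

```python
import random
import statistics

random.seed(0)

def standardized_sums(draw, n=200, trials=2000):
    """Standardize `trials` sums of n i.i.d. draws from `draw`."""
    sums = [sum(draw() for _ in range(n)) for _ in range(trials)]
    mu = statistics.fmean(sums)
    sigma = statistics.stdev(sums)
    return [(s - mu) / sigma for s in sums]

# Two very different microscopic laws: uniform on [0, 1) and exponential.
results = {}
for name, draw in [("uniform", random.random),
                   ("exponential", lambda: random.expovariate(1.0))]:
    z = standardized_sums(draw)
    # For a standard Gaussian, about 68% of the mass lies within one sigma.
    results[name] = sum(abs(v) < 1.0 for v in z) / len(z)
    print(f"{name:12s} fraction within one sigma: {results[name]:.2f}")
```

Both fractions come out close to the Gaussian value of about 0.68, even though the underlying distributions are nothing alike.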

Yet another case where the details of the system don't seem to matter is for the exponents near the critical point, which seem to be the same for all sorts of different materials. I think the nice thing about this case is we have a good theory, for some model systems, of why the microscopic Hamiltonian permits macroscopic "universality".

So I think the "information alone" approach in the first place is not a good description of itself. Also, it should be the microscopic physics that permits or does not permit us to "coarse grain" successfully. Pretty much all the same points Zacku made - except he likes the "information only" approach!
 
  • #24
Zacku said:
It can actually be done; you can see some papers by Roger Balian on this subject here:
http://arxiv.org/abs/cond-mat/9907015v1

Thanks! Interesting stuff, especially his other work on quantum measurement. We shall see if it works out, but I much prefer it to "many worlds"!:smile:
 
  • #25
Hmm... I'm having trouble following you.
Do the troubles concern only the microcanonical distribution? If so, why is the canonical one a suitable equilibrium distribution?
The microcanonical ensemble is a very peculiar way to manage information: it consists in assigning zero knowledge over a particular set of states. In other words, the information is inserted directly in the Hilbert space and not treated "statistically" (does that sound correct to you?). In the "information-only" approach, when we know things we put them into the exponential, obtaining something similar to the Boltzmann–Gibbs distribution, i.e. the canonical one. What is not trivial (for me) is to understand why that description is correct only for an equilibrium state (your point?).
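The "information-only" construction being described is the usual maximum-entropy one: maximizing the entropy subject to normalization and a fixed mean energy yields the canonical (Boltzmann–Gibbs) distribution, with the inverse temperature appearing as a Lagrange multiplier. Schematically:

```latex
\max_{\{p_i\}} \Big( -\sum_i p_i \ln p_i \Big)
\quad \text{subject to} \quad
\sum_i p_i = 1, \qquad \sum_i p_i E_i = \langle E \rangle
\;\;\Longrightarrow\;\;
p_i = \frac{e^{-\beta E_i}}{Z}, \qquad Z = \sum_i e^{-\beta E_i},
```

where $\beta$ is the Lagrange multiplier conjugate to $\langle E \rangle$.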

Besides, it may happen that you try to apply SM to a gas evolving at the macroscale, for example an adiabatic expansion. SM allows you to derive the collective properties of your system that depend on THAT state.

I was referring to "state" in the sense of macrostate, i.e. the description by the constraints that I assumed when I maximized the entropy. Intuitively, I was thinking that since the properties depend on the state, the predictions will not work if the state changes (my English is bad and the thought obscure... am I clear?).

Ll.
 
  • #26
Zacku said:
I know all these approaches and actually my favorite is the last one.
How can we say that? Since we only know E, V and N, we don't even know if the system is at equilibrium or out of equilibrium. Indeed, as we don't know at all what the microstate of the system is, all states corresponding to the values E, V, N are allowed with the same probability: equilibrium ones as well as out-of-equilibrium ones.

Wait, but what do you mean by equilibrium? If you know E, V, N and you build the microcanonical ensemble, you are out of equilibrium if E, V or N changes in time. And your description of course is not correct anymore.

Ll.
 
  • #27
Llewlyn said:
Wait, but what do you mean by equilibrium? If you know E, V, N and you build the microcanonical ensemble, you are out of equilibrium if E, V or N changes in time. And your description of course is not correct anymore.

Ll.

Excuse me, but if you take a system of N particles in a box of volume V for which the energy is a constant of motion, I don't understand where you see any sign of equilibrium or, inversely, of out of equilibrium. Equilibrium, for me, actually needs more information about the system, such as making a partition of space and measuring the values of intensive variables (such as temperature or particle density) in each subspace of the partition, etc.

Do the troubles concern only the microcanonical distribution?
No, it concerns all the canonical ensembles according to me, but perhaps I am wrong. If I talk quite frequently about the microcanonical ensemble, it is because it is the simplest ensemble to understand whatever the point of view adopted (the ergodic one, the Bayesian one, etc.).

In the "information-only" approach, when we know things we put them into the exponential, obtaining something similar to the Boltzmann–Gibbs distribution, i.e. the canonical one. What is not trivial (for me) is to understand why that description is correct only for an equilibrium state (your point?).
Yes, I think so.
 
  • #28
Zacku said:
Excuse me, but if you take a system of N particles in a box of volume V for which the energy is a constant of motion, I don't understand where you see any sign of equilibrium or, inversely, of out of equilibrium. Equilibrium, for me, actually needs more information about the system, such as making a partition of space and measuring the values of intensive variables (such as temperature or particle density) in each subspace of the partition, etc.

That sounds like thermodynamic equilibrium, which exists only for a subclass of systems such as gases. For example, consider a gravitating system of particles: SM predicts gravitational collapse as the "equilibrium state", but can you define some intensive quantity? It is absolutely not homogeneous. I define equilibrium as "average quantities that do not change in time"; I prefer that definition to "no gradient anywhere" because in many dynamical systems there are equilibrium states that are not homogeneous (all except the thermodynamic ones).

(but maybe it's all ******** :-))

Ll.
 
  • #29
Llewlyn said:
That sounds like thermodynamic equilibrium, which exists only for a subclass of systems such as gases. For example, consider a gravitating system of particles: SM predicts gravitational collapse as the "equilibrium state", but can you define some intensive quantity? It is absolutely not homogeneous. I define equilibrium as "average quantities that do not change in time"; I prefer that definition to "no gradient anywhere" because in many dynamical systems there are equilibrium states that are not homogeneous (all except the thermodynamic ones).

(but maybe are all ******** :-))

Ll.

I agree with you that the example of the density of particles is rather limited to homogeneous gases (unless an external field is present). However, the other "natural" intensive macrovariables, such as pressure or temperature, must have stationary values "everywhere" at equilibrium.
 
  • #30
The "natural" intensive variables are the Lagrange multipliers of the extensive ones. How can you tell that they are constant?

BTW, I sent you a PM.

Ll.
 
  • #31
In general you don't have extensive variables. I think there has been some recent work on self-gravitating systems, and it was argued that you could treat these systems well within the microcanonical ensemble but not in the canonical ensemble.
 
  • #32
Count Iblis said:
In general you don't have extensive variables. I think there has been some recent work on self-gravitating systems, and it was argued that you could treat these systems well within the microcanonical ensemble but not in the canonical ensemble.

This is just because in gravitational systems we are dealing with long-range interactions, and it is well known nowadays that in this case the ensemble equivalence breaks down; this is a "new field" (that started about 10 years ago, I think) in the statistical mechanics of systems in or out of equilibrium.

Llewlyn said:
The "natural" intensive variables are the Lagrange multipliers of the extensive ones. How can you tell that they are constant?
Yes, but you can measure them, can't you? Pressure and temperature are not "impossible to measure" variables as far as I know.
Although I think the Bayesian point of view is the correct procedure for applying probability theory to any dynamical system (that's what I think, at least), there are things in it that are quite obscure to me, such as the meaning of statistical averages for expressing macrovariable values measured on a single system. In the case of E, V, N, Balian would say "OK, we have <E>, V, <N>, and then we can apply the maximum entropy principle and, tadaa, we find the grand canonical distribution". I actually don't understand very well the meaning of ensemble averages if we are interested in one system (as is often the case in practice). I don't understand either whether saying that <E> is constant means that we are at equilibrium or not.
What is more troublesome for me is that, traditionally, the canonical and grand canonical ensembles are "justified" by the fact that there is an ensemble equivalence with the microcanonical ensemble (which is the only ensemble that can be found without statistical postulates about what is measured, in both the ergodic and the Bayesian points of view).
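For completeness, the grand canonical distribution alluded to above follows from the same maximum-entropy step with both ⟨E⟩ and ⟨N⟩ as constraints, β and μ appearing as the corresponding Lagrange multipliers (a standard result, stated here for reference):

```latex
p_i = \frac{e^{-\beta (E_i - \mu N_i)}}{\Xi},
\qquad
\Xi(\beta, \mu, V) = \sum_i e^{-\beta (E_i - \mu N_i)}.
```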
 
  • #33
Zacku said:
Yes, but you can measure them, can't you? Pressure and temperature are not "impossible to measure" variables as far as I know.

For me it is not so obvious; for example, temperature may assume negative values in SM. In general I think of them as Lagrange multipliers, and only when the system is thermodynamic do I associate the meaning of temperature or pressure, due to the laws of thermodynamics. The definition of equilibrium I gave satisfies me, although I'm sure I haven't completely understood the matter.

Ll.
 
  • #34
Llewlyn said:
The definition of equilibrium I gave satisfies me, although I'm sure I haven't completely understood the matter.
Ll.

How can you apply the definition of equilibrium you gave to the simple case of a gas of N particles in a box of volume V with constant energy E (thanks to the Hamiltonian dynamics)?
The fact is that I have never read (if I remember well) a paper about out-of-equilibrium statistical mechanics in which an ensemble probability distribution tends to a canonical distribution at equilibrium (where the word "tends" has to be clarified, since it can only have a statistical meaning, as in the Boltzmann H theorem).
 
  • #35
atyy said:
So I think the "information alone" approach in the first place is not a good description of itself. Also, it should be the microscopic physics that permits or does not permit us to "coarse grain" successfully. Pretty much all the same points Zacku made - except he likes the "information only" approach!

Thinking about it since the opening of this thread, I think that the indifference principle allows one to retrieve the microcanonical distribution with a correct use of probability theory.
However, its other use, that is, when assuming that the values of measured macrovariables are ensemble averages (as stated for instance by R. Balian), is more difficult for me to understand. In quantum mechanics, when we make a measurement, we must find eigenvalues of the observable of interest; that is, we don't find a quantum average value unless we make more than one measurement.
As a matter of fact, if one is able to know the expression of a macrovariable observable, then an apparatus will measure a time average of this quantity (even if we are talking about QM). Assuming that this time average equals an ensemble average is equivalent, according to me, to the ergodic problem.
It seems, as we said earlier, that the answer to the statistical mechanics "problem" is not the ergodic theorem since, despite the famous work of Sinai on this subject, it is too restrictive to build the bridge between time averages and ensemble averages.
It seems that the answer is perhaps in the ideas raised by Khinchin (I'm reading his book on statistical mechanics).
So I agree with your comment, atyy: there must be something else than only statistical inference (except for the microcanonical case, I would say) to explain the canonical ensembles. Note that this doesn't solve the problem of the existence, or not, of a real ensemble probability distribution for a system at equilibrium.
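The time-average versus ensemble-average question can at least be seen at work in a toy example (my own sketch, not from Khinchin or this thread): for an ergodic system, the time average of an observable along a single trajectory approaches the ensemble average over the invariant measure.

```python
import math

# Irrational rotation on the circle [0, 1): x -> (x + alpha) mod 1.
# This map is ergodic with respect to the uniform measure, so the time
# average of the observable f(x) = x along one orbit should approach
# the ensemble average, which is 1/2 for the uniform measure.
alpha = math.sqrt(2) - 1.0   # irrational rotation number
x = 0.1                      # arbitrary initial condition
n_steps = 100_000

total = 0.0
for _ in range(n_steps):
    total += x               # accumulate the observable f(x) = x
    x = (x + alpha) % 1.0

time_average = total / n_steps
ensemble_average = 0.5       # mean of x under the uniform measure on [0, 1)
print(f"time average = {time_average:.4f}, ensemble average = {ensemble_average}")
```

The single-trajectory average lands very close to 1/2, which is exactly the equality between time and ensemble averages that the ergodic problem asks about for realistic Hamiltonian systems.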
 