States of equal energy are equally probable

In summary, the conversation revolves around the concept of thermal equilibrium and its relation to entropy and information theory in statistical mechanics. The initial question is whether the equal probability of states with the same energy in thermal equilibrium is an empirical observation or can be derived from quantum mechanics. It is then mentioned that entropy in information theory is the measure of missing information in a system. The conversation also touches upon the ergodic hypothesis and the different ensembles used in statistical mechanics. One speaker argues that the equal probability of states with the same energy applies only to a specific state of the system, the microcanonical ensemble, and that different probabilities are observed in the canonical and grand canonical ensembles once the states of the reservoir are taken into account.
  • #1
msumm21
TL;DR Summary
Question about why states of equal energy are equally probable
I'm reading "Statistical Mechanics: A Set of Lectures" by Feynman.

On page 1 it says that, for a system in thermal equilibrium, the probabilities of being in two states of the same energy are equal. I'm wondering if this is an empirical observation or if it can be derived from QM?
 
  • #2
Thinking about statistical physics becomes much clearer when you have an idea about information theory. In information theory, entropy is the "measure of missing information" under the constraint of what information is given about the system.

Thermal equilibrium is the state of maximum entropy, i.e., you maximize the entropy with the information given just about the additive conserved quantities.

In quantum statistical physics the entropy is given by
$$S=-k_{\text{B}} \mathrm{Tr} (\hat{\rho} \ln \hat{\rho}),$$
where ##\hat{\rho}## is the statistical operator. Suppose you know that your system is in a state of definite energy ##E## ("microcanonical ensemble"), and let ##|E,\alpha \rangle## be a set of orthonormal energy eigenstates with the energy eigenvalue ##E##. Since the equilibrium state commutes with ##\hat{H}## (because it's time-independent), you can choose this orthonormal set to be the eigenvectors of ##\hat{\rho}##, i.e., you have
$$\hat{\rho}=\sum_{\alpha} p_{\alpha} |E,\alpha \rangle \langle E,\alpha|,$$
and the entropy is
$$S=-k_{\text{B}} \sum_{\alpha} p_{\alpha} \ln p_{\alpha},$$
and this must be maximized under the constraint that ##\mathrm{Tr} \hat{\rho}=\sum_{\alpha} p_{\alpha}=1##. This you do with a Lagrange multiplier in the variation of ##S##:
$$\delta S= -k_{\text{B}} \sum_{\alpha} \delta p_{\alpha} (\ln p_{\alpha} -1-\lambda)=0.$$
Now you can vary the ##p_{\alpha}## independently, and thus the bracket under the sum must vanish for each ##\alpha##. This implies that
$$\ln p_{\alpha} =1+\lambda=\text{const} \; \Rightarrow \; p_{\alpha}=p.$$
Further, normalization requires
$$\sum_{\alpha} p=g p=1,$$
where ##g=\mathrm{dim} \text{Eig}(E)## is the degeneracy of the energy eigenvalue. Thus you get ##p=1/g## and
$$\hat{\rho} = \frac{1}{g} \sum_{\alpha} |E,\alpha \rangle \langle E,\alpha|.$$
From the information-theoretical point of view that's pretty obvious: If you don't know anything else about the system than its energy, then you only know that it must be in an energy eigenstate with the corresponding eigenvalue, ##E##, and thus the choice of probabilities of the "minimal prejudice" (i.e., that of the maximal missing information or entropy) is such that none of these energy eigenstates is in any way distinguished from the others, i.e., they must all be equally probable.
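A minimal numerical sketch of this maximization (my addition, not from the post; the degeneracy ##g=5## is an arbitrary example value, and scipy's constrained optimizer stands in for the variational argument): maximizing ##S## under the normalization constraint should reproduce ##p_{\alpha}=1/g## and ##S/k_{\text{B}}=\ln g##.

```python
# Numerical sketch: maximize S = -sum_a p_a ln p_a subject to
# sum_a p_a = 1 for g degenerate states; the maximum should be the
# uniform distribution p_a = 1/g with S/k_B = ln g.
import numpy as np
from scipy.optimize import minimize

g = 5  # degeneracy of the energy eigenvalue (arbitrary example value)

def neg_entropy(p):
    # minimizing -S is the same as maximizing S; clip avoids log(0)
    p = np.clip(p, 1e-12, 1.0)
    return np.sum(p * np.log(p))

normalization = {"type": "eq", "fun": lambda p: np.sum(p) - 1.0}
p0 = np.random.default_rng(1).dirichlet(np.ones(g))  # random normalized start
result = minimize(neg_entropy, p0, bounds=[(0.0, 1.0)] * g,
                  constraints=[normalization])

print(result.x)     # ~ [0.2, 0.2, 0.2, 0.2, 0.2], i.e. p = 1/g
print(-result.fun)  # ~ ln(5) = 1.609..., the maximal S/k_B
```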

The issue becomes a bit more complicated if your Hamiltonian has a continuous spectrum. Then the microcanonical ensemble must be defined such that the energy value lies in some small "energy shell", ##E \in [E_0-\delta E/2,E_0+\delta E/2]##.
 
  • #3
msumm21 said:
Question about why states of equal energy are equally probable
This is true only in one particular state of the system, namely in the microcanonical ensemble. Far more common in statistical mechanics are the canonical ensemble and the grand canonical ensemble, where the probabilities are very different.
 
  • #4
I guess the question is answered already, but thinking another round: all this rests on some form of ergodic hypothesis, or on the assumption that each microstate has equal a priori probability. This may seem "natural", but I think it's not innocent. The problem is that you can handpick the microstructure to bias the expectations on the macrostate in the desired direction, making the arguments subjective, or relative to the microstructure.

So the question is how you motivate the microstate in the first place without a lot of fine-tuning in the "space of microstructures". It seems you can often make up a microstructure to support an entropic argument at the macroscopic level, but then the explanatory power is rooted in the choice of microstructure. A lot here is not satisfactory. In classical physics this is ignored, but the problem arises when you want the entropic arguments to be well motivated from a real agent. We get a tasty Bayesian soup.

/Fredrik
 
  • #5
vanhees71 said:
From the information-theoretical point of view that's pretty obvious: If you don't know anything else about the system than its energy, then you only know that it must be in an energy eigenstate with the corresponding eigenvalue, E, and thus the choice of probabilities of the "minimal prejudice" (i.e., that of the maximal missing information or entropy) is such that none of these energy eigenstates is in any way distinguished from the others, i.e., they must all be equally probable
Thanks for the response. It seems like that argument, applied to a weighted die, would say that each side is equally probable. If I know the system (die) but not the state it is in (side faced up after rolling), then the condition of max uncertainty is a function of the weighting, not 1/N_sides. E.g., if it's weighted to roll 1 with probability 25%, then the state of max uncertainty should also say 25% for 1, right? If I claim the probability of 1 is less than 25%, I must have some "extra information" about the way it was rolled or something.

A. Neumaier said:
This is true only in one particular state of the system, namely in the microcanonical ensemble.
Thanks. I'm disappointed the book didn't mention this. So whatever they derive from this will be limited to special systems, maybe they'll mention later, or it will become clearer.
 
  • #6
Of course, if your die is "unfair", then you have to assign new probabilities for the outcome, given the information that it's weighted. The simplest solution is to just experimentally determine (approximate) probabilities for each outcome.
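As a sketch of how that reassignment still follows the maximum-entropy logic (my illustrative example, with the 25% weighting taken from the die discussed above): imposing ##p_1 = 0.25## as an extra constraint and maximizing the entropy spreads the remaining probability uniformly over the other five faces.

```python
# Maximum-entropy assignment for the weighted die: if the only
# information beyond normalization is P(1) = 0.25, maximizing
# S = -sum_i p_i ln p_i leaves the rest uniform: p_i = 0.75/5 = 0.15.
import numpy as np
from scipy.optimize import minimize

def neg_entropy(p):
    p = np.clip(p, 1e-12, 1.0)
    return np.sum(p * np.log(p))

constraints = [
    {"type": "eq", "fun": lambda p: np.sum(p) - 1.0},  # normalization
    {"type": "eq", "fun": lambda p: p[0] - 0.25},      # known weighting
]
result = minimize(neg_entropy, np.full(6, 1 / 6),
                  bounds=[(0.0, 1.0)] * 6, constraints=constraints)

print(result.x)  # ~ [0.25, 0.15, 0.15, 0.15, 0.15, 0.15]
```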
 
  • #7
A. Neumaier said:
This is true only in one particular state of the system, namely in the microcanonical ensemble. Far more common in statistical mechanics are the canonical ensemble and the grand canonical ensemble, where the probabilities are very different.
But to derive the statistics in the canonical and grand canonical ensembles, you must still assume that the probabilities of the system being in any microstate are all equal. It is when one takes into account the states of the reservoir that one gets different probabilities of finding the system in a given state.
msumm21 said:
Thanks. I'm disappointed the book didn't mention this. So whatever they derive from this will be limited to special systems, maybe they'll mention later, or it will become clearer.
As noted above, I disagree with this interpretation and think that the statement is universal. One only has to be careful in what it means.
 
  • #8
Fra said:
how do you motivate the microstate in the first place,
It is just the definition of the special state called microcanonical.
msumm21 said:
Thanks. I'm disappointed the book didn't mention this.
This is just the conventional starting point, corresponding to a noninformative prior (that you know nothing at all about a system). Once you assume that you can know more, and in the applications you always have to know more to do something useful, this potentially available knowledge affects the possible states. For example, assuming that the only thing that matters (and hence can be known) is the mean energy, you get the canonical ensemble, parameterized by temperature in 1-1 correspondence with the possible mean energies.
msumm21 said:
So whatever they derive from this will be limited to special systems, maybe they'll mention later, or it will become clearer.
It will become clearer when they introduce the other ensembles.
DrClaude said:
But to derive the statistics in the canonical and grand canonical ensembles, you must still assume that the probabilities of the system being in any microstate are all equal.
Only if one wants to derive these ensembles from the microcanonical ensemble by means of the maximum entropy principle.

But to define the canonical and grand canonical ensembles, there is no need at all to start from the microcanonical ensemble or to invoke the maximum entropy principle. As everywhere in quantum mechanics, the state determines the probabilities, and one can just postulate the family of states of interest - as it is done elsewhere in quantum mechanics. (For example, coherent states in quantum optics are not derived from a noninformative prior, but just postulated because of their quasiclassical behavior.)

This is actually much more economical than proceeding the traditional way. See the chapters on statistical mechanics in my online book
There the microcanonical ensemble is not even introduced, and still everything of interest in equilibrium thermodynamics is derived in an efficient manner.
 
  • #9
DrClaude said:
But to derive the statistics in the canonical and grand canonical ensembles, you must still assume that the probabilities of the system being in any microstate are all equal. It is when one takes into account the states of the reservoir that one gets different probabilities of finding the system in a given state.
The distinction between being in some (pure, micro-)state and being found in some (mixed, macro-)state is fraught with difficulties.

Thinking that the system is actually in a pure state, and that the mixed state is only due to lack of knowledge, is fallacious. Indeed, each system that can be observed is obtained from a larger system containing part of its environment by taking a partial trace. This partial trace already introduces the mixedness, independent of any observation!
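A small sketch of this point (my example, not from the post): a Bell state of two qubits is pure, yet the partial trace over one qubit leaves the other in a maximally mixed state, with no observation involved.

```python
# A Bell state is pure, but tracing out one qubit leaves the other
# maximally mixed -- the mixedness comes from the partial trace alone.
import numpy as np

# |psi> = (|00> + |11>)/sqrt(2), a pure two-qubit state
psi = np.array([1, 0, 0, 1]) / np.sqrt(2)
rho = np.outer(psi, psi.conj())

# Partial trace over the second qubit: rho_A[i,k] = sum_j rho[ij,kj]
rho_A = np.einsum('ijkj->ik', rho.reshape(2, 2, 2, 2))

print(np.trace(rho @ rho).real)      # 1.0  -> global state is pure
print(rho_A)                         # 0.5 * identity, maximally mixed
print(np.trace(rho_A @ rho_A).real)  # 0.5  -> reduced state is mixed
```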
 
  • #10
Of course both the canonical and the grand-canonical ensemble are derived by the maximum entropy principle. The canonical ensemble is defined as a system with the particle number exactly given (e.g., a gas confined in a container) but coupled to "a heat bath", i.e., energy/heat can be exchanged with the environment. In this case what's known is the fixed number of particles and the average energy. The entropy must be maximized under the constraints ##\mathrm{Tr} \hat{\rho}=1## and ##\langle H \rangle=U=\mathrm{Tr} (\hat{\rho} \hat{H})##. The trace has to be taken in the subspace of fixed particle number ##N## (to make sense for an equilibrium state, this particle number must be conserved). So you have
$$\delta S - k_{\text{B}} \beta \mathrm{Tr} \hat{H} \delta \hat{\rho} -k_{\text{B}} (\Omega-1) \mathrm{Tr} \delta \hat{\rho}=0,$$
where ##\beta## and ##\Omega-1## are Lagrange multipliers. With
$$\delta S = -k_{\text{B}} \delta \mathrm{Tr} \hat{\rho} \ln \hat{\rho}=-k_{\text{B}} \mathrm{Tr} \delta \hat{\rho}(1+\ln \hat{\rho}).$$
Thus the maximum-entropy principle implies
$$-\ln \hat{\rho}-1-\Omega+1 -\beta \hat{H}=0 \; \Rightarrow \; \hat{\rho}=\exp(-\beta \hat{H}-\Omega).$$
From ##\mathrm{Tr} \hat{\rho}=1## you get
$$\exp(-\Omega) Z=1 \; \Rightarrow \; \Omega=\ln Z \quad \text{with} \quad Z=\mathrm{Tr} \exp(-\beta \hat{H}).$$
In the same way the grand-canonical ensemble is treating the system as part of a larger system, such that both energy and particles can be exchanged. The constraints are thus that the average energy and average particle number are given, and you get the corresponding Lagrange multipliers, ##\beta## and ##\alpha##. The final result is
$$\hat{\rho}=\frac{1}{Z} \exp(-\beta \hat{H}+\alpha \hat{N}), \quad Z=\mathrm{Tr} \exp(-\beta \hat{H} + \alpha \hat{N})$$
Usually instead of ##\alpha## you introduce the chemical potential via ##\alpha=\beta \mu##.
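A minimal numerical sketch of the canonical result (my addition; the two-level Hamiltonian and the inverse temperature are arbitrary example choices, with units ##k_{\text{B}}=1##): building ##\hat{\rho}=e^{-\beta \hat{H}}/Z## directly and checking the identity ##S=\beta U+\ln Z## that follows from ##\ln\hat{\rho}=-\beta\hat{H}-\ln Z##.

```python
# Canonical state rho = exp(-beta*H)/Z for an example two-level system;
# check Tr(rho) = 1 implicitly via the identity S = beta*U + ln Z.
import numpy as np
from scipy.linalg import expm, logm

beta = 2.0                      # inverse temperature (k_B = 1)
H = np.diag([0.0, 1.0])         # example two-level Hamiltonian

rho_unnorm = expm(-beta * H)
Z = np.trace(rho_unnorm)        # partition function Z = Tr exp(-beta H)
rho = rho_unnorm / Z

U = np.trace(rho @ H).real           # mean energy, the imposed constraint
S = -np.trace(rho @ logm(rho)).real  # entropy in units of k_B

print(Z, U, S)
print(np.isclose(S, beta * U + np.log(Z)))  # True: S = beta*U + ln Z
```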

Of course, you don't derive non-thermal states in this way, although you can of course use the maximum-entropy principle to derive any kind of probabilities by maximizing the entropy given the information you have, so that you get the probabilities of "minimal prejudice".

In other cases, like coherent states in quantum optics, you get the state in other ways. E.g., coherent states follow from the assumption that an electromagnetic field is due to a classical charge-current distribution. This has nothing to do with equilibrium thermodynamics though.
 
  • #11
msumm21 said:
Thanks for the response. It seems like that argument, applied to a weighted die, would say that each side is equally probable. If I know the system (die) but not the state it is in (side faced up after rolling), then the condition of max uncertainty is a function of the weighting, not 1/N_sides. E.g., if it's weighted to roll 1 with probability 25%, then the state of max uncertainty should also say 25% for 1, right? If I claim the probability of 1 is less than 25%, I must have some "extra information" about the way it was rolled or something.
I am very confused by your asking this question in the context of energy eigenstates. The "weighted" die is not an isolated system: it is weighted only because it is strongly coupled to the earth. The six possible outcomes of the earth-die system are then not six-fold degenerate (classically, they display a hierarchy of energies).
 
  • #12
A. Neumaier said:
It is just the definition of the special state called microcanonical.
Yes, but I didn't mean it's a mathematical problem; one can axiomatically construct many physical theories, and this is fine from the mathematical perspective. But axioms that are merely chosen do not come with any explanatory value.

The question for me was how to motivate those structures from the perspective of empirical inference by a real agent. Knowing the complete microstate of an identified microstructure and a given prior, you can "explain" by entropic/probabilistic arguments various things about macrostates, which are _defined_ by averaging out complexions. But WHO is the agent doing this averaging? If it's just a fictive mathematical averaging, then that is exactly the point, and for me it's a conceptual problem.

I think that does not reflect the situation of an intrinsic learning agent, as many of those things are themselves unknown. So the explanatory power still rests on arbitrary choices and definitions that need to be tuned to the desired result. This plagues a lot of "entropic reasoning": the argument, while formally sound, relies on choices that are not explained.

/Fredrik
 
  • #13
vanhees71 said:
Of course both the canonical and the grand-canonical ensemble are derived by the maximum entropy principle.
It can be derived, but it need not be!

vanhees71 said:
In other cases, like coherent states in quantum optics, you get the state in other ways.
These other ways are also available in equilibrium. An exponential form of the density is very natural and in many cases numerically tractable. This is sufficient motivation to directly postulate them.
 
  • #14
Equilibrium is defined as the maximum-entropy state through the H-theorem.
 
  • #15
vanhees71 said:
Equilibrium is defined as the maximum-entropy state through the H-theorem.
Equilibrium can also be defined through specifying the grand canonical ensemble. One can then prove the chemically much more relevant extremum principles of minimum internal energy or minimal Helmholtz energy, respectively, depending on the boundary conditions assumed. See Chapter 9 of my online book.

Specifying the grand canonical ensemble by a definition (which immediately yields a lot of applications) is much preferable to first specifying by a definition the microcanonical ensemble as prior (which is almost useless in practice) and then invoking the maximum entropy principle to derive the grand canonical ensemble.

In chemistry, the entropy is maximal only assuming a thermally and mechanically isolated system, a fairly unnatural assumption.
 
  • #16
The grand-canonical ensemble of course follows from the maximum-entropy principle. How else would you derive it? I have no clue what you have against the maximum-entropy principle, which is the most natural road to statistical mechanics through the H-theorem.

Of course all the other extremum principles of "chemistry" follow as usual from thermodynamics using the corresponding Legendre transformations between the various thermodynamic potentials.

The derivation of the grand-canonical ensemble from the maximum-entropy principle is indeed simpler than for the canonical ensemble. By definition the grand-canonical ensemble describes a system that can exchange energy and particles with its environment. The equilibrium distribution follows from maximizing the entropy under the constraint that the averages of the additive conserved quantities of the system are fixed, i.e., by
$$\delta S+k_{\text{B}} \alpha \, \delta \langle N \rangle - k_{\text{B}} \beta \, \delta \langle E \rangle +k_{\text{B}} (1+\Omega) \, \mathrm{Tr} \, \delta \hat{\rho}=0,$$
i.e.,
$$-\ln \hat{\rho} + \alpha \hat{N} -\beta \hat{H} + \Omega=0$$
or
$$\hat{\rho}=\exp (\Omega) \exp(-\beta \hat{H} + \alpha \hat{N}), \quad \Omega=-\ln Z, \quad Z=\mathrm{Tr} \exp(-\beta \hat{H}+\alpha \hat{N}).$$
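As a concrete check of this formula (my example, not from the post: a single fermionic mode of energy ##\epsilon##, so the Fock space is two-dimensional with ##\hat{H}=\mathrm{diag}(0,\epsilon)## and ##\hat{N}=\mathrm{diag}(0,1)##): the mean occupation computed from ##\hat{\rho}## should reproduce the Fermi-Dirac factor ##1/(e^{\beta\epsilon-\alpha}+1)##.

```python
# Grand-canonical state for a single fermionic mode: the mean
# occupation <N> = Tr(rho N) must equal 1/(exp(beta*eps - alpha) + 1).
import numpy as np
from scipy.linalg import expm

beta, alpha, eps = 1.5, 0.3, 1.0   # arbitrary example parameters

H = np.diag([0.0, eps])            # energies of |0> and |1>
N = np.diag([0.0, 1.0])            # occupation numbers of |0> and |1>

rho_unnorm = expm(-beta * H + alpha * N)
Z = np.trace(rho_unnorm)           # Z = Tr exp(-beta H + alpha N)
rho = rho_unnorm / Z

print(np.trace(rho @ N).real)                    # mean occupation
print(1.0 / (np.exp(beta * eps - alpha) + 1.0))  # Fermi-Dirac, same value
```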
 
  • #17
vanhees71 said:
The grand-canonical ensemble of course follows from the maximum-entropy principle. How else would you derive it?
As I understand Neumaier, he does not suggest that it cannot be derived, but that one can just define it, and thus postulate the concepts that define the equilibrium, such as potential and temperature?

The problem I see with the max-ent principle is that it is either ambiguous (requiring fine-tuning) or subjective (requiring evolution). But Neumaier's proposed solution is different from my thinking, so I'm not sure about the ultimate motivation.

/Fredrik
 
  • #18
The great discovery by Maxwell, Boltzmann, et al. in the 19th century was that thermodynamics can be derived from the fundamental laws of dynamics by statistical methods, and the maximum-entropy principle follows from the fundamental laws. Why should you "just define" a probability distribution, if you can derive it from the fundamental laws of nature? Physics is not politics, where you tweak and twist the statistics of all kinds of things so as to get out what you want to "sell" to the people, but a method to figure out how things really are ;-)).
 
  • #19
vanhees71 said:
thermodynamics can be derived from the fundamental laws of dynamics by statistical methods, and the maximum-entropy principle follows from the fundamental laws
As I understand Neumaier, he simply does not want to rely on wrong claims. There are too many possible maximum-entropy principles, and therefore a statement like "the maximum-entropy principle follows from the fundamental laws" cannot be true without further qualifications.
 
  • #20
It's very fundamental. Of course, there's the caveat that the interactions must fall off quickly enough. The standard Boltzmann-Gibbs statistics thus does not apply to gravitating systems, and that's why there is structure in the universe.

The H-theorem follows from the unitarity of the time evolution of quantum (field) theory, because it implies the (weak) detailed-balance property of S-matrix elements, and this finally implies the H-theorem (btw. together with the standard Shannon-Jaynes-von Neumann definition of the entropy).
 
  • #21
vanhees71 said:
The great discovery by Maxwell, Boltzmann, et al. in the 19th century was that thermodynamics can be derived from the fundamental laws of dynamics by statistical methods, and the maximum-entropy principle follows from the fundamental laws. Why should you "just define" a probability distribution, if you can derive it from the fundamental laws of nature?
Perhaps the microstate of the universe is not available in any remotely practical way? It's an abstraction only.

And there is only one universe - no one can "prepare" it.
vanhees71 said:
Physics is not politics, where you tweak and twist the statistics of all kinds of things so as to get out what you want to "sell" to the people, but a method to figure out how things really are ;-)).
There is no external judge for agents. They can only assume that if they are able to coexist with fellow agents, they can't be too far off.

Indeed, the max-ent principle is effectively a tautology, but the problem is that the space of distributions needs to be defined first. Where are the probability spaces in nature? For me there is only one motivated place: the microstate of the agent/observer. And this turns the max-ent principle into a subjective thing, one that also unifies more naturally with the principle of least action if you think about the agent's biased actions.

But as I interpret Neumaier's thermal interpretation, it is a conservative solution.

/Fredrik
 
  • #22
vanhees71 said:
The grand-canonical ensemble of course follows from the maximum-entropy principle. How else would you derive it? I have no clue what you have against the maximum-entropy principle, which is the most natural road to statistical mechanics through the H-theorem.
I have nothing against using it where it applies. (That it gives unphysical results when used where it does not apply is demonstrated in Section 10.7 of my online book.)

But I find it important to point out equivalent alternatives that I find much simpler (especially when one wants to do things rigorously).

One can
  • either postulate the microcanonical ensemble and then derive the grand-canonical ensemble from the maximum-entropy principle. This is the traditional route taken.
  • or postulate the grand canonical ensemble, and then one needs neither the microcanonical ensemble nor the maximum-entropy principle. Indeed, the latter can then be proved for very special boundary conditions (which are almost never satisfied in practice). See Chapter 10 of my online book.
 
  • #23
Why should one postulate something that can be derived? I find it not very intuitive to simply guess some probabilities.

I find it irritating to claim that the maximum entropy principle is "almost never satisfied in practice". Experience shows the contrary, i.e., the validity of the H-theorem and the 2nd law of thermodynamics.
 
  • #24
vanhees71 said:
the maximum-entropy principle follows from the fundamental laws.
No. The maximum entropy principle must be imposed in addition to the fundamental laws to get the coarse-grained descriptions by the traditional route.
vanhees71 said:
Why should you "just define" a probability distribution, if you can derive it from the fundamental laws of nature?
You ''just define'' another probability distribution, namely that of the microcanonical ensemble, which is input to your application of the maximum entropy principle.
vanhees71 said:
The H-theorem follows from the unitarity of the time evolution
No. The H-theorem implies dissipation, whereas the unitarity of the time evolution forbids dissipation.
vanhees71 said:
Why should one postulate something that can be derived?
You postulate the microcanonical ensemble and the maximum entropy principle, and derive the grand canonical ensemble.

I postulate the grand canonical ensemble, and derive the maximum entropy principle in its correct physical context (namely for the thermodynamic boundary conditions where entropy actually increases).
vanhees71 said:
I find it irritating to claim that the maximum entropy principle were "almost never satisfied in practice". Experience shows the contrary, i.e., the validity of the H-theorem and the 2nd law of thermodynamics.
In many chemical processes observed in Nature entropy decreases, and the maximum entropy principle is misleading. It is not a universally valid principle!!!

This makes my postulates superior to yours, which cannot tell when the maximum entropy principle is applicable and when not.
 
  • #25
A. Neumaier said:
No. The maximum entropy principle must be imposed in addition to the fundamental laws to get the coarse-grained descriptions by the traditional route.
The maximum entropy principle follows from the H-theorem, which can be derived. The most tricky assumption is of course the "molecular-chaos assumption" in deriving the Boltzmann equation.
A. Neumaier said:
You ''just define'' another probability distribution, namely that of the microcanonical ensemble, which is input to your application of the maximum entropy principle.
The microcanonical distribution, i.e., the equal probability of the energy eigenstates we discuss here, of course also follows from the maximum-entropy principle, as shown above in #2.
A. Neumaier said:
No. The H-theorem implies dissipation, whereas the unitarity of the time evolution forbids dissipation.
Obviously we don't discuss the same thing. I'm talking about the derivation of the Boltzmann equation, which of course implies that there's dissipation through "coarse graining" and cutting the BBGKY hierarchy.
A. Neumaier said:
You postulate the microcanonical ensemble and the maximum entropy principle, and derive the grand canonical ensemble.

I postulate the grand canonical ensemble, and derive the maximum entropy principle in its correct physical context (namely for the boundary conditions where in chemistry entropy actually increases).

In many chemical processes observed in Nature entropy decreases, and the maximum entropy principle is misleading. It is not a universally valid principle!!!
I don't understand this statement. Obviously, I don't know what you are referring to.
A. Neumaier said:
This makes my postulates superior to yours, which cannot tell when the maximum entropy principle is applicable and when not.
I've no clue where the maximum-entropy principle should fail in standard statistical physics and applications to chemistry.
 
  • #26
vanhees71 said:
The most tricky assumption is of course the "molecular-chaos assumption" in deriving the Boltzmann equation.
... which cannot be proved but must be assumed. This is the trick involved. Thus the H-theorem does not follow from the fundamental laws alone but only from these laws plus the "molecular-chaos assumption"!
vanhees71 said:
The microcanonical distribution, i.e., the equal probability of the energy eigenstates we discuss here, of course also follows from the maximum-entropy principle, as shown above in #2.
No. It is imposed in defining the reference probability distribution with respect to which the maximum entropy is taken.
vanhees71 said:
Obviously we don't discuss the same thing. I'm talking about the derivation of the Boltzmann equation, which of course implies that there's dissipation through "coarse graining" and cutting the BBGKY hierarchy.
... plus the "molecular-chaos assumption"! This turns conservation into dissipation and cannot be derived but must be assumed! In unitary quantum mechanics, the entropy remains constant.
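A quick numerical check of that last claim (my sketch; the mixed state and the unitary are randomly generated examples): the von Neumann entropy ##S=-\mathrm{Tr}(\hat{\rho}\ln\hat{\rho})## is unchanged under ##\hat{\rho} \to \hat{U}\hat{\rho}\hat{U}^{\dagger}##.

```python
# Von Neumann entropy S = -Tr(rho ln rho) is invariant under the
# unitary evolution rho -> U rho U^dagger.
import numpy as np
from scipy.linalg import logm
from scipy.stats import unitary_group

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4)) + 1j * rng.standard_normal((4, 4))
rho = A @ A.conj().T          # positive Hermitian matrix ...
rho /= np.trace(rho).real     # ... normalized to a density matrix

def entropy(r):
    return -np.trace(r @ logm(r)).real

U = unitary_group.rvs(4, random_state=0)  # random unitary "time evolution"
rho_t = U @ rho @ U.conj().T

print(entropy(rho), entropy(rho_t))  # agree up to numerical round-off
```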
vanhees71 said:
I don't understand this statement. Obviously, I don't know what you are referring to.

I've no clue where the maximum-entropy principle should fail in standard statistical physics and applications to chemistry.
Take 10 random chemical reactions at room temperature and atmospheric pressure, and work out in each case the entropy balance according to the rules of equilibrium thermodynamics. In about half of the cases, the balance is negative.

Life would be impossible if entropy always increased!
 
  • #27
Of course, if you consider non-closed systems, the entropy of this system is not necessarily increasing, but the entropy as a whole is increasing. That's the resolution of, e.g., the Maxwell-demon paradox and its quantum relatives. There are experiments with semiconductor quantum dots, cavities, superconducting qubits, etc., demonstrating the correctness of the information-theoretical interpretation of (thermodynamic) entropy, i.e., that the information content of a qubit is ##k_{\text{B}} \ln 2##:

https://doi.org/10.1073/pnas.1704827114
https://doi.org/10.1103/PhysRevLett.115.260602
 
  • #28
vanhees71 said:
if you consider non-closed systems, the entropy of this system is not necessarily increasing, but the entropy as a whole is increasing.
But the only whole is the universe, as we know from decoherence theory. And whether entropy can be sensibly defined for the whole universe is questionable, as it is very likely infinite.

Since every bounded system that we can study is non-closed, the entropy of every such system is not necessarily increasing.
 
  • #29
Of course, you are right in principle. If we could use the physical laws about closed systems only in application to the "whole universe" (which is unobservable anyway), we'd never have found these laws as we know them. Maybe we would have figured out something completely different, since there would have been no approximately closed systems to observe.
 
  • #30
vanhees71 said:
It's very fundamental. Of course, there's the caveat that the interactions must fall off quickly enough. The standard Boltzmann-Gibbs statistics thus does not apply to gravitating systems, and that's why there is structure in the universe.
It's my understanding that there are efforts underway to generate even the effects of gravity in a maximum-entropy kind of way, as though mass distributions make it more likely for spacetime to be nonuniform rather than uniform. I don't know much about those efforts, but the general idea seems to be to end up with a description of physics that contains only one law: what happens in complex systems is whatever can happen in a vastly larger number of ways. So I agree with your general point that this is a "fundamental" kind of truth to seek-- we would like to be able to reduce physics to nothing more than counting possibilities, and see which classes of behavior contain by far the most accessible members.

It's ironic, for sure-- going from Newton's picture that everything happens for a reason (i.e., some causative force law) to saying there aren't reasons like that: the force laws themselves are just the things that happen in the greatest number of ways, and hence the universe is merely seeking equilibria that maximize entropy within the constraints. This, I believe, was Boltzmann's huge insight, though we must bear in mind that it did little good for his fragile mental health! It can lead one to ask "what is the point," but of course, the point is that sometimes a maximum entropy state can be one that generates great triumphs in art, music, and sentient experience.
 
