Boltzmann interpretation of entropy

In summary: to make sense of entropy, we need to look at it from a statistical perspective. In statistical mechanics we are typically interested in how systems behave over time. For example, what happens when a gas is heated up: do the molecules start moving around more randomly, or stick more closely to each other? Answering that question (which is actually a fairly complex one, involving statistical thermodynamics and the Second Law of Thermodynamics) involves looking at the entropy of the gas as a function of temperature. In short, entropy is a measure of the "disorder" of a system: the logarithm of the number of microstates compatible with its macroscopic state.
  • #1
oddiseas
We have been learning statistical physics for the last month and the lecturer still has not explained what entropy is, other than to say "this is the Boltzmann interpretation of entropy".
All the info I find on the web seems to say different things for different situations, so I was hoping someone could explain this concept to me, because I still don't have a clue what it is, and most of the calculations we are doing involve entropy.
 
  • #2
diazona
Well, entropy is kind of a tricky thing to really understand. But to get you started: it's a measure of the "disorder" of a system. What that really means is that entropy is related to the number of configurations a system could have, given its energy, temperature, pressure, and volume.

Very artificial example: two coins sitting on a table. Each one can be heads up or tails up, so there are a total of four possibilities (HH, HT, TH, TT) for the two coins. In this case the system's entropy would be basically log(4). (Of course the coins could also be standing on their side, or spinning, but that would require more energy, so those states don't count towards the entropy.) If you had three coins, there would be 8 possible states with the same energy, so the entropy would be basically log(8). And so on.

Realistic example: you have a balloon filled with air. The balloon, along with the air inside it, has a certain temperature, a certain pressure, a certain volume; all of these things can be measured. For a gas, a "state" is defined by the positions, momenta, and angular momenta of all the individual particles (molecules). Obviously there are a large* number of ways the molecules could arrange themselves inside the balloon, and a large* number of possibilities for how they could be moving in there, so there are a very large* number of possible states of the gas. You could (theoretically) calculate the temperature, pressure, and volume (and energy) that would be produced by each of those states, and throw out all the ones that don't match the measured properties of the balloon, and count how many are left. That number is called the multiplicity, denoted Ω. And the entropy is basically log(Ω).

*"large" = "UNIMAGINABLY INCOMPREHENSIBLY HUGE" if you want to get technical :wink:
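A minimal numerical sketch of the coin counting above (assuming all heads/tails configurations have the same energy, and setting Boltzmann's constant to 1):

[code]
import math

def coin_entropy(n_coins):
    # Number of equal-energy configurations: each coin is heads or tails
    omega = 2 ** n_coins
    # Entropy in units of Boltzmann's constant: S = ln(omega)
    return math.log(omega)

for n in (2, 3, 10):
    print(n, "coins:", coin_entropy(n))   # ln 4, ln 8, ln 1024
[/code]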
 
  • #3


Actually, entropy is simple: there is a one-to-one relation between a state's entropy and its probability (provided you have left enough time for the system to cycle through all intermediate states).
 
  • #4


There are two basic, related definitions of entropy. I'm probably about to post a related question myself, so I'm no expert, but I hope you find the following helpful.

The term was first used in the field of classical thermodynamics. Quick summary:
A system doesn't possess "work" or "heat": these are labels that we assign to different methods of transferring energy into or out of a system. This means that if some body undergoes a cyclic process, so that at the end of the process it's in exactly the same state it started out from, then you can add up all the heat that's transferred into and out of it and not get zero; any net energy that flowed into the system as heat might be spent doing work, leaving the body with the same total energy at the end. Mathematically speaking, the integral of the increments dQ around a closed loop is non-zero in general. However, it turns out that the integral [tex]\oint\frac{dQ}{T}=0[/tex] identically, where T is the temperature of the body at which the heat is transferred. This means that we can define a total differential [tex]dS=\frac{dQ}{T}[/tex] of some quantity S, called the entropy, which is a function of the state of a system; in plain English, a system "has" a definite value of this 'entropy' the same way it has a definite energy.
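As a standard worked illustration of that definition (a textbook example, not specific to this thread): for a reversible isothermal expansion of n moles of an ideal gas, the internal energy is unchanged, so dQ = p dV = (nRT/V) dV, and

[tex]\Delta S=\int\frac{dQ}{T}=\int_{V_1}^{V_2}\frac{nR}{V}\,dV=nR\ln\frac{V_2}{V_1}.[/tex]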

I never really understood entropy in a purely thermodynamic context, and still don't. It makes more sense (to me, at least) from the viewpoint of statistical mechanics.

To a first approximation, it's helpful to start out with the idea that diazona posted above: the entropy is the logarithm of the number of "states" of the system, by which we mean "microstates". What you mean by this precisely depends on whether you're talking about classical or quantum mechanics. Classically, a system is specified by a point in phase space (if you've encountered the concept?); quantum mechanically, by a specification of the components of the state vector with respect to some basis (or, at a simpler level, by a wavefunction). So you fix the energy of your system, and see how many ways you can configure the constituents of that system so that in total it has that energy.

At this simple level, however, it doesn't make any sense for a system to "maximise its entropy": the paragraph above counts the total number of ways it's physically possible to share out a given amount of energy among the components of a system, and that's a fixed number. Instead, we have to think about the distinction between a "microstate" (a complete specification of the system) and a "macrostate" (a specification of all the numbers like temperature, pressure, volume etc. that we can actually measure).

To understand the relation between the two, think about a sequence of 100 coin tosses. A complete specification of the sequence would consist of the entire sequence of heads and tails; a "macrostate" would consist of saying how many were heads and how many were tails. If you actually sat and tossed a coin 100 times, the odds are slim that you'd get 100 heads, even though this sequence is no less likely than any given sequence HTHTHTTH...HT that contains 50 heads. The odds are extremely high, in fact, that you'd get 50 heads, plus or minus a few, simply because there are more of those sequences. The essential point is that you're picking out the macrostate by looking for the one that corresponds to the greatest number of microstates. The number of microstates corresponding to that macrostate is what diazona called the multiplicity, the number [tex]\Omega[/tex] whose logarithm is the entropy.
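A quick numerical sketch of that counting argument (taking Boltzmann's constant as 1):

[code]
from math import comb, log

N = 100  # number of coin tosses

# Multiplicity of the macrostate "n heads out of N tosses"
def multiplicity(n):
    return comb(N, n)

print(multiplicity(100))       # 1: only one sequence is all heads
print(multiplicity(50))        # ~1.01e29 sequences have exactly 50 heads
print(log(multiplicity(50)))   # entropy of the 50-heads macrostate
[/code]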

Finally, it's worthwhile commenting on why we take the logarithm. It means that if we have two systems, with multiplicities [tex]\Omega_1[/tex] and [tex]\Omega_2[/tex], then the multiplicity of the combined system is [tex]\Omega_1\cdot \Omega_2[/tex], so that the total entropy is the sum of the entropies of the two systems.
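Spelled out: since independent multiplicities multiply, the logarithm turns the product into a sum,

[tex]S_{12}=\ln(\Omega_1\Omega_2)=\ln\Omega_1+\ln\Omega_2=S_1+S_2.[/tex]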

Sorry for the lengthy post, but I hope it helped!
 
  • #5
WHT
Informational entropy is a mathematical book-keeping approach that allows one to reason about the amount of disorder in the system. As stated this is closely related to statistical mechanics and Boltzmann.

Thermodynamic entropy is a way of evaluating how much of some energetic behavior adds to the disorder based on notions of the temperature of the system. This reduces statistical mechanics to some simple rules that mechanical engineers and others can apply.

I agree that the latter is much harder to wrap one's mind around. Look up the publicly available works of E.T. Jaynes on entropy as he tries to reconcile the two approaches, and bridges the gap to uses of entropy in other disciplines, such as Shannon's information theory.
 
  • #6
ClamShell
WHT said:
Informational entropy is a mathematical book-keeping approach that allows one to reason about the amount of disorder in the system. As stated this is closely related to statistical mechanics and Boltzmann.

Thermodynamic entropy is a way of evaluating how much of some energetic behavior adds to the disorder based on notions of the temperature of the system. This reduces statistical mechanics to some simple rules that mechanical engineers and others can apply.

I agree that the latter is much harder to wrap one's mind around. Look up the publicly available works of E.T. Jaynes on entropy as he tries to reconcile the two approaches, and bridges the gap to uses of entropy in other disciplines, such as Shannon's information theory.

Shannon's entropy is:

[tex]S = -\sum_{\text{states}} P_{\text{state}}\,\ln P_{\text{state}}[/tex]

Boltzmann's entropy is:

[tex]S = k_B \ln(\text{number of states})[/tex]


And Shannon's reduces to Boltzmann's for a uniform probability distribution over the states... is that not so? I.e., Shannon's is an extension of Boltzmann's to non-uniform probability distributions. This seems to be the connection.
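A quick numerical check of that reduction (setting k_B = 1 so both formulas are in the same units):

[code]
import math

omega = 8                       # number of accessible states
p = [1.0 / omega] * omega       # uniform distribution over those states

shannon = -sum(pi * math.log(pi) for pi in p)   # -sum p ln p
boltzmann = math.log(omega)                     # ln(number of states)

print(shannon, boltzmann)       # both equal ln 8 = 2.079...
[/code]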
 
  • #7
Gerenuk
ClamShell said:
And Shannon's reduces to Boltzmann's for a uniform probability distribution over the states... is that not so? I.e., Shannon's is an extension of Boltzmann's to non-uniform probability distributions. This seems to be the connection.
I thought the same, but when I looked at the details of the definitions, I found out that this notion is conceptually wrong.
It's because Boltzmann entropy is the logarithm of [the number of microstates within one particular macrostate].

So if you have macrostates [itex]i=1\dots N[/itex] and each macrostate has [itex]O_i[/itex] microstates, then the entropy of one particular macrostate is [itex]S_i=\ln(O_i)[/itex].
The probabilities are [itex]p_i=O_i/O[/itex], where [itex]O=\sum_i O_i[/itex].

What you were probably thinking of is that for a uniform distribution over the microstates, [itex]p=1/O[/itex] and therefore [itex]S=\ln O[/itex]; but you see, that's not the definition of Boltzmann entropy, and [itex]O[/itex] and [itex]O_i[/itex] are different things. So this "proof" doesn't make sense.
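A toy illustration of that distinction, using a hypothetical system of 4 coins whose macrostates are labelled by the number of heads:

[code]
from math import comb, log

# O_i = number of microstates in macrostate i (= i heads out of 4 coins)
O_i = {i: comb(4, i) for i in range(5)}   # 1, 4, 6, 4, 1
O = sum(O_i.values())                      # total number of microstates: 16

S_2 = log(O_i[2])      # Boltzmann entropy of the macrostate "2 heads": ln 6
S_uniform = log(O)     # what the uniform-p argument gives instead: ln 16

print(S_2, S_uniform)  # 1.79... vs 2.77... -- O_i and O really are different things
[/code]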

In fact, it's the other way round! The Shannon entropy is a special case of the Boltzmann entropy, if you restrict yourself to multinomial distributions.
 
  • #8


Gerenuk said:
In fact, it's the other way round! The Shannon entropy is a special case of the Boltzmann entropy, if you restrict yourself to multinomial distributions.

As defined, the Boltzmann entropy can't really be maximized, as it keeps growing with additional states. The utility of Shannon's form is that you can apply constraints to the probability distribution in terms of moments and limits, and out pops the Maximum Entropy Principle. This allows you to reason about all sorts of physics, which is the point that Jaynes tried to address.

Perhaps Boltzmann entropy is just too general to be practically useful?
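For reference, here is a sketch of the maximum-entropy step being alluded to, with a normalization constraint and one moment constraint (mean energy); this is the standard Lagrange-multiplier route, not anything specific to this thread:

[tex]\mathcal{L}=-\sum_i p_i\ln p_i-\alpha\Big(\sum_i p_i-1\Big)-\beta\Big(\sum_i p_iE_i-\langle E\rangle\Big)[/tex]

[tex]\frac{\partial\mathcal{L}}{\partial p_i}=-\ln p_i-1-\alpha-\beta E_i=0\quad\Rightarrow\quad p_i=\frac{e^{-\beta E_i}}{Z},\qquad Z=\sum_i e^{-\beta E_i}.[/tex]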
 
  • #9
Tomsk
Gerenuk said:
I thought the same, but when I looked at the details of the definitions, I found out that this notion is conceptually wrong.

I'm not sure that's right. The index i in Gibbs' entropy formula indexes the microstates compatible with a macrostate, not macrostates themselves.
 
  • #10
Gerenuk
Tomsk said:
I'm not sure that's right. The index i in Gibbs' entropy formula indexes the microstates compatible with a macrostate, not macrostates themselves.
So what do you call microstate and what do you call macrostate then? Maybe I'm mixing up the names. Let's take an example:
A gas has a certain volume V (macrostate) and each volume has many different realizations (microstates). I use the equation to calculate the entropy for one particular volume V.

So do you sum over realizations for one volume or do you sum over all volumes? In the latter case your result does not depend on volume, so what is it good for?
 
  • #11
WHT
Gerenuk said:
So do you sum over realizations for one volume or do you sum over all volumes? In the latter case your result does not depend on volume, so what is it good for?

Isn't that one of the fundamental issues? We want entropy to be an intensive property of matter but the definitions allow it to show extensive traits. I think that is one of the reasons that some scientists are fiddling with other nonextensive formulations such as Tsallis entropy.
 
  • #12
Gerenuk
WHT said:
Isn't that one of the fundamental issues? We want entropy to be an intensive property of matter but the definitions allow it to show extensive traits.
Why do we want it to be intensive? It should rather be extensive. The corresponding intensive quantity is temperature.

WHT said:
I think that is one of the reasons that some scientists are fiddling with other nonextensive formulations such as Tsallis entropy.
I'd rather say it's because Shannon entropy is just a special case. But sometimes more general approaches fit experimental data better. In any case, the Boltzmann definition is the most general and so simple that it cannot be wrong.
 
  • #13
Tomsk
Gerenuk said:
Let's take an example:
A gas has a certain volume V (macrostate) and each volume has many different realizations (microstates). I use the equation to calculate the entropy for one particular volume V.
OK that sounds right

So do you sum over realizations for one volume or do you sum over all volumes? In the latter case your result does not depend on volume, so what is it good for?
You'd sum over realizations for one volume. The probabilities are probabilities of finding a particular realization of the volume V, i.e. they all correspond to the same volume. A microstate might be a list of positions of all the particles: {x_1 ... x_N}, then the Gibbs entropy is

[tex]S(V) = -\sum_{\{x_1 ... x_N\}} p(\{x_1 ... x_N\}) \log{ p(\{x_1 ... x_N\}) }[/tex]
Where the sum runs over all possible lists {x_1 ... x_N} such that [itex]vol(\{x_1 ... x_N\})=V[/itex]. Although I'm not even sure you need that, you could just say that if a particular list has [itex]vol(\{x_1 ... x_N\}) \neq V[/itex], then [itex]p(\{x_1 ... x_N\})=0[/itex] and use 0log0 = 0.

The microcanonical ensemble is then [itex]p(\{x_1 ... x_N\}) = 1/O_V[/itex] for all lists with volume V, which gets you back to the Boltzmann entropy.
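Plugging that uniform distribution into the Gibbs formula makes the reduction explicit:

[tex]S(V)=-\sum_{\{x_1\dots x_N\}:\,vol=V}\frac{1}{O_V}\ln\frac{1}{O_V}=\ln O_V.[/tex]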
 
  • #14


Tomsk said:
OK that sounds right


You'd sum over realizations for one volume. The probabilities are probabilities of finding a particular realization of the volume V, i.e. they all correspond to the same volume. A microstate might be a list of positions of all the particles: {x_1 ... x_N}, then the Gibbs entropy is

[tex]S(V) = -\sum_{\{x_1 ... x_N\}} p(\{x_1 ... x_N\}) \log{ p(\{x_1 ... x_N\}) }[/tex]
Where the sum runs over all possible lists {x_1 ... x_N} such that [itex]vol(\{x_1 ... x_N\})=V[/itex]. Although I'm not even sure you need that, you could just say that if a particular list has [itex]vol(\{x_1 ... x_N\}) \neq V[/itex], then [itex]p(\{x_1 ... x_N\})=0[/itex] and use 0log0 = 0.

The microcanonical ensemble is then [itex]p(\{x_1 ... x_N\}) = 1/O_V[/itex] for all lists with volume V, which gets you back to the Boltzmann entropy.

Still looks like Boltzmann's entropy is Shannon's (Gibbs') entropy for a uniform distribution of probabilities... sometimes dyslexia is a good thing.
 
  • #15


Gerenuk said:
Why do we want it to be intensive? It should rather be extensive. The corresponding intensive quantity is temperature.

Entropy composes as
[tex]S(A+B) = S(A) + S(B)[/tex]
which indeed does make it perfectly extensive. Tsallis proposed the composition
[tex]S(A+B) = S(A) + S(B) + (1-q)\,S(A)\,S(B)[/tex]
which makes it nonextensive for [itex]q \neq 1[/itex]; now that I think about it, that is not the same as making it intensive.
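A quick numerical check of that composition rule, assuming the usual Tsallis form [itex]S_q=(1-\sum_i p_i^q)/(q-1)[/itex] (with k = 1), which is not written out in this thread, and two independent subsystems:

[code]
import numpy as np

def tsallis(p, q):
    # Tsallis entropy S_q = (1 - sum p^q) / (q - 1), with k = 1
    return (1.0 - np.sum(p ** q)) / (q - 1.0)

q = 0.7
pA = np.array([0.2, 0.3, 0.5])
pB = np.array([0.6, 0.4])
pAB = np.outer(pA, pB).ravel()   # joint distribution of the independent subsystems

lhs = tsallis(pAB, q)
rhs = tsallis(pA, q) + tsallis(pB, q) + (1 - q) * tsallis(pA, q) * tsallis(pB, q)
print(lhs, rhs)                  # equal (up to rounding): the pseudo-additive rule
[/code]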
 
  • #16


WHT said:
Look up the publicly available works of E.T. Jaynes on entropy as he tries to reconcile the two approaches, and bridges the gap to uses of entropy in other disciplines, such as Shannon's information theory.

Thanks for that. Some of his papers are available from his Wikipedia page; they look to be interesting reading.
 
  • #17


Tomsk said:
You'd sum over realizations for one volume. The probabilities are probabilities of finding a particular realization of the volume V, i.e. they all correspond to the same volume. A microstate might be a list of positions of all the particles: {x_1 ... x_N}, then the Gibbs entropy is

[tex]S(V) = -\sum_{\{x_1 ... x_N\}} p(\{x_1 ... x_N\}) \log{ p(\{x_1 ... x_N\}) }[/tex]
I thought about it for a while, and I think you are using an inappropriate interpretation of p_i.
For example, compare it to the derivation of the Bose distribution. At the beginning one writes [itex]-\sum_i p_i\ln p_i \rightarrow \max[/itex], where p_i is the fraction of particles in state i, considered for *one* particular microstate. That's conceptually something different from what you have for p_i.
With your definition, the Fermi/Bose distribution derivation wouldn't work out?!
 
  • #18


Oh, on second thought, you were actually right. I confused the p_i.

Maybe another way of counting microstates is better. You could say they have a multiplicity proportional to their probabilities. This view would make the second law a simple probabilistic statement.
 
  • #19


Can anyone please explain why

[tex]S=- \ln\sum_{i=1}^N P_i^n[/tex] for some integer n

(for the microcanonical ensemble)? Thanks.
 
  • #20


WHT said:
Informational entropy is a mathematical book-keeping approach that allows one to reason about the amount of disorder in the system. As stated this is closely related to statistical mechanics and Boltzmann.

Thermodynamic entropy is a way of evaluating how much of some energetic behavior adds to the disorder based on notions of the temperature of the system. This reduces statistical mechanics to some simple rules that mechanical engineers and others can apply.

I agree that the latter is much harder to wrap one's mind around. Look up the publicly available works of E.T. Jaynes on entropy as he tries to reconcile the two approaches, and bridges the gap to uses of entropy in other disciplines, such as Shannon's information theory.

See also Sections 7.6 and Section 7.7 of the online book

A. Neumaier and D. Westra,
Classical and Quantum Mechanics via Lie algebras.
http://de.arxiv.org/abs/0810.1019
 

FAQ: Boltzmann interpretation of entropy

What is the Boltzmann interpretation of entropy?

The Boltzmann interpretation of entropy is a statistical approach to understanding the disorder or randomness of a system. It states that the entropy of a system is proportional to the logarithm of the number of microstates the system can be in (S = k ln Ω). This means that as the number of accessible microstates increases, the entropy also increases.

How does the Boltzmann interpretation of entropy relate to the Second Law of Thermodynamics?

The Boltzmann interpretation of entropy provides a statistical explanation for the Second Law of Thermodynamics, which states that the total entropy of an isolated system never decreases over time. Systems tend towards disorder and randomness because the more disordered macrostates correspond to overwhelmingly more microstates, which is exactly what the Boltzmann interpretation makes precise.

What is the significance of the Boltzmann constant in the Boltzmann interpretation of entropy?

The Boltzmann constant (denoted by k) is a fundamental constant in physics that relates the average kinetic energy of particles in a system to its temperature. In the Boltzmann interpretation of entropy, it sets the scale and units of entropy (energy per unit temperature) in the formula S = k ln Ω.

How does the Boltzmann interpretation of entropy apply to different types of systems?

The Boltzmann interpretation of entropy can be applied to a wide range of systems, from gases and liquids to solids and even complex systems like living organisms. In all cases, the entropy of a system is related to the number of possible microstates it can have, and the Second Law of Thermodynamics still holds true.

What are the limitations of the Boltzmann interpretation of entropy?

One limitation of the Boltzmann interpretation is that it is based on statistical probabilities and does not take into account the specific arrangement or interactions of particles within a system. It also does not explain the origin or direction of the Second Law of Thermodynamics, but rather provides a statistical explanation for its effects.
