What Are the Differences Between Boltzmann and Gibbs Entropies?

In summary: the Boltzmann entropy ##S_B = k_B \log \Omega## is the special case of the Gibbs entropy ##S_G = -k_B \sum_i P_i \log P_i## in which every microstate is equally likely; the relation ##S_G = S_{tot} - S_{micro}## proposed in the opening post is examined in the discussion below.
  • #1
WWCY
Hi everyone, I have a few questions I'd like to ask regarding what I have read/heard about these two definitions of entropy. I also believe that I have some misconceptions about entropy and as such I'll write out what I know while asking the questions in the hope someone can correct me. Thanks in advance!

Here is what I think I know about entropy:

For a microcanonical ensemble at thermal equilibrium with fixed energy ##E##, the entropy is given by ##S_B = k_B \log \Omega##, where ##\Omega## is the number of microstates associated with the fixed macrostate of energy ##E##. However, systems are rarely isolated and are usually in contact with some sort of heat bath. If we use the idea of a canonical ensemble (a system in contact with a heat bath), we are able to calculate the entropy while taking into account the energy fluctuations. The total entropy of the system is then
$$S_{tot} = k_B \log N$$
where ##N## is the total number of accessible microstates. However, this quantity is not measurable, because we cannot measure the entropy associated with the freedom to occupy microstates within a given macrostate. That contribution is defined by the expression
$$S_{micro} = \sum_{i} S_i P_i$$
where ##S_i## is the entropy associated with macrostate ##i## and ##P_i## is the probability that the system is found in that macrostate. The actual measurable quantity is the Gibbs entropy, defined as the total entropy less ##S_{micro}##. This is given by the expression
$$S_{G} = S_{tot} - S_{micro} = - k_B \sum_i P_i \log P_i$$
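For concreteness, here is a minimal numerical sketch of the two formulas above (the four-state system and its probabilities are hypothetical, chosen purely for illustration):

```python
import numpy as np

k_B = 1.0  # work in units where k_B = 1

# Hypothetical macrostate with Omega = 4 accessible microstates.
Omega = 4

# Boltzmann entropy: all microstates assumed equally likely.
S_B = k_B * np.log(Omega)

# Gibbs entropy for a hypothetical non-uniform distribution over the same states.
P = np.array([0.4, 0.3, 0.2, 0.1])
S_G = -k_B * np.sum(P * np.log(P))

print(S_B)  # ln(4) ≈ 1.386
print(S_G)  # ≈ 1.280; equals S_B only when all P_i = 1/Omega
```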

My questions are:

1. Did I get the description of the various definitions of entropy right?
2. Why are ##S_B##, ##S_{tot}## and ##S_{micro}## considered unmeasurable quantities? And what does it mean to "measure" entropy?
3. Why, and how are we able to measure the quantity ##S_{G}##?
4. Does ##S_{G}## also tend to be maximised (subject to appropriate constraints) at thermal equilibrium?
5. Temperature is defined by ##1/T = \partial_E S_B## in the case of a microcanonical ensemble. Can I say something similar for the Gibbs entropy, i.e. ##1/T \propto \partial_E S_G##?

Many thanks!
 
  • #2
Since nobody else has responded, I will give my two cents. I don't feel that the connection between ##S_G##, ##S_{tot}##, and ##S_{micro}## is quite right.

The Gibbs entropy is, I think, a generalization of ##S_B##. The latter is the special case where ##p_i = \frac{1}{\Omega}## (all microstates are equally likely).
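Spelling that special case out: with ##p_i = \frac{1}{\Omega}## for each of the ##\Omega## microstates,
$$S_G = -k_B \sum_{i=1}^{\Omega} \frac{1}{\Omega} \log \frac{1}{\Omega} = k_B \log \Omega = S_B.$$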

You can't define a temperature for an arbitrary probability distribution using ##S_G##, because ##S_G## is not necessarily a function of ##E##, unless you explicitly say how the probabilities ##p_i## depend on energy.

I'm not sure where your expression for ##S_{micro}## is coming from.
 
  • #3
stevendaryl said:
Since nobody else has responded, I will give my two cents. I don't feel that the connection between ##S_G##, ##S_{tot}##, and ##S_{micro}## is quite right.

Do you mind elaborating on what is "not quite right"?

stevendaryl said:
The Gibbs entropy is, I think, a generalization of ##S_B##

Why would it be considered a generalisation though? Wasn't the ##S_G## derived with a canonical ensemble in mind, while the ##S_B## was derived with the assumption that the system was at fixed ##E##?

As for the expression for ##S_{micro}##, I'll try to get back to you on that the moment I get access to the text I was referencing (Concepts in Thermal Physics, Blundell).

Thank you for your assistance!
 
  • #4
WWCY said:
Do you mind elaborating on what is "not quite right"?

Just that I don't think it's correct to say that ##S_{G} = S_{tot} - S_{micro}##.

Why would it be considered a generalisation though? Wasn't the ##S_G## derived with a canonical ensemble in mind, while the ##S_B## was derived with the assumption that the system was at fixed ##E##?

The definition ##S_B = k \log \Omega## is the special case of ##S_G = - k \sum_j P_j \log P_j## when ##P_j = \frac{1}{\Omega}##. That's the probability distribution in which every state with the same energy is considered equally likely.
 
  • #5
I'm with stevendaryl in not understanding where the expression ##S_{micro}## in the original post comes from. To elaborate a little on the Gibbs formula (the following is mostly my own restatements of stevendaryl's points):

The microcanonical ensemble is just one particular choice of ensemble, and the definition of entropy there, ##S = k_B \log \Omega## (##\Omega## is the number of microstates), is unique to that ensemble.

But one is free to consider different ensembles, like the canonical or grand canonical ensemble, and to compute the associated entropy there. The usefulness of the Gibbs entropy formula is that it reduces to the correct expression for the entropy in every ensemble. It is a totally general definition of an "entropy" for a probability distribution, and is more fundamental than the expression in any particular ensemble.
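As a concrete illustration of that last point, here is a minimal numerical sketch (the energy levels and temperature are hypothetical, not taken from any post above): for canonical weights ##p_i = e^{-E_i/k_B T}/Z##, the Gibbs formula reproduces the thermodynamic entropy ##S = (\langle E \rangle - F)/T##, where ##F = -k_B T \log Z## is the Helmholtz free energy.

```python
import numpy as np

k_B = 1.0
T = 2.0                               # hypothetical temperature
E = np.array([0.0, 1.0, 3.0, 4.5])    # hypothetical energy levels

beta = 1.0 / (k_B * T)
Z = np.sum(np.exp(-beta * E))         # canonical partition function
p = np.exp(-beta * E) / Z             # Boltzmann weights

S_gibbs = -k_B * np.sum(p * np.log(p))   # Gibbs entropy formula
E_mean = np.sum(p * E)                   # average energy <E>
F = -k_B * T * np.log(Z)                 # Helmholtz free energy
S_thermo = (E_mean - F) / T              # thermodynamic entropy

print(np.isclose(S_gibbs, S_thermo))     # True: the two expressions agree
```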
 
  • #6
Thanks for the responses!

I can see that the Gibbs formula is capable of describing more general situations, but one thing I can't understand is this

stevendaryl said:
The definition ##S_B = k \log \Omega## is the special case of ##S_G = - k \sum_j P_j \log P_j## when ##P_j = \frac{1}{\Omega}##. That's the probability distribution in which every state with the same energy is considered equally likely.

##P_j## in general describes the probability of the system occupying macrostate ##j##. If, say, we label these macrostates by energies ##E_j## that are not equal, we can't say that the equation ##S_G = - k \sum_j P_j \log P_j = k \log \Omega## describes the entropy associated with a given macrostate with fixed energy, can we? It seems to me that this only "looks" like the Boltzmann entropy, though it actually describes a system with equally probable macrostates rather than microstates.
 
  • #7
WWCY said:
##P_j## in general describes the probability of the system occupying macrostate ##j##.

Well, in my comment, I was talking about the probability of the system being in a particular MICROSTATE. The assumption (or maybe it's true by definition) behind the formula ##S = k \ln \Omega## is that you fix E, V, N (the total energy, volume, and number of particles). Then the assumption is that every microstate with that same E, V, N is equally likely. Then the probability of being in a particular state ##j## is just ##\frac{1}{\Omega}##, where ##\Omega## is the number of states with that given E, V, N. In that case: ##\sum_j p_j \ln(\frac{1}{p_j}) = \sum_j \frac{1}{\Omega} \ln(\Omega) = \ln(\Omega)##, which multiplied by ##k## gives ##S = k \ln \Omega##.

In that analysis, the probability of being in a particular MACROSTATE is 1 (since the macrostate is defined by E, V, N).

I guess you could also talk about the entropy of a situation where there is a probability ##P_j## of being in a particular macrostate ##j##, as well. In that case, the entropy is ... okay, I think I understand where your ##S_{micro}## is coming from! Sorry.

Let ##P(j)## be the probability of being in macrostate ##j##. Let ##P(\mu | j)## be the conditional probability of being in microstate ##\mu##, given that the system is in macrostate ##j##. Then the probability of being in macrostate ##j## and microstate ##\mu## is given by:

##P(j) P(\mu | j)##

The associated entropy is given by:

##S = k \sum_j \sum_\mu P(j) P(\mu | j) \log(\frac{1}{P(j) P(\mu | j)}) ##
##= k \sum_j \sum_\mu P(j) P(\mu | j) \log(\frac{1}{P(j)}) + k \sum_j \sum_\mu P(j) P(\mu | j) \log(\frac{1}{P(\mu | j)})##

(where I've used the property of logarithms that ##\log(\frac{1}{XY}) = \log(\frac{1}{X}) + \log(\frac{1}{Y})##)

For the first term on the right-hand side, the sum over ##\mu## can be done to give 1 (since ##\sum_\mu P(\mu | j) = 1##).
For the second term on the right-hand side, we can write ##S_j = k \sum_\mu P(\mu | j) \log(\frac{1}{P(\mu | j)})##. So the expression simplifies to:

##S = k \sum_j P(j) \log(\frac{1}{P(j)}) + \sum_j P(j) S_j##
## = S_{macro} + S_{micro}##

So I now understand your original post. Sorry for taking so long.

But this is not ##S_G = S_{macro} + S_{micro}##. All three are using the formula ##S = k \sum_j P_j \log(\frac{1}{P_j})##, but for different probabilities.
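The decomposition derived above is easy to check numerically. A minimal sketch with a hypothetical two-macrostate distribution (all numbers invented purely for illustration):

```python
import numpy as np

k = 1.0

# Hypothetical macrostate probabilities P(j) and conditional microstate
# probabilities P(mu | j) within each macrostate.
P_macro = np.array([0.7, 0.3])
P_cond = [np.array([0.5, 0.3, 0.2]),   # microstates of macrostate 0
          np.array([0.6, 0.4])]        # microstates of macrostate 1

def entropy(p):
    """Gibbs formula S = k * sum_i p_i * log(1/p_i) for one distribution."""
    return k * np.sum(p * np.log(1.0 / p))

# Entropy of the joint distribution P(j, mu) = P(j) * P(mu | j).
P_joint = np.concatenate([Pj * cond for Pj, cond in zip(P_macro, P_cond)])
S_total = entropy(P_joint)

# Decomposition from the post: S_total = S_macro + sum_j P(j) * S_j.
S_macro = entropy(P_macro)
S_micro = np.sum(P_macro * np.array([entropy(c) for c in P_cond]))

print(np.isclose(S_total, S_macro + S_micro))   # True
```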

WWCY said:
If, say, we label these macrostates by energies ##E_j## that are not equal, we can't say that the equation ##S_G = - k \sum_j P_j \log P_j = k \log \Omega## describes the entropy associated with a given macrostate with fixed energy, can we? It seems to me that this only "looks" like the Boltzmann entropy, though it actually describes a system with equally probable macrostates rather than microstates.

I think I was confused by the micro versus macro language. If you have a system at constant temperature, then I would say that the macrostate is determined by the temperature, rather than by the energy.

But in a certain sense, you have two levels of micro- versus macro- going on. There is one level where you specify T and let the energy be uncertain. There is another level where you specify E and let the microstate be uncertain.
 

FAQ: What Are the Differences Between Boltzmann and Gibbs Entropies?

1. What is the difference between Boltzmann and Gibbs entropy?

Boltzmann entropy, ##S_B = k \ln \Omega##, assigns an entropy to a macrostate by counting the microstates ##\Omega## compatible with it, assuming all of them are equally likely. Gibbs entropy, ##S_G = -k \sum_i p_i \ln p_i##, applies to any probability distribution over microstates and reduces to the Boltzmann form when every microstate is equally probable.

2. How do Boltzmann and Gibbs entropy relate to thermodynamics?

Both Boltzmann and Gibbs entropy are important concepts in thermodynamics: they give statistical-mechanical definitions of the thermodynamic entropy, quantify the number of microscopic configurations consistent with a macroscopic state, and are used to calculate the change in entropy in a thermodynamic process.

3. Can Boltzmann and Gibbs entropy be negative?

For a discrete set of microstates, neither quantity can be negative: ##\Omega \geq 1## gives ##S_B \geq 0##, and ##-k \sum_i p_i \ln p_i \geq 0## for any probability distribution. What can be negative is an entropy change, for example when a subsystem gives up heat to its surroundings; the differential entropy of a continuous distribution can also be negative.

4. How is Boltzmann and Gibbs entropy calculated?

Boltzmann entropy is calculated using the formula ##S_B = k \ln \Omega##, where ##k## is the Boltzmann constant and ##\Omega## is the number of microstates compatible with the macrostate. Gibbs entropy is calculated using the formula ##S_G = -k \sum_i p_i \ln p_i##, where ##p_i## is the probability of microstate ##i## and the sum runs over all microstates.

5. What are some real-world applications of Boltzmann and Gibbs entropy?

Boltzmann and Gibbs entropy are used in various fields such as physics, chemistry, and biology to understand and predict the behavior of complex systems. They are also used in information theory and statistical mechanics to study the properties of systems with a large number of particles.
