Hi everyone, I have a few questions about what I have read/heard regarding the Boltzmann and Gibbs definitions of entropy. I also suspect I have some misconceptions, so I'll write out what I think I know and ask my questions along the way, in the hope that someone can correct me. Thanks in advance!
Here is what I think I know about entropy:
For a microcanonical ensemble at thermal equilibrium with fixed energy E, the entropy is given by ##S_B = k_B \log \Omega##, where ##\Omega## is the number of microstates associated with the fixed macrostate of energy E. However, systems are rarely isolated and are usually in contact with some sort of heat bath. Using the canonical ensemble (a system in contact with a heat bath), we can calculate the entropy while taking the energy fluctuations into account. The total entropy of the system is then
$$S_{tot} = k_B \log N$$
where N is the total number of accessible microstates. However, this quantity is not measurable, because we cannot measure the entropy associated with the freedom to occupy microstates within each macrostate. That contribution, the mean macrostate entropy, is defined by the following expression
$$S_{micro} = \sum_{i} S_i P_i$$
where ##S_i## is the entropy associated with macrostate ##i## and ##P_i## is the probability that the system is found in that macrostate. The actual measurable quantity is the Gibbs entropy, defined as the total entropy less ##S_{micro}##:
$$S_{G} = S_{tot} - S_{micro} = - k_B \sum_i P_i \log P_i$$
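To make the decomposition concrete, here is a minimal numerical sketch (my own toy example, not from any reference): it assumes each macrostate ##i## contains ##\Omega_i## equally likely microstates, so that ##P_i = \Omega_i/N##, and checks that ##S_{tot} - S_{micro}## reproduces ##-k_B \sum_i P_i \log P_i##:

```python
# Toy check of S_G = S_tot - S_micro = -k_B * sum_i P_i log P_i,
# assuming macrostate i contains Omega_i equally likely microstates,
# so P_i = Omega_i / N with N the total number of accessible microstates.
import math

k_B = 1.0  # work in units where k_B = 1

Omega = [2, 8, 40, 10]          # hypothetical microstate counts per macrostate
N = sum(Omega)                  # total number of accessible microstates
P = [w / N for w in Omega]      # probability of each macrostate

S_tot = k_B * math.log(N)                       # S_tot = k_B log N
S_i = [k_B * math.log(w) for w in Omega]        # S_i = k_B log Omega_i
S_micro = sum(s * p for s, p in zip(S_i, P))    # mean macrostate entropy
S_G = -k_B * sum(p * math.log(p) for p in P)    # Gibbs entropy

print(S_tot - S_micro, S_G)
assert math.isclose(S_tot - S_micro, S_G)       # the two expressions agree
```

The identity holds because ##\log P_i = \log \Omega_i - \log N## under the equal-probability assumption, so averaging over ##P_i## splits ##S_G## into the two terms above.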
My questions are:
1. Did I get the description of the various definitions of entropy right?
2. Why are ##S_B##, ##S_{tot}## and ##S_{micro}## considered unmeasurable quantities? And what does it mean to "measure" entropy?
3. Why, and how, are we able to measure the quantity ##S_{G}##?
4. Is ##S_{G}## also maximised (subject to appropriate constraints) at thermal equilibrium?
5. Temperature is defined by ##1/T = \partial _E S_B## in the microcanonical ensemble. Can I say something similar for the Gibbs entropy, i.e. ##1/T \propto \partial _E S_G##?
Many thanks!