Connection between the two definitions of entropy

In summary, the canonical ensemble provides the connection you need between the number of states and the thermodynamic entropy.
  • #1
DaTario
Hi All,

I would like to know how one can connect the two definitions of entropy
##\Delta S = \int_{T_i}^{T_f} \frac{dQ}{T} ## and ##\Delta S = k_B \ln (\frac{W_f}{W_i})##,
In particular, I am interested in how the logarithm emerges. Does it have to do with some linear dependence of the heat on ##T##?
DaTario
 
  • #3
DaTario said:
Summary:: Hi All, I would like to know how we can connect the two definitions of entropy
##\Delta S = \int_{T_i}^{T_f} \frac{dQ}{T} ## and ##\Delta S = k_B \ln (\frac{W_f}{W_i})##
I have severe doubts that one can formally connect the Clausius entropy equation to the Boltzmann entropy equation, as the Boltzmann entropy equation actually holds only for systems in the so-called microcanonical ensemble.

EDIT: An association between the Clausius entropy and the results from statistical physics can be found when considering systems in the so-called canonical ensemble.
 
  • #5
As in the replies above, the canonical ensemble provides the connection you need between the number of states ##W## and the thermodynamic entropy ##\int \frac{dQ}{T}##. My personal favorite introduction to the subject is Schrödinger's book "Statistical Thermodynamics". It's cheap (<$10) and short.
 
  • #6
Baluncore said:
It is quite interesting that the thermodynamic definition has an integral with ##T^{-1}## inside while the Boltzmann entropy has a logarithm. I guess the connection is not likely to be so simple. If ##dQ## is always proportional to ##dT##, then the integral yields a log function (worked out just below). But what is the countable physical parameter in ##\int dQ/T##?
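For instance, assuming a constant heat capacity ##C## over the temperature range (so that ##dQ = C\,dT##), the Clausius integral produces the logarithm directly: $$\Delta S = \int_{T_i}^{T_f} \frac{C\,dT}{T} = C \ln \left( \frac{T_f}{T_i} \right)$$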

Perhaps an explanation restricted to the case of a 1 mol sample of an ideal gas would be a very good start.
 
  • #7
DaTario said:
But what is the countable physical parameter in ##\int dQ/T##?
##dQ## is the change in the internal energy of the system due to heating. As the energy of the system is increased, the molecules can begin to occupy higher energy states. For example, as you raise the temperature of a gas, you will find more molecules in high-velocity states than before. Since more states become available (##W## increases, per the Boltzmann definition), the entropy increases. To answer your question, the countable parameter is still ##W##, the number of accessible microstates.
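To put rough numbers on that (a minimal sketch of my own; the nitrogen-like molecular mass and the 800 m/s cutoff are arbitrary illustrative choices), the fraction of molecules above a fixed speed grows quickly with temperature:

[CODE=python]
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

def fraction_above(v0, T, m):
    """Fraction of molecules with speed > v0 in a classical ideal gas.

    Uses the closed-form CDF of the Maxwell-Boltzmann speed distribution,
    F(v) = erf(v/(sqrt(2)*a)) - sqrt(2/pi)*(v/a)*exp(-v^2/(2*a^2)),
    with thermal speed scale a = sqrt(kT/m).
    """
    a = math.sqrt(K_B * T / m)
    cdf = (math.erf(v0 / (math.sqrt(2.0) * a))
           - math.sqrt(2.0 / math.pi) * (v0 / a) * math.exp(-v0**2 / (2.0 * a**2)))
    return 1.0 - cdf

m_n2 = 4.65e-26  # kg, roughly one N2 molecule
v0 = 800.0       # m/s, an arbitrary "high speed" threshold
for T in (300.0, 600.0):
    print(f"T = {T:.0f} K: fraction faster than {v0:.0f} m/s = {fraction_above(v0, T, m_n2):.1%}")
# Doubling T here raises the high-speed fraction from about 7% to about 31%.
[/CODE]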

DaTario said:
Perhaps an explanation restricted to the case of a 1 mol sample of an ideal gas would be a very good start.
The canonical ensemble says that the probability of finding the gas in a microstate with energy ##E## is proportional to ##e^{-E/kT}##. Since ##p(E) \propto e^{-E/kT}## and probabilities have to sum to 1 (##\int p(E)\,dE = 1##), we know that ##p(E) = \frac{1}{Z} e^{-E/kT}##, where ##Z## is the sum of ##e^{-E/kT}## over all microstates. For a single gas particle, the energy is kinetic, ##E = \frac{1}{2} m v^2##, so, counting one state per phase-space cell of volume ##h^3## to make ##Z## dimensionless, $$Z = \frac{m^3}{h^3}\int d^3 x \int d^3 v\, e^{-mv^2 / 2kT} = \frac{m^3 V}{h^3}\int d^3 v\, e^{-mv^2 / 2kT}$$ If you evaluate this Gaussian integral, it gives ##Z = \frac{V}{\lambda^3}##, where ##\lambda = \frac{h}{\sqrt{2\pi m k T}}## is the thermal de Broglie wavelength of that particle. To get the partition function for ##N## particles, take ##Z_N = Z^N = \left( \frac{V}{\lambda^3} \right)^N## (this ignores the ##1/N!## Gibbs factor for indistinguishable particles, which only shifts the additive constant noted below). Using some trickery from the canonical ensemble (##F = -kT \ln Z_N## and ##S = -\partial F / \partial T##), we have $$S = \frac{\partial}{\partial T} (kT \ln Z_N) = k \ln Z_N + kT \frac{\partial \ln Z_N}{\partial T}$$ Since internal energy is given by ##U = -\frac{\partial }{\partial \beta} \ln Z_N = \frac{3}{2} NkT## (with ##\beta = 1/kT##) and since ##\frac{\partial}{\partial \beta} = -kT^2 \frac{\partial}{\partial T}##, we have that $$S = k \ln Z_N + \frac{U}{T} = kN \left[\ln \left( \frac{V}{\lambda^3} \right) + \frac{3}{2} \right] \approx kN \ln \left( \frac{V}{\lambda^3} \right) $$

Notice that this last factor, ##V/\lambda^3##, is essentially the number of de Broglie-wavelength-sized cubes you could stuff into the volume ##V##, and that is essentially a count of microstates.

Note: the different ensembles give slightly different additive constants in the entropy. I forget which is the most accurate, but I would trust the Sackur-Tetrode equation (derived from the microcanonical ensemble).
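For anyone who wants numbers, here is a minimal sketch (mine, not a standard derivation) comparing the ##Z_N = Z^N## result against the Sackur-Tetrode equation for 1 mol of a helium-like ideal gas; the mass, temperature, and volume are illustrative choices:

[CODE=python]
import math

# Exact SI constants
K_B = 1.380649e-23   # Boltzmann constant, J/K
H = 6.62607015e-34   # Planck constant, J*s
N_A = 6.02214076e23  # Avogadro constant, 1/mol

# Illustrative state: 1 mol of a helium-like ideal gas
m = 6.64e-27   # particle mass, kg (roughly one He atom)
T = 300.0      # temperature, K
V = 0.0224     # volume, m^3 (about 1 mol of ideal gas near atmospheric pressure)
N = N_A

# Thermal de Broglie wavelength
lam = H / math.sqrt(2.0 * math.pi * m * K_B * T)

# Canonical result from the post (Z_N = Z^N, no 1/N! factor)
S_canonical = N * K_B * (math.log(V / lam**3) + 1.5)

# Sackur-Tetrode (includes the 1/N! Gibbs factor)
S_sackur = N * K_B * (math.log(V / (N * lam**3)) + 2.5)

print(f"lambda             = {lam:.3e} m")
print(f"S with Z_N = Z^N   = {S_canonical:.1f} J/K")
print(f"S (Sackur-Tetrode) = {S_sackur:.1f} J/K")
print(f"difference N*k*(ln N - 1) = {N * K_B * (math.log(N) - 1.0):.1f} J/K")
[/CODE]

The Sackur-Tetrode value comes out near 125 J/K, close to the measured molar entropy of helium (about 126 J/(mol K) at room conditions), while the ##Z^N## version overcounts states of identical particles by exactly ##Nk(\ln N - 1)##.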
 
  • #8
anuttarasammyak said:
I think you had better read good texts to learn this, because we need some lines of mathematics.
Sorry! In case you need some lines of mathematics, please read Section 1-4, "Canonical ensemble and thermodynamics", in Terrell L. Hill's book "An Introduction to Statistical Thermodynamics".
 
  • #9
Thank you all. Special thanks to Twigg for these lines of mathematics. Reading all these comments, it came to my mind that the equivalence of these two definitions seems, in a certain sense, to point to a quantization procedure, since the Clausius definition is continuous in principle whereas Boltzmann's is discrete.
 
  • #10
DaTario said:
it came to my mind that the equivalence of these two definitions seems, in a certain sense, to point to a quantization procedure
That procedure is the microcanonical ensemble.
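In outline, one fixes the total energy and counts microstates in a thin energy shell: $$S(E) = k_B \ln \Omega(E)$$ where ##\Omega(E)## is the number of microstates with energy between ##E## and ##E + \delta E##; the count is genuinely discrete once the states are quantized.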
 
  • #11
Twigg said:
As in the replies above, the canonical ensemble provides the connection you need between the number of states ##W## and the thermodynamic entropy ##\int \frac{dQ}{T}##. My personal favorite introduction to the subject is Schrödinger's book "Statistical Thermodynamics". It's cheap (<$10) and short.
Shame on me: when I searched for it on Amazon and saw its cover, I realized I already had it on my bookshelf. Thank you for the recommendation anyway.
 
  • #12
I don't have it on me, but IIRC it doesn't cover the microcanonical ensemble. For that, you probably want a video lecture, because some of the geometrical arguments don't make sense without pictures (at least not to my pea brain).
 

FAQ: Connection between the two definitions of entropy

What is the definition of entropy in physics?

In physics, entropy is a measure of the number of microscopic configurations (microstates) consistent with a system's macroscopic state. It is often described informally as the disorder or randomness in a system, capturing the tendency of isolated systems to evolve from less probable, ordered states toward more probable, disordered ones.

What is the definition of entropy in information theory?

In information theory, entropy (Shannon entropy) is a measure of the uncertainty or unpredictability of a random variable. It is the average amount of information, typically measured in bits, needed to describe the variable's outcome.
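As a minimal sketch of this definition (the coin probabilities below are illustrative), the Shannon entropy of a distribution is the average number of bits needed to encode one outcome:

[CODE=python]
import math

def shannon_entropy(probs):
    """Shannon entropy H = -sum(p * log2(p)), in bits."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin is maximally unpredictable: 1 bit per toss.
print(shannon_entropy([0.5, 0.5]))  # 1.0
# A biased coin is more predictable, so it carries less entropy.
print(shannon_entropy([0.9, 0.1]))  # ~0.47
[/CODE]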

How are the two definitions of entropy related?

The two definitions of entropy are related in that both quantify how many states are compatible with what is known about a system. In physics, entropy is tied to the amount of energy that is unavailable to do work; in information theory, it is tied to the amount of information needed to describe the state of a system.
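One standard way to make this precise: the Gibbs entropy of statistical mechanics has exactly the Shannon form (up to the factor ##k_B## and the base of the logarithm), and it reduces to Boltzmann's expression when all ##W## accessible microstates are equally probable: $$S = -k_B \sum_i p_i \ln p_i, \qquad p_i = \frac{1}{W} \;\Rightarrow\; S = k_B \ln W$$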

Why is entropy important in both physics and information theory?

Entropy is important in both physics and information theory because it helps us understand and predict the behavior of complex systems. In physics, entropy is a fundamental concept in thermodynamics and helps us understand how energy flows in a system. In information theory, entropy is used to measure the amount of information in a message and is essential for data compression and transmission.

How can the concept of entropy be applied in different fields of science?

The concept of entropy can be applied in various fields of science, including physics, chemistry, biology, and computer science. In physics, entropy is used to understand the behavior of thermodynamic systems. In chemistry, it is used to explain chemical reactions and the direction of spontaneous processes. In biology, entropy is used to study the complexity and organization of living systems. In computer science, entropy is used to measure the randomness of data and improve data compression algorithms.
