Philip Koeck
I have a rather general question about the definition of entropy used in most textbooks:
S = k ln Ω, where Ω is the number of available microstates.
Boltzmann wrote W rather than Ω, and I believe this stood for probability (Wahrscheinlichkeit).
Obviously Ω is not a number between 0 and 1, so W is better read as something proportional to a probability rather than a probability itself.
Probability would be the number of accessible microstates in a given macrostate divided by the total number of microstates (including those that are not accessible).
Now for distinguishable particles both of these numbers are larger than for indistinguishable particles by a factor of N!, where N is the number of particles, at least in the low-occupancy limit.
Would it make sense therefore to use the following definition of entropy for distinguishable particles to make sure that this "probability" W is calculated correctly?
S = k ln (Ω / N!) for distinguishable particles at low occupancy.
This would make S extensive even for distinguishable particles.
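To make the extensivity claim concrete, here is a small numerical sketch. It assumes the low-occupancy limit, where N distinguishable particles spread over M >> N single-particle states give Ω ≈ M^N; the function names and the specific values of M and N are my own choices for illustration. Doubling the system (N → 2N, M → 2M) should double an extensive entropy:

```python
import math

def ln_omega(M, N):
    # ln of the number of microstates for N distinguishable particles
    # over M >> N single-particle states (low occupancy): Omega ≈ M^N
    return N * math.log(M)

def S_uncorrected(M, N):
    # S/k = ln Omega, the textbook definition without the N! correction
    return ln_omega(M, N)

def S_corrected(M, N):
    # S/k = ln(Omega / N!), the proposed definition;
    # lgamma(N+1) = ln(N!) avoids overflow for large N
    return ln_omega(M, N) - math.lgamma(N + 1)

M, N = 10**9, 10**6  # low occupancy: N/M = 10^-3

# Extensivity check: S(2M, 2N) - 2*S(M, N) should vanish per particle.
uncorrected_defect = S_uncorrected(2*M, 2*N) - 2 * S_uncorrected(M, N)
corrected_defect = S_corrected(2*M, 2*N) - 2 * S_corrected(M, N)

print(uncorrected_defect / (2*N))  # ≈ ln 2 per particle: not extensive
print(corrected_defect / (2*N))    # ≈ 0 per particle: extensive
```

Without the N! the doubled system picks up an extra k ln 2 per particle (the Gibbs paradox); dividing by N! cancels it up to Stirling-approximation corrections of order (ln N)/N.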