Kcant
I've been refreshing myself on some of the statistical mechanics I learned a couple of years ago, using Kittel and Kroemer as a guide. However, I've come across a couple of things that bother me:
1. When the Boltzmann distribution is derived, no real physics enters the picture. Essentially, the Taylor expansion of the entropy function is used to find the relative probability that two states are occupied. But entropy is just the logarithm of the degeneracy function, so why not just Taylor-expand the degeneracy function itself? What makes entropy special? I know that the degeneracy function is generally a fast-varying function of energy, and that entropy varies much more smoothly, but how can you know this a priori? How do you know that entropy varies sufficiently slowly to accurately approximate probabilities, and that you don't need some higher-order logarithm?
2. When deriving the Fermi-Dirac statistics, why can the energy of an unoccupied state be taken to be zero? Don't you really need some kind of field theory to know that?
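For concreteness, here is the derivation I have in mind for point 1, in (roughly) Kittel and Kroemer's notation: a reservoir of energy $U_0$ exchanges energy with a system in a definite state of energy $\varepsilon$, and the occupation probability is proportional to the reservoir's multiplicity $g$:

```latex
% Ratio of probabilities = ratio of reservoir multiplicities,
% rewritten via the entropy \sigma = \log g:
\frac{P(\varepsilon_1)}{P(\varepsilon_2)}
  = \frac{g(U_0 - \varepsilon_1)}{g(U_0 - \varepsilon_2)}
  = \exp\!\bigl[\sigma(U_0 - \varepsilon_1) - \sigma(U_0 - \varepsilon_2)\bigr]

% First-order Taylor expansion of \sigma about U_0,
% using 1/\tau \equiv (\partial\sigma/\partial U)_{U_0}:
\sigma(U_0 - \varepsilon) \approx \sigma(U_0)
  - \varepsilon \left(\frac{\partial\sigma}{\partial U}\right)_{U_0}
  = \sigma(U_0) - \frac{\varepsilon}{\tau}

% which yields the Boltzmann factor:
\frac{P(\varepsilon_1)}{P(\varepsilon_2)}
  = e^{-(\varepsilon_1 - \varepsilon_2)/\tau}
```

And for point 2, the step I mean is the single-orbital Gibbs sum, in which the unoccupied term ($N = 0$) is assigned energy zero:

```latex
% Gibbs sum over occupancies N = 0, 1 of a single orbital of energy \varepsilon;
% the leading "1" is the empty orbital, taken to have zero energy:
\mathcal{Z} = 1 + e^{(\mu - \varepsilon)/\tau}

% Mean occupancy, i.e. the Fermi-Dirac distribution:
\langle N \rangle
  = \frac{e^{(\mu - \varepsilon)/\tau}}{1 + e^{(\mu - \varepsilon)/\tau}}
  = \frac{1}{e^{(\varepsilon - \mu)/\tau} + 1}
```

My question is about that leading "1": justifying that the empty orbital contributes nothing to the energy.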