In my book, entropy is defined as
[itex]S = k \ln \Omega[/itex], where [itex]\Omega[/itex] is the multiplicity of microstates.
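Just to check that I'm reading the formula right, a trivial worked case (my own numbers, not from the book): a system with two equally accessible microstates has [itex]\Omega = 2[/itex], so
[tex]S = k \ln 2 \approx \left(1.38\times 10^{-23}\,\mathrm{J/K}\right)(0.693) \approx 9.6\times 10^{-24}\,\mathrm{J/K}.[/tex]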
Now I have several questions regarding this definition (please try to answer them all! :) )
1) Why do you take the log of the multiplicity? Is it just that additivity, together with the fact that S stays a manageable number even for very many particles, makes this definition more convenient to work with?
At first I thought so, but ln grows more slowly the larger its argument gets. Wouldn't that cause problems?
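To make question 1 concrete, here's a quick numerical sketch I put together (Python; the two-state spin system and the function names are just my own toy illustration, nothing from the book):

[code]
import math

# Toy two-state spin system: N spins, n of them "up".
# The multiplicity is the binomial coefficient Omega = C(N, n).
k_B = 1.380649e-23  # Boltzmann constant in J/K

def multiplicity(N, n):
    """Number of microstates with n up-spins out of N spins."""
    return math.comb(N, n)

# Even for a modest N the multiplicity is astronomically large,
# but the log brings it down to an ordinary-sized number:
omega_1 = multiplicity(100, 50)
print(f"Omega_1     = {float(omega_1):.3e}")      # ~1.009e+29
print(f"ln(Omega_1) = {math.log(omega_1):.2f}")   # ~66.78
print(f"S_1         = {k_B * math.log(omega_1):.3e} J/K")

# For two independent subsystems the multiplicities MULTIPLY,
# so the entropies (the logs) ADD -- the additivity point:
omega_2 = multiplicity(80, 40)
lhs = math.log(omega_1 * omega_2)
rhs = math.log(omega_1) + math.log(omega_2)
print(math.isclose(lhs, rhs))  # True
[/code]

So the slow growth of ln seems to be exactly the point: it turns unmanageably huge, multiplicative multiplicities into small, additive numbers. But is that the whole story?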
2) I understand that thermodynamics is generally described more deeply by statistical mechanics. At that deeper level, is entropy still defined as above, or is that just something that also happens to hold true? I'm asking because the factor of k seems to indicate that there's more to it than meets the eye at first glance.
Also, I have seen a few videos discussing entropy in information theory, and there entropy seems to be something deeper than just an expression for the multiplicity of a system.
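For reference, here is how I currently understand the connection the videos were pointing at (my own sketch, assuming the Gibbs/Shannon form [itex]S = -k\sum_i p_i \ln p_i[/itex]; the function names are mine):

[code]
import math

def gibbs_entropy(probs, k=1.0):
    """Gibbs/Shannon form: S = -k * sum_i p_i ln(p_i); p_i = 0 terms drop out."""
    return -k * sum(p * math.log(p) for p in probs if p > 0)

# If every one of Omega microstates is equally likely (p_i = 1/Omega),
# the Gibbs form collapses to the textbook S = k ln(Omega):
omega = 1000
uniform = [1 / omega] * omega
print(math.isclose(gibbs_entropy(uniform), math.log(omega)))  # True

# For a non-uniform distribution it gives something smaller than ln(Omega),
# which is where the more general information-theoretic picture comes in:
skewed = [0.7, 0.1, 0.1, 0.1]
print(gibbs_entropy(skewed), "<", math.log(4))  # ~0.94 < ~1.39
[/code]

Is this the right way to see it, i.e. that [itex]S = k \ln \Omega[/itex] is the special case where all microstates are equally probable?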
That covers everything. Thanks! :)