How can you differentiate entropy

In summary, temperature is defined by the relation 1/T = ∂S/∂U, where the entropy S = k ln W is built from the multiplicity W, a count of microstates. At first sight the definition looks odd, because W takes only integer values, so S appears to be a discrete function that cannot be differentiated. The resolution is the thermodynamic limit: as the particle number N → ∞, per-particle quantities such as S/N become smooth functions of the energy, and the derivative is well defined. The logarithm in S = k ln W is also essential rather than a mere convenience: it makes entropy additive (extensive), which in turn makes temperature intensive and consistent with the zeroth law of thermodynamics, whereas a definition like 1/T = ∂W/∂U would not behave as a temperature at all.
  • #1
zezima1
The definition of temperature is:

1/T = ∂S/∂U

but mathematically this is a bit weird, because S = k ln W, where W denotes the multiplicity, and the multiplicity certainly does not form a continuum, but rather a spread-out set of integers. So S must be a discrete function!
Therefore my question is: how come it still works? I have sometimes used Stirling's approximation and differentiated the expressions to find the maximum multiplicity, and indeed it did work, but mathematically it seems like nonsense to me that you can ever differentiate S or W.

And another question (if you can bear with all of them):

The above definition S = k ln W: is this JUST A DEFINITION, chosen because of the nice additivity of the logarithm, or is there more to it, i.e. is the logarithm somehow built into nature?
I get different answers on this, most saying it's a pure definition. So could you actually just as well have defined:

1/T = ∂W/∂U (maybe multiplied by k)

and gotten the same result? Obviously the formulas would need to be revised so that they fit with W not being additive.
 
  • #2
Jolb
zezima1 said:
The definition of temperature is:

1/T = ∂S/∂U

but mathematically this is a bit weird, because S = k ln W, where W denotes the multiplicity, and the multiplicity certainly does not form a continuum, but rather a spread-out set of integers. So S must be a discrete function!
Therefore my question is: how come it still works? I have sometimes used Stirling's approximation and differentiated the expressions to find the maximum multiplicity, and indeed it did work, but mathematically it seems like nonsense to me that you can ever differentiate S or W.

You differentiate this "discrete" function by only considering the limit as the particle number goes to infinity. This is called the "thermodynamic limit" and is always implicit in thermodynamics and statistical physics. In this limit, the function is continuous.
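(A minimal numerical sketch of this point, using the Einstein solid as an illustrative system, my choice rather than anything from the thread: the multiplicity W(q, N) = C(q + N − 1, q) counts the ways to share q energy quanta among N oscillators, and the discrete one-quantum difference of ln W settles toward the smooth Stirling-limit slope as N grows.)

```python
from math import comb, log

def ln_W(q, N):
    """ln of the Einstein-solid multiplicity W(q, N) = C(q + N - 1, q):
    the number of ways to share q energy quanta among N oscillators."""
    return log(comb(q + N - 1, q))

# Discrete one-quantum "derivative" of ln W at fixed energy per oscillator q/N = 1.
# As N grows it converges to the smooth Stirling-limit slope ln(1 + N/q) = ln 2 ~ 0.6931.
for N in (10, 100, 1000, 10000):
    print(N, ln_W(N + 1, N) - ln_W(N, N))
```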

zezima1 said:
And another question (if you can bear with all of them):

The above definition S = k ln W: is this JUST A DEFINITION, chosen because of the nice additivity of the logarithm, or is there more to it, i.e. is the logarithm somehow built into nature?
I get different answers on this, most saying it's a pure definition. So could you actually just as well have defined:

1/T = ∂W/∂U (maybe multiplied by k)

and gotten the same result? Obviously the formulas would need to be revised so that they fit with W not being additive.

That definition would not represent what we know as "temperature." Because the definition isn't additive, it isn't an extensive quantity, and therefore wouldn't satisfy the zeroth law of thermodynamics. For example, using your definition of T, two ideal gases, one with N particles and one with M particles (N > M) at the same T, would not be in thermal equilibrium: heat would flow from one to the other.
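To make the heat-flow claim concrete, here is a small numerical sketch. It swaps the thread's ideal gases for two Einstein solids of unequal size (an assumption made purely so the microstates are easy to count): the most probable split of the shared energy equalizes the slope of ln W, i.e. the real temperature, while the slopes of W itself remain wildly different.

```python
from math import comb, log

def W(q, N):
    """Multiplicity of an Einstein solid: ways to share q energy quanta among N oscillators."""
    return comb(q + N - 1, q)

N_A, N_B = 300, 100   # two systems of unequal size
q_total = 400         # total energy quanta shared between them

# The most probable split of the energy maximizes the combined multiplicity
# W_A * W_B, i.e. the sum ln W_A + ln W_B.
best_q = max(range(q_total + 1),
             key=lambda q: log(W(q, N_A)) + log(W(q_total - q, N_B)))
qA, qB = best_q, q_total - best_q

# At that split the one-quantum slopes of ln W agree (equal true temperature):
print(log(W(qA + 1, N_A)) - log(W(qA, N_A)))   # ~0.69
print(log(W(qB + 1, N_B)) - log(W(qB, N_B)))   # ~0.68

# ...but the slopes of W itself differ by ~120 orders of magnitude, so equal
# "pseudotemperature" would not mark equilibrium between systems of unequal size.
print((W(qA + 1, N_A) - W(qA, N_A)) / (W(qB + 1, N_B) - W(qB, N_B)))
```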
 
  • #3
zezima1
Jolb said:
You differentiate this "discrete" function by only considering the limit as the particle number goes to infinity. This is called the "thermodynamic limit" and is always implicit in thermodynamics and statistical physics. In this limit, the function is continuous.

Interesting. But even though the number of particles is infinite, I don't see why the multiplicity would be continuous, to be honest. In the systems I have seen, the multiplicity as a function of U is given by some combinatorial formula, and this will always yield integers. I don't see why letting N → ∞ would stop giving integers.

Jolb said:
That definition would not represent what we know as "temperature." Because the definition isn't additive, it isn't an extensive quantity, and therefore wouldn't satisfy the zeroth law of thermodynamics. For example, using your definition of T, two ideal gasses, one with N particles and one with M particles (N>M) with the same T would not be in thermal equilibrium--heat would flow from one to the other.

Hmm, I think you're on to something interesting here. But can you please elaborate: why does an additive definition ensure that two systems at the same temperature are in thermal equilibrium, while a non-additive definition wouldn't?
 
  • #4
With N → ∞, you have to replace all quantities by fractions, like the multiplicity per particle or similar things, in order to keep your values finite. And these fractions can approach any real value (perhaps within some physical bounds) in the limit.
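For instance (again a sketch with an Einstein solid as the stand-in system): the entropy per oscillator, ln(W)/N, at fixed energy per oscillator approaches a smooth limiting value as N → ∞.

```python
from math import comb, log

def s_per_particle(u, N):
    """Entropy per oscillator (units of k) of an Einstein solid at energy per oscillator u."""
    q = round(u * N)
    return log(comb(q + N - 1, q)) / N

# The per-particle fraction converges to a smooth limit as N grows:
for N in (10, 100, 1000, 10000):
    print(N, s_per_particle(0.5, N))
# Stirling limit: (1 + u) ln(1 + u) - u ln(u) ~ 0.9548 for u = 0.5
```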
 
  • #5
Jolb
Oops, I made a typo in my above post. I should have said:
"Because the entropy implied by the definition 1/T = ∂W/∂U (namely S ∝ W) isn't additive, S isn't an extensive quantity, ..."

Let me clarify. There are (at least) two ways of defining temperature and entropy.
I) In thermodynamics, the zeroth law leads to the existence of the temperature T, so we start with T. Entropy is defined in terms of temperature as

dS = dQ/T where dQ is the inexact differential of the heat exchange during a reversible change.

This leads to
1/T = ∂S/∂U at constant V.

II) In statistical mechanics or kinetic theory, we start out knowing the possible microstates for a given energy U; counting the microstates gives W, and we define S as k ln W. Entropy then has a definition completely independent of T.

So it really depends on what perspective you're taking when you want to "define" something. Defining T in your way would have problems from both perspectives.

I think elaborating on extensive/intensive quantities will resolve your confusion. Here are the definitions of intensive and extensive:

------
Suppose F is some thermodynamic coordinate; for example: energy, entropy, temperature, pressure, volume, etc. (Remember that quantities like work and heat don't count as thermodynamic coordinates, because their differentials aren't exact.)

Suppose F is a function of other thermodynamic quantities {A, B, ...}. We write this as
F = F(A, B, ...). Thus F can be expressed in differential form as dF = (∂F/∂A)dA + (∂F/∂B)dB + ...

If F is an extensive quantity, then when we scale (i.e. multiply) each of the quantities it depends on by a constant λ ∈ ℝ, F scales by the same factor λ:

F(λA, λB, ...) = λF(A, B, ...)

Whereas if F is an intensive quantity, it does not vary when each of its arguments scales by the same constant.

F(λA, λB, ...) = F(A, B, ...)
-------

The intuitive picture behind these is that certain quantities will scale with the system's size, and others will not change. For example, imagine an ideal gas in a container into which we can insert a wall, splitting it into two isolated systems of the same size. Suppose the gas starts out with N particles, a pressure of P, a temperature of T, and an energy U. Now we insert the wall splitting it into two halves. Both of the new half-systems now have the same pressure P and temperature T, and both of them have half of the original energy U and half the original number of particles N. T and P are intensive, whereas U and N are extensive.
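This scaling can be checked numerically. The sketch below uses the standard Sackur–Tetrode formula for the entropy of a monatomic ideal gas (a textbook result, not derived in this thread); the particle mass and state values are arbitrary illustrative numbers.

```python
from math import log, pi

k = 1.380649e-23    # Boltzmann constant, J/K
h = 6.62607015e-34  # Planck constant, J*s
m = 6.6e-27         # particle mass, kg (helium-like; arbitrary choice)

def sackur_tetrode(N, V, U):
    """Entropy of a monatomic ideal gas (Sackur-Tetrode equation)."""
    return N * k * (log((V / N) * (4 * pi * m * U / (3 * N * h**2))**1.5) + 2.5)

def temperature(N, U):
    """Ideal-gas temperature from U = (3/2) N k T."""
    return 2 * U / (3 * N * k)

N, V, U = 1e22, 1e-3, 10.0   # arbitrary reference state
for lam in (1, 2, 4):
    print(lam, sackur_tetrode(lam * N, lam * V, lam * U), temperature(lam * N, lam * U))
# S scales in proportion to lam (extensive); T does not change (intensive).
```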

So we can see that we want T to be intensive so that a subset of a system in thermal equilibrium has the same T as the entire system. On the other hand, the entropy of some subset of a system should not be equal to the entropy of the entire system: a small subset of n particles from a system (e.g., ideal gas) of N>n particles has many fewer possible microstates than the entire system of N particles, so we want entropy to be extensive.

Now to address your suggestion of 1/T = ∂W/∂U. Let me call this T "pseudotemperature." As the alternative, we will refer to the normal temperature as θ, where 1/θ = ∂(ln W)/∂U.

When we define 1/T = ∂W/∂U, we would be taking a statistical mechanics perspective, since we are starting out with microstates as the fundamental quantity and defining T in terms of W. (As opposed to starting with the zeroth law, using it to ensure the existence of T, and defining S in terms of T.)

In general, if system 1 has W₁ possible microstates, and system 2 has W₂ possible microstates, the combined system will have W = W₁W₂ possible microstates.

Let's go back to the example of slicing a system of energy U into two equal halves. Initially, there are W microstates available. When we insert the wall between the two halves, each half will have W₁ = √W possible microstates, by the previous paragraph. However, each half now has half the initial energy, so U₁ = U/2.

Now we are in trouble. Let's compute the change in pseudotemperature during this slicing process.
1/T₁ = ∂W₁/∂U₁ ≈ √W/((1/2)U) = 2√W/U ≠ W/U ≈ 1/T
=> ΔT ≠ 0

Whereas
1/θ₁ = ∂(ln W₁)/∂U₁ ≈ ln(√W)/((1/2)U) = (1/2)ln W/((1/2)U) = ln W/U ≈ 1/θ
=> Δθ = 0

Thus the pseudotemperature of a tank of gas would change if you simply inserted a wall. This is unsatisfactory and violates the zeroth law: it's not an intensive quantity. On the other hand θ, the true temperature, doesn't change when you slice a system in half, and is an intensive quantity.

(Please forgive my mathematical sloppiness, but I think the demonstration adequately shows the non-intensivity of T without being overly mathematically fussy.)
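Plugging arbitrary numbers into these ratio-style estimates (a toy check that inherits the post's admitted sloppiness) shows the contrast directly:

```python
from math import sqrt, log

W, U = 1e120, 100.0   # arbitrary illustrative values for the full system

# Pseudotemperature estimate 1/T ~ W/U changes drastically when the wall goes in:
print(W / U, sqrt(W) / (U / 2))            # 1e118  vs  2e58

# True-temperature estimate 1/theta ~ ln(W)/U is unchanged:
print(log(W) / U, log(sqrt(W)) / (U / 2))  # both ~2.763
```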
 
  • #6
swooshfactory
Jolb said:
Whereas
1/θ₁ = ∂(ln W₁)/∂U₁ ≈ ln(√W)/((1/2)U) = (1/2)ln W/((1/2)U) = ln W/U ≈ 1/θ
=> Δθ = 0

Does this imply that ln W/U is a valid definition of 1/θ? I see that the change in ln W divided by the change in U equals ln W/U when you divide the system in two parts, but why should ln W/U be a definition of 1/θ? You're taking no derivatives...
 
  • #7
zezima1
Very nice answer, thanks a lot. I might need to spend some time thinking all this over; after all, I have never studied statistical mechanics, only followed a course in thermodynamics.

One, maybe a bit silly, question though:
Why is it so important that temperature is an intensive quantity? As far as I can see, the main reason for temperature to be defined is to have a quantity that indicates which way energy will flow, from a statistical point of view. Wouldn't it be possible to define a temperature my way that would have this property too?

(It's late, and I haven't thought it all through. Therefore when I wake up tomorrow I might have figured it out myself, if it's obvious.)
 
  • #8
Jolb
swooshfactory said:
Does this imply that ln W/U is a valid definition of 1/θ? I see that the change in ln W divided by the change in U equals ln W/U when you divide the system in two parts, but why should ln W/U be a definition of 1/θ? You're taking no derivatives...

You're absolutely right. ln W/U is not a valid definition, and there are some other problems in that derivation. I believe that to do this with full mathematical rigor you need to write down the partition function.

However, I think I can make the argument a bit more satisfactory. I'll approximate the partial derivatives first for the initial system and then again for the halved system, thinking of an ideal gas as the system in question.

Here's the approximation method I'll use
----
We can approximate an infinitesimal change in N, ∂N, by removing a single particle; this is an infinitesimal change in the N → ∞ limit. If the ideal gas of N particles resides in a volume V, I'll make this change by splitting off (using the wall method) a volume of V/N. So:

V_final − V_initial = (V − V/N) − V = −V/N = ΔV ≈ ∂V in the N → ∞ limit,
and similarly ∂U ≈ −U/N.

Now an ideal gas is made of non-interacting particles. Let's say each individual particle has W_ind possible microstates. Then N particles will have W_ind^N possible microstates. So here are the infinitesimal changes in W and ln W:

W_final − W_initial = W_ind^(N−1) − W_ind^N = W_initial^((N−1)/N) − W_initial = ΔW ≈ ∂W

ln W_final − ln W_initial = ln(W_ind^(N−1)) − ln(W_ind^N) = (N−1) ln W_ind − N ln W_ind = −ln W_ind = −ln(W_initial^(1/N)) = −(1/N) ln W_initial = Δ(ln W) ≈ ∂(ln W)

So we get the approximations
1/T ≈ ΔW/ΔU = (W_ind^(N−1) − W_ind^N)/(−U/N) = −N(W_initial^((N−1)/N) − W_initial)/U
1/θ ≈ Δ(ln W)/ΔU = (−(1/N) ln W_initial)/(−U/N) = ln W_initial/U
----

So now I'll go back to the system initially at V, U, N, and T or θ which gets halved by insertion of a wall. (The subscript 1 denotes the quantities in, say, the left half system after this splitting.)

1/T = ∂W/∂U ≈ −N(W^((N−1)/N) − W)/U
1/T₁ = ∂W₁/∂U₁ ≈ −N(W₁^((N−1)/N) − W₁)/U₁ = −N(W^((N−1)/(2N)) − W^(1/2))/(U/2) = −2N(W^((N−1)/(2N)) − W^(1/2))/U ≠ 1/T
so ΔT ≠ 0

1/θ = ∂(ln W)/∂U ≈ ln W/U
1/θ₁ = ∂(ln W₁)/∂U₁ ≈ ln W₁/U₁ = ln(W^(1/2))/((1/2)U) = ln W/U = 1/θ
so Δθ = 0
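A quick numerical sanity check of these estimates (illustrative values; everything is kept in logarithmic form so the huge W never has to be represented directly):

```python
from math import log

W_ind = 5.0    # microstates per particle (arbitrary)
N = 10**6      # particle number, "large"
U = float(N)   # one unit of energy per particle (arbitrary)

dU    = -U / N                                  # energy removed along with one particle
d_lnW = (N - 1) * log(W_ind) - N * log(W_ind)   # change in ln W, equal to -ln(W_ind)

print(d_lnW / dU)          # estimate of 1/theta
print(N * log(W_ind) / U)  # ln(W_initial)/U: matches, as derived above
```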

zezima1 said:
Why is it so important that temperature is an intensive quantity? As far as I can see, the main reason for temperature to be defined is to have a quantity that indicates which way energy will flow, from a statistical point of view. Wouldn't it be possible to define a temperature my way that would have this property too?

You're welcome to define any function you want! But there are two reasons why we like our particular version of θ. The first is that it corresponds to what we colloquially know as temperature, as in the weather or cooking. The second is that the possible directions of heat flow (from A to B, from B to A, or equilibrium between A and B) correspond nicely to θ_A > θ_B, θ_A < θ_B, and θ_A = θ_B respectively. If you have to do another algebraic step to figure out the direction of heat flow, that's okay, but it would probably be more complicated (and would probably reduce to finding an inequality between quantities proportional to θ).
 
  • #9
Here's a related issue to bear in mind. When you define S = ln W (pay no attention to the constant k; it is pure convention), you are taking the statistical-mechanics perspective as explained above, and you are making additional assumptions before you can connect it with the thermodynamic concept of T. The main one is that you are assuming all the microstates are equally likely, so you are ignoring any history of the system and any details of how it got that way or how it can get other ways. This connects to the concepts of reversibility and thermodynamic equilibrium, all of which connect to the thermodynamic concept of T. So as soon as you mention "T", you are actually saying a lot more than just invoking a mathematical expression; you are also invoking a host of implicit thermodynamic assumptions that need to be in place for that expression to have any value or meaning.

Reversibility means that dS = dQ/T, so you could also define T that way. Then if you want to worry about the discrete character of S, you also have to worry about the discrete character of Q. Using derivatives means we have chosen to "gloss over" discreteness in energy transfer as well as in state counting, which is necessary because we are also glossing over quantum mechanics.

Ask yourself this: how did we do classical statistical mechanics when counting states is fundamentally quantum mechanical? We could do it because ln W undergoes changes that don't depend on W itself; they depend on dW/W. So if we neglect that electrons have spin and count every state once instead of twice (for the two spin states), we still get all the same answers in classical statistical mechanics.
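A small sketch of that last point (my own illustration, with the Einstein solid again standing in for a generic system): give each particle an extra g-fold degeneracy, so W picks up an overall factor of g^N; the slope of ln W with energy, which is what sets 1/T, is completely unchanged.

```python
from math import comb, log

def ln_W(q, N, g=1):
    """ln multiplicity of an Einstein solid, with a g-fold internal degeneracy
    per oscillator (g = 2 would model a neglected spin-1/2)."""
    return N * log(g) + log(comb(q + N - 1, q))

N, q = 1000, 1000
for g in (1, 2):
    # The g**N factor adds a constant to ln W, so the slope that sets 1/T is unchanged:
    print(g, ln_W(q + 1, N, g) - ln_W(q, N, g))
```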
 

FAQ: How can you differentiate entropy

What is entropy and why is it important in science?

Entropy is a measure of the disorder or randomness in a system. It is important in science because it helps us understand the direction and extent of changes in a system and is a fundamental concept in thermodynamics and information theory.

How can entropy be calculated or measured?

Entropy can be calculated using the equation S = k ln W, where S is entropy, k is the Boltzmann constant, and W is the number of microstates of the system. It can also be determined experimentally, for example by integrating dS = dQ/T over a reversible process in which the heat exchanged is measured.
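As a toy illustration of the formula (a hypothetical example, not part of the FAQ answer above): the Boltzmann entropy of a macrostate of 100 two-state particles with exactly 40 in the "up" state can be computed directly.

```python
from math import comb, log

k = 1.380649e-23   # Boltzmann constant, J/K

W = comb(100, 40)  # microstates with exactly 40 of 100 two-state particles "up"
S = k * log(W)     # Boltzmann entropy S = k ln W
print(f"W = {W:.3e}, S = {S:.3e} J/K")   # W ~ 1.4e28, S ~ 9e-22 J/K
```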

How does entropy relate to the second law of thermodynamics?

The second law of thermodynamics states that in any natural process, the total entropy of an isolated system always increases over time. This means that the universe tends towards a state of maximum entropy, where there is no potential for further change or work to be done.

Can entropy be reversed or decreased?

In an isolated system, the total entropy can remain constant, but it cannot decrease. However, in an open system, energy and matter can be exchanged, allowing for local decreases in entropy. For example, living organisms decrease their own entropy by consuming energy from their environment and maintaining a high level of order and organization within their bodies, while increasing the entropy of their surroundings.

How is entropy related to information theory?

In information theory, entropy is a measure of uncertainty or randomness in a system. The more uncertain or random a message or data is, the higher its entropy. This concept is used in data compression, where a low-entropy message can be compressed into a smaller file size, while a high-entropy message cannot be compressed as much without losing information.
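As a small illustration (hypothetical example): the Shannon entropy in bits per symbol of a string, which bounds how well it can be losslessly compressed.

```python
from collections import Counter
from math import log2

def shannon_entropy(data: str) -> float:
    """Shannon entropy of a string, in bits per symbol."""
    n = len(data)
    return sum(-(c / n) * log2(c / n) for c in Counter(data).values())

print(shannon_entropy("aaaaaaaa"))  # 0.0 bits: fully predictable, compresses well
print(shannon_entropy("abcdefgh"))  # 3.0 bits: maximally random over 8 symbols
```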
