Is Entropy a State Function for Isolated Systems?

In summary, the conversation discusses the concept of entropy as a state function and its relation to isolated systems. It is argued that entropy is a state function, and that its change along a reversible path may be positive or negative. The idea that the universe, as an isolated system, might not have entropy as a state function is also explored. The discussion touches on the concept of calortropy and the use of different names for non-equilibrium entropy. Finally, it is argued that the change in entropy of the universe is not path dependent, since there is only one reversible path between two states.
  • #1
Andrew_
13
0
If we consider an isolated system in which a process occurs, then according to the Clausius inequality:

[tex] dS \geq \frac{dQ}{T}[/tex]

Since dQ = 0, it follows that if the process occurs reversibly, dS = 0, and if it occurs irreversibly, dS > 0. But entropy is a state function, so how could this possibly be?

It then occurred to me that the universe itself is an isolated system and its entropy is not a state function. Could we then generalize and say that the entropy of an isolated system is not a state function?

I've found some very brief treatments of this in recent thermodynamics books, but none seems convincing to me. In one book, it is claimed that an irreversible process corresponding to a given reversible process cannot occur in an isolated system, since the 'uncompensated heat' (N, in Clausius's notation) would have to be supplied to the system from the surroundings. The system is then no longer isolated, and the Clausius inequality no longer holds for such an irreversible path.

Makes sense? ... any ideas?
 
  • #2
Interesting question. My reference is not handy, but I would guess that the entropy is not a state function. Although we can identify an entropy based on micro/macrostates of an equilibrium configuration, the change in entropy as one goes from one equilibrium state to another is, in general, path dependent, IIRC.

Again, I need to double-check my reference (Rational Thermodynamics).
 
  • #3
Andrew_ said:
If we consider an isolated system in which a process occurs, then according to the Clausius inequality:

[tex] dS \geq \frac{dQ}{T}[/tex]

Since dQ = 0, it follows that if the process occurs reversibly, dS = 0, and if it occurs irreversibly, dS > 0. But entropy is a state function, so how could this possibly be?

It then occurred to me that the universe itself is an isolated system and its entropy is not a state function. Could we then generalize and say that the entropy of an isolated system is not a state function?

I've found some very brief treatments of this in recent thermodynamics books, but none seems convincing to me. In one book, it is claimed that an irreversible process corresponding to a given reversible process cannot occur in an isolated system, since the 'uncompensated heat' (N, in Clausius's notation) would have to be supplied to the system from the surroundings. The system is then no longer isolated, and the Clausius inequality no longer holds for such an irreversible path.

Makes sense? ... any ideas?

Entropy is a state function. It is the integral of dQ/T along the reversible path between two states. The change in entropy of a system in a reversible process is 0 only if the path is closed (i.e., the system returns to its original state). The entropy change of a system along a reversible path from A to B can be negative or positive. The entropy change of the system + surroundings between any two states can never be negative. It will only be 0 if the system and surroundings return to their original state.
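As an illustrative numerical check (a sketch with assumed values for one mole of monatomic ideal gas; not part of the original post), one can evaluate the integral of dQ/T along two genuinely different reversible paths between the same two states and confirm the entropy change is identical:

```python
import math

R = 8.314            # gas constant, J/(mol K)
Cv = 1.5 * R         # molar heat capacity at constant V (monatomic, assumed)
Cp = Cv + R          # molar heat capacity at constant P
n = 1.0              # moles

T1, V1 = 300.0, 0.010    # state A (assumed values)
T2, V2 = 450.0, 0.025    # state B (assumed values)

# Path 1: reversible isothermal expansion at T1, then isochoric heating.
# Isothermal ideal gas: dU = 0, dQ = P dV, so dS = nR dV/V.
# Isochoric: dQ = n Cv dT, so dS = n Cv dT/T.
dS_path1 = n * R * math.log(V2 / V1) + n * Cv * math.log(T2 / T1)

# Path 2: reversible isobaric expansion from A to (T_b, V2), then isochoric.
# Isobaric: dQ = n Cp dT, so dS = n Cp dT/T.
T_b = T1 * V2 / V1                      # temperature after the isobaric leg
dS_path2 = n * Cp * math.log(T_b / T1) + n * Cv * math.log(T2 / T_b)

# Same endpoints, different reversible paths, same entropy change
assert abs(dS_path1 - dS_path2) < 1e-9
```

The two paths exchange different amounts of heat, yet dQ/T integrates to the same total, which is exactly what "state function" means here.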

AM
 
  • #4
Yes, entropy is certainly a state function. Andrew (first Andrew), while it may be true that a reversible process could be modeled by an irreversible process followed by heat removal, I don't see what relevance this has. The substitution is for our own convenience; nobody is claiming that thermal energy must be periodically withdrawn from our universe.

If I've misunderstood your question, could you please give the book reference?
 
  • #5
Andy Resnick, it would be interesting if you could provide a rational thermodynamics perspective, because I am myself still new to this field.

Andrew Mason, I agree with you. What you posted is well known, nothing controversial there, but how would you explain the fact that the entropy of the universe - an isolated system - is a path function?

Mapes, if you assume that a reversible process occurs in an isolated system, then its complementary irreversible path cannot occur without supplying heat/work to the system. Since in general:

[tex]W_{rev} + N = W_{irr} [/tex]

and N > 0, it follows that to achieve the irreversible path, the system can no longer be isolated. Thus, the Clausius inequality is no longer valid. This is a realistic situation, not a conceptual one, wouldn't you agree? This could make sense after all, because the Clausius inequality is derived on the basis of a non-isolated system which exchanges energy with the surroundings.

This is the treatment I found in http://books.google.com/books?id=RZE2aYsPe0QC as well as in the same author's more recent (yet unpublished) book on classical thermodynamics; I just got a copy of the book from one of the co-authors.
 
  • #6
Please note that in generalized thermodynamics, the author introduces something called "calortropy", which is the usual entropy but for irreversible processes.

He writes, for example, the Clausius inequality [tex] d \Psi \geq 0[/tex] for isolated systems, where psi = calortropy. Why he chooses a different name for non-equilibrium entropy ... I have yet to find out.
 
  • #7
Andrew_ said:
Andrew Mason, I agree with you. What you posted is well known, nothing controversial there, but how would you explain the fact that the entropy of the universe - an isolated system - is a path function?
If you mean that the entropy of the universe is a path dependent function, this is an incorrect statement. The change in entropy of the universe between states A and B is the integral of dQ/T along the reversible path between those two states. There is only one such path.

AM
 
  • #8
Andrew_ said:
Mapes, if you assume that a reversible process occurs in an isolated system , then its complementary irreversible path cannot occur without supplying heat/work to the system.

No macroscale process is truly reversible. We often assume reversibility to make thermodynamic problems easier to solve, but any proof relying on actual reversibility of an arbitrary process in the known universe is wrong. It's as wacky as assuming a frictionless universe and then saying that something is wrong with our theories of friction when friction is actually observed.
 
  • #9
Mapes said:
Yes, entropy is certainly a state function. Andrew (first Andrew), while it may be true that a reversible process could be modeled by an irreversible process followed by heat removal, I don't see what relevance this has. The substitution is for our own convenience; nobody is claiming that thermal energy must be periodically withdrawn from our universe.

I understood the question something like this. Entropy is a state function that can be assigned to a system in equilibrium. An isolated system is either in equilibrium or not in equilibrium. If it is in equilibrium, then entropy will not increase. If it is not in equilibrium, we cannot assign an entropy to it. So it seems that we cannot see entropy increase in an isolated system?

As I understand, the traditional cheat around this is to introduce a "constraint". Consider a thermally isolated box with a wall in the middle that constrains all the gas molecules to be on one side of the box. Each side of the box is in equilibrium, so presumably we can assign an entropy to it. Now remove the constraint, and the gas expands irreversibly to fill the box. When it finally attains equilibrium, we can again assign an entropy to it. So with the help of a "constraint", we can see entropy increase in an "isolated" system. (But if it's "isolated", who removed the "constraint"?)

Now, how do we find the entropy change associated with this irreversible adiabatic free expansion? We could find a reversible path that takes the system from the same initial to final states, and integrate dQ/T along it. One such reversible path involves making the system do work and then heating it. (But if you can heat the system, how can it be "isolated"?)
 
  • #10
atyy said:
(But if you can heat the system, how can it be "isolated"?)

The "heating" here is virtual, though, yes? Just an imaginary process used to simplify the calculation?
 
  • #11
After further reading, I am more confused than before. Hopefully one day I'll have the time to carefully work through this subject.

First, I have to correct myself: for equilibrium systems and reversible processes, the entropy is a state function: the entropy is a function only of the current state of the system (no memory).

However, I do not understand how to apply Clausius' expression [itex]\int\frac{dQ}{T}[/itex] to irreversible processes or nonequilibrium conditions. Is 'T' even defined? What does dQ mean for a dynamic process? The more I read, the more I suspect that while the entropy can be defined for a given process, the less meaning 'state' has. The Onsager relations appear to be a perturbation approach (using fluxes and forces), but I just don't know enough to say any more than that.
 
  • #12

Andy Resnick said:
However, I do not understand how to apply Clausius' expression [itex]\int\frac{dQ}{T}[/itex] to irreversible processes or nonequilibrium conditions. Is 'T' even defined? What does dQ mean for a dynamic process? The more I read, the more I suspect that while the entropy can be defined for a given process, the less meaning 'state' has. The Onsager relations appear to be a perturbation approach (using fluxes and forces), but I just don't know enough to say any more than that.
The process determines the final state. If the process is irreversible, the final state is different than in a reversible process.

For each set of initial and final states there is a reversible path between the two states. The change in entropy is the integral of dQ/T along that path.

Perhaps it is best illustrated with an example. Consider an adiabatic expansion of a gas at pressure P1, temperature T1 and volume V1. It expands adiabatically until its volume doubles. So the process is adiabatic free expansion to double the volume.

The state of the gas is determined by the three variables, P, V and T. If you take the irreversible path of free expansion, with the result that no work is done, there is no change in internal energy (Q=0, W=0). So the final state is P=P1/2, T=T1, V=2V1.

This is a very different state than that of a reversible adiabatic expansion to double the volume. In that case, it would be a process in which the internal and external pressure differ by an infinitesimal amount at all times, so work is done as the gas expands to double the volume. Consequently, the internal energy of the gas is reduced by the amount of work done and, as a result, the final state is P<P1/2; T<T1; V=2V1. So the final state of the reversible expansion is different than in the non-reversible path.

To find the change in entropy of the adiabatic free expansion you find a reversible path between the initial and final states. In order to do this, heat must flow into the gas (there is no reversible adiabatic path between the initial and final states), so the integral of dQ/T (entropy change) is greater than 0.
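This comparison can be made concrete with a small numerical sketch (assumed values for one mole of monatomic ideal gas; not from the original post). It computes the final states of both the free expansion and the reversible adiabatic expansion, and the entropy change of the free expansion via a reversible isothermal path:

```python
import math

R = 8.314
n, gamma = 1.0, 5.0 / 3.0                 # one mole, monatomic (assumed)
P1, T1 = 100_000.0, 300.0                 # assumed initial state
V1 = n * R * T1 / P1

# Irreversible free expansion to 2*V1: Q = 0, W = 0, so U and T unchanged
T_free = T1
P_free = n * R * T_free / (2 * V1)        # = P1 / 2

# Reversible adiabatic expansion to 2*V1: T V^(gamma-1) = const
T_rev = T1 * (V1 / (2 * V1)) ** (gamma - 1)
P_rev = n * R * T_rev / (2 * V1)          # < P1 / 2, since T_rev < T1

# Entropy change of the free expansion, via a reversible isothermal path:
# dU = 0, so dQ = P dV = nRT dV/V, and Delta S = nR ln(2V1/V1) = nR ln 2 > 0
dS = n * R * math.log(2)
```

The two processes really do land in different states (T_rev < T1, P_rev < P1/2), which is why the free expansion's entropy change must be computed along a different, heat-absorbing reversible path.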

AM
 
  • #13
Is that true? For example, when a protein folds, your statement seems to imply that only one path through configuration space is allowed, but that contradicts experiment. Plus, there is no reversible path under physiological conditions, AFAIK.

Has anyone worked out the entropy for a cycle of Na-K-ATPase, or a motor protein, or anything like that?

Plus, your argument leaves out many other thermodynamic variables such as the magnetic field, particle number, etc. etc.
 
  • #14
If the changes in the isolated system are slow enough that they proceed in a quasistatic way, you can still apply thermodynamics by enlarging the thermodynamic state space. Then the entropy is a state function, but the thermodynamic state it depends on is then defined by many more external variables, temperatures, particle numbers etc. etc. E.g. you can think of a gas in an isolated box described by a position dependent temperature and density.

Now, if the changes in the system do not proceed quasistatically, you can still find an equivalent quasistatic process that describes the irreversible transition. Example: Consider a mixture of hydrogen and oxygen in an isolated container. This gas explodes, but the container is strong enough to contain the explosion. No heat leaks out, and eventually the gas will settle down and it will be in a state that can be described by thermodynamics.

An equivalent quasistatic process can happen using some catalyst that lets the oxygen slowly react with the hydrogen. You can then interpolate from the initial state to the final state by introducing the numbers of molecules of hydrogen, oxygen and water. We then have:

[tex]dE = T\,dS - P\,dV + \mu_{O_2}\,dN_{O_2} + \mu_{H_2}\,dN_{H_2} + \mu_{H_2O}\,dN_{H_2O}[/tex]


During the process, dE = 0, and dV = 0, so during the chemical reactions we have:


[tex]dS = -\left[\mu_{O_2}\,dN_{O_2} + \mu_{H_2}\,dN_{H_2} + \mu_{H_2O}\,dN_{H_2O}\right]/T[/tex]

Then the changes in the particle numbers also determine the temperature change, because we know how the internal energy depends on the particle numbers and temperature, and the internal energy stays constant.

We can thus integrate dS and dT from the initial state to the final state along the quasistatic trajectory. All this follows from the generalized version of the fundamental thermodynamic relation.


Another example: Free expansion.

As explained above by Andrew Mason, the entropy will increase, and there is a reversible way to realize this process, but that involves an external heat source. So, you let the gas expand adiabatically so that it performs work and then you supply heat from an external source (which can be done in a way that is arbitrarily close to being reversible).

But there is a third way which is quasistatic and does not involve an external heat source. We imagine that the volume V of the system is divided in two parts: V = V1 + V2, and all the gas is located in V1. The volume V2 is vacuum. We then split the vacuum section by making a new boundary very close to the boundary between V1 and V2. Then we remove the old boundary so that we get a free expansion of the gas into a slightly larger volume.

Clearly the gas does not perform any work in this process, because when we remove the old boundary, the new boundary has been fixed in place. The gas expands and then bumps into the new boundary. No energy is lost from the gas or extracted in the form of work. So, the internal energy of the gas stays what it was.

But if we have increased the volume infinitesimally, then the new state of the gas (when it has settled down) can be computed from the fundamental thermodynamic relation, as this relation relates two thermodynamic states that are infinitesimally separated. It doesn't matter if you go from one state to the other in a reversible way or not. So we have:

dE = T dS - P dV

Now, dE = 0, therefore:

dS = P dV/T

Note that we are not saying here that P dV is work performed during the free expansion. All we are using is the fact that if you have two states in thermal equilibrium that differ infinitesimally in S and V by dS and dV, then you have that dE = T dS - P dV. We then choose the two states by considering an infinitesimal free expansion starting from a given state. This is then a quasistatic process that is irreversible.

We can repeatedly insert an extra boundary and remove the old boundary, so this process can be repeated until the volume is increased by a large amount. The temperature remains constant, and we have that P = N k T/V. So, we can integrate the expression for dS to obtain the final entropy.
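The stepwise construction above can be mimicked numerically (a sketch under assumed values: one mole of ideal gas at 300 K doubling its volume in many small boundary moves; not from the original post):

```python
import math

k = 1.380649e-23          # Boltzmann constant, J/K
N = 6.022e23              # number of molecules (one mole, assumed)
T = 300.0                 # temperature stays constant in free expansion
V, V_final = 1.0e-3, 2.0e-3   # volume doubles, in m^3 (assumed)

# Repeatedly "move the boundary" by a small dV and apply dS = P dV / T
steps = 100_000
dV = (V_final - V) / steps
S = 0.0
for _ in range(steps):
    P = N * k * T / V     # ideal gas law at the current volume
    S += P * dV / T       # dS = P dV / T, since dE = 0 at each step
    V += dV

# Integrating dS = Nk dV / V gives Delta S = Nk ln(V_final / V_initial)
exact = N * k * math.log(2)
assert abs(S - exact) / exact < 1e-4
```

As the step size shrinks, the sum of the quasistatic increments converges to Nk ln 2, the same entropy change the reversible isothermal substitute path gives.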
 
  • #15
Count Iblis said:
Another example: Free expansion.

...

But there is third way which is quasistatic and does not involve an external heat source. We imagine that the volume V of the system is divided in two parts: V = V1 + V2, and all the gas is located in V1. The volume V2 is vacuum. We then split the vacuum section by making a new boundary very close to the boundary between V1 and V2. Then we remove the old boundary so that we get a free expansion of the gas into a slightly larger volume.
I am not sure that this is a reversible or quasistatic process. If you reverse the process, you have to do work on the system to compress the gas. How can a "reversible" process in one direction do no work at all while, with an infinitesimal change in conditions, the reverse process requires work?

AM
 
  • #16
atyy said:
(But if you can heat the system, how can it be "isolated"?)

Mapes said:
The "heating" here is virtual, though, yes? Just an imaginary process used to simplify the calculation?

Count Iblis said:
But there is a third way which is quasistatic and does not involve an external heat source. We imagine that the volume V of the system is divided in two parts: V = V1 + V2, and all the gas is located in V1. The volume V2 is vacuum. We then split the vacuum section by making a new boundary very close to the boundary between V1 and V2. Then we remove the old boundary so that we get a free expansion of the gas into a slightly larger volume.

Clearly the gas does not perform any work in this process, because when we remove the old boundary, the new boundary has been fixed in place. The gas expands and then bumps into the new boundary. No energy is lost from the gas or extracted in the form of work. So, the internal energy of the gas stays what it was.

But if we have increased the volume infinitesimally, then the new state of the gas (when it has settled down) can be computed from the fundamental thermodynamic relation, as this relation relates two thermodynamic states that are infinitesimally separated. It doesn't matter if you go from one state to the other in a reversible way or not. So we have:

dE = T dS - P dV

Now, dE = 0, therefore:

dS = P dV/T

Note that we are not saying here that P dV is work performed during the free expansion. All we are using is the fact that if you have two states in thermal equilibrium that differ infinitesimally in S and V by dS and dV, then you have that dE = T dS - P dV. We then choose the two states by considering an infinitesimal free expansion starting from a given state. This is then a quasistatic process that is irreversible.

We can repeatedly insert an extra boundary and remove the old boundary, so this process can be repeated until the volume is increased by a large amount. The temperature remains constant, and we have that P = N k T/V. So, we can integrate the expression for dS to obtain the final entropy.

Count Iblis's solution seems very clean to me. There's still the question of who removes the constraint of an "isolated" system, but I suppose that's done by the expansion of the universe?
 
  • #17
I still don't understand why we have to rely on concepts like "slow enough", "infinitesimally", etc. Do you mean thermodynamics does not apply to explosions, or fractures, or femtosecond processes, ... for example?
 
  • #18
Andy Resnick said:
I still don't understand why we have to rely on concepts like "slow enough", "infinitesimally", etc. Do you mean thermodynamics does not apply to explosions, or fractures, or femtosecond processes, ... for example?
The laws of thermodynamics certainly apply. But they are not reversible processes. No real physical processes are. We study reversible processes because they establish limits on the usefulness of thermodynamic processes.

AM
 
  • #19
Andrew Mason said:
I am not sure that this is a reversible or quasistatic process. If you reverse the process, you have to do work on the system to compress the gas. How can a "reversible" process in one direction do no work at all while, with an infinitesimal change in conditions, the reverse process requires work?

AM

It is an irreversible quasistatic process in the limit that you change the volume by infinitesimal steps.

It is similar to supplying heat to a system in a quasistatic way. If you supply a large amount of heat Q to a system then the entropy will increase, but during this process, thermodynamics does not apply. An equivalent quasistatic process is to divide Q into small chunks of dQ; then, after each dQ goes into the system, the entropy increases by dQ/T. The reason why is simply that before you add the dQ and after you add it, the gas is in thermal equilibrium (we simply wait long enough before adding the next dQ). We can then take again the relation:

dE = T dS - P dV (1)

and take the dE, dS and dV to be the changes in the state variables. We then have dV = 0, dE = dQ and thus dS = dQ/T.
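This chunk-by-chunk bookkeeping is easy to simulate (a sketch with an assumed constant heat capacity; not from the original post):

```python
import math

C = 100.0          # heat capacity of the body, J/K (assumed constant)
T = 300.0          # initial temperature, K (assumed)
Q_total = 10_000.0 # total heat to add, J; raises T by Q/C = 100 K

# Add the heat in small chunks dQ, waiting for equilibrium after each,
# and accumulate dS = dQ / T at the instantaneous temperature
chunks = 100_000
dQ = Q_total / chunks
S = 0.0
for _ in range(chunks):
    S += dQ / T          # dS = dQ / T, with dV = 0 and dE = dQ
    T += dQ / C          # each chunk raises the temperature by dQ/C

# Quasistatic limit of the sum: Delta S = C ln(T_final / T_initial)
exact = C * math.log(400.0 / 300.0)
assert abs(S - exact) / exact < 1e-4
```

Whether the dQ chunks are delivered reversibly or by rubbing the cylinder makes no difference to the sum, which is exactly the point being made about Eq. (1).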


Of course, this can be a reversible process if the heat is added in a reversible way. But this is not necessary. You can take a gas cylinder and add the heat by some irreversible process, e.g. by rubbing the outside of the cylinder thereby converting work to heat.


Anyway, the point is that the fundamental thermodynamic relation (1):


dE = T dS - P dV


or an appropriately generalized version, can be applied to processes that are quasistatic, as it relates state variables that differ by an infinitesimal amount. Whether or not such a process is reversible is not relevant at all here.


If a process is reversible, then T dS must be the heat supplied to a system (which then must also be supplied in a reversible way) and P dV must be the work done by the system.


So, in this sense Eq. (1) is more fundamental than the relation dS = dQ/T. The usual derivation of (1) involves two arguments. First one argues that in the limit of changing the volume (or other external variables) infinitely slowly, the adiabatic theorem of quantum mechanics implies that the system's entropy will not change. This, combined with the definition of temperature, implies (1) in the special case that the system does reversible work.

Then one argues that since (1) involves state variables, the relation should be valid in general. There doesn't need to be any transition from one state to another state at all. You can simply have two systems with the same composition, one with internal energy E, entropy S and volume V and another system with internal energy E + dE, volume V + dV and entropy S + dS, and then dE, dV and dS must satisfy (1).
 
  • #20
Andy Resnick said:
I still don't understand why we have to rely on concepts like "slow enough", "infinitesimally", etc. Do you mean thermodynamics does not apply to explosions, or fractures, or femtosecond processes, ... for example?

Thermodynamics doesn't apply in the general case. Because ultimately you have a system that consists of a huge number of particles. You want to describe the situation in a statistical way. Thermodynamics applies when given some external variables, the system has equal probability to be in any of the possible microstates.
 
  • #21
Hang on there.. that's definitely not true. Thermodynamics, and the first and second laws, hold for *all* physical systems. Are you claiming that conservation of energy does not hold for 'the general case'?
 
  • #22
Andy Resnick said:
Hang on there.. that's definitely not true. Thermodynamics, and the first and second laws, hold for *all* physical systems. Are you claiming that conservation of energy does not hold for 'the general case'?

Energy is conserved, but if the system is not specified, we don't know what the external variables are. Then the first law is just a trivial statement that says that if we make a particular choice, we define work to be the change in energy associated with whatever one chooses for the external variables. You can e.g. think of a biological system in which we keep track of a few variables for each cell.

Heat is the change in energy that occurs while keeping the external variables fixed. This can then be seen as defining a coarse graining of the system. E.g. apart from the few variables that describe all the cells, we don't know what is going on exactly in each cell.

Then entropy is the log of the number of microstates consistent with each macrostate. The system is in thermal equilibrium if, given the macrostate specified by the set of external variables, all the microstates are indeed equally likely. If that's not the case, then the thermodynamic description will break down.

You can then still apply the more general thermodynamic law that says that the entropy will always increase, similar to saying that in the free expansion experiment, when we suddenly double the volume, the entropy will increase. But you cannot use thermodynamics to describe what is going on during this process.

There is a way around this if the irreversible change happens very slowly. Then you could introduce additional external variables that describe how the change is happening. It must then be the case that at any stage, the system's actual microstate is typical, in the sense that averaging over all the possible microstates the system could be in, if they are all assumed to be equally likely, will not yield different answers for macroscopic observables.


So, in one of my previous postings where I let the free expansion happen in small stages, every time the volume was increased by an infinitesimal amount, the gas returns to equilibrium. Then all the microstates consistent with the new volume are equally likely and the thermodynamic equations apply.

In case of a free expansion in which the volume is doubled right away, you could attempt to describe the gas as it shoots out to occupy the new volume by introducing a position dependent flow velocity, temperature and density. But it is then not clear if the temperature is well defined. In general you should describe the gas using a distribution function f(r,v,t), which gives the number of molecules per unit volume in velocity space as a function of position, and then solve the Boltzmann equation.

If it is the case that f(r,v,t) is always locally very close to a Maxwell distribution, then that means the assumption of local thermal equilibrium is valid.

One can also generalize the definition of the entropy to:

[tex]S = -k \sum_r P_r \ln P_r[/tex]

where the P_r are the probabilities of each microstate, and then argue that general nonequilibrium states can be described by thermodynamics as well. But if we want to apply thermodynamics then we need to know these probabilities. So, one has to have some formal description of the P_r in terms of a finite number of "external" variables, and then the points I make above are still relevant.
 
  • #23
I hear what you are saying, I am saying that it is inherently limiting to base thermo*dynamics* on arguments involving *static* (equilibrium) states. Irreversible processes exist, we are obliged to confront them directly. Thermodynamics most definitely applies to all bodies undergoing any general deformation. That we do not have the means to express thermo*dynamics* using the language of thermo*statics* is not surprising.
 
  • #24
Andy Resnick said:
I still don't understand why we have to rely on concepts like "slow enough", "infinitesimally", etc. Do you mean thermodynamics does not apply to explosions, or fractures, or femtosecond processes, ... for example?

Perhaps because it seems almost everyone is using a classical thermodynamic approach. But classical thermodynamics is not truly thermodynamics; it's simply thermostatics.

Andy Resnick said:
However, I do not understand how to apply Clausius' expression to irreversible processes or nonequilibrium conditions. Is 'T' even defined? What does dQ mean for a dynamic process? The more I read, the more I guess that while the entropy can be defined for a given process, the less meaning 'state' has. The Onsager relations appear to be a perburbation approach (using fluxes and forces), but I just don't know enough to say any more than that.

You raise an interesting point here. How could one define non-equilibrium entropy based on the equilibrium temperature? dS = dQ/T + dN makes no sense at all, or so it seems. I don't even understand the point of formulating this equation when in non-equilibrium thermodynamics one can obtain a generalized equation for the non-equilibrium entropy without the need for it.
 
  • #25
Andrew Mason said:
If you mean that the entropy of the universe is a path dependent function, this is an incorrect statement. The change in entropy of the universe between states A and B is the integral of dQ/T along the reversible path between those two states. There is only one such path.

AM

How could this possibly be? Every single classical thermodynamics book divides the universe into a system and its surroundings. The entropy of the system is a state function while that of the surroundings is not. It follows then that the entropy of the universe is a path function. For any reversible change, the entropy production in the universe is zero. For any irreversible change, the entropy production in the universe is always greater than zero. How is the entropy of the universe a state function then?
 
  • #26
Andrew Mason said:
The laws of thermodynamics certainly apply. But they are not reversible processes. No real physical processes are. We study reversible processes because they establish limits on the usefulness of thermodynamic processes.

AM

Your last sentence is not really true. Processes are idealized as reversible for the same reason a system is idealized as having an equilibrium configuration. The utility of having equilibrium configurations and reversible processes is that it is then possible to unambiguously define a temperature.

Fortunately, biological processes are isothermal and isobaric. Thus, I am prepared to throw away the concept of 'temperature', keeping it only as a limiting case. This may not be as heretical as it may seem- laser light, or any non-blackbody radiation, does not have a well-defined temperature either. Yet, one may perform work or heating with laser light. Chemical reactions are likewise amenable to a thermodynamic treatment with no requirement of equilibrium: steady-state suffices.

It seems to me that the essential thermodynamic variable to keep is the Gibbs free energy.
 
  • #27
Count Iblis said:
Anyway, the point is that the fundamental thermodynamic relation (1):


dE = T dS - P dV


or an appropriately generalized version, can be appied to processes that are quasistatic as it relates state variables that differer by an infinitessimal amount. Whether or not such a process is reversible is not relevant at all here.

What would you recommend to read more about this? Or is it already in old texts like Reif?
 
  • #28
atyy said:
What would you recommend to read more about this? Or is it already in old texts like Reif?

Yes, Reif is one of the best books. Reif explains everything from first principles. You may have to be careful with the way Reif (or, for that matter, any other author) defines things. It can be the case that an author simply defines a quasistatic process to be an infinitesimal process that is reversible. So, you have to be careful about what follows from the theory and what is an arbitrary definition.

Bad books are books that do not explain things from first principles, these are mostly books written for chemists and engineers. They will typically explain thermodynamics starting from notions as heat, temperature etc., without defining them. Then they arrive at some definition of entropy by arguing that the integral of dq/T along a closed contour is zero.

The problem with this approach is that it assumes that heat, temperature, work, etc. are fundamental concepts that are always unambiguously defined. In reality, everything depends on the coarse-graining procedure that you need in order to define the entropy starting from the microscopic description of the system.

This is yet another example of why: http://insti.physics.sunysb.edu/~siegel/history.html
 
  • #29
Andrew_ said:
How could this possibly be ? Every single classical thermodynamics book divides the universe into a system and its surroundings. The entropy of the system is a state function while that of the surroundings is not.
Can you provide a reference for the last statement?

The 'surroundings' is just what is external to the 'system'. Why would the laws of thermodynamics depend on what we arbitrarily call the system and what we call the surroundings?

It follows then that the entropy of the universe is a path function. For any reversible change, the entropy production in the universe is zero. For any irreversible change, the entropy production in the universe is always greater than zero. How is the entropy of the universe a state function then?
Well, the universe is not in thermal equilibrium. So, it does not really have a particular thermodynamic state.

Let's make it simple and consider the "system" and its "surroundings" which begin in particular thermodynamic equilibrium states (A and A' for the system and surroundings respectively) and, after a thermodynamic process, end up in different thermodynamic equilibrium states (B and B' for the system and surroundings respectively).

In order to determine the change in entropy, you compute the integral of dQ/T on the reversible path from A to B for the system and the integral of dQ/T on the reversible path from A' to B' for the surroundings. If the sum of those two integrals is greater than 0, you know that the path of the actual process was not the reversible path.

Example:

Consider 1 kg of water at 0C contained in a thin plastic bag and a lake at 10C (state A). I place the bag in the lake. Heat flows into the bag of water from the lake until the bag of water is 10C, the lake is still at 10C (state B). What is the change in entropy?

The change in entropy of the system (the bag of water) + the surroundings (the lake) is:

[tex]\Delta S_{bag} = \int_{A}^{B} dQ_{bag}/T_{bag}[/tex]

[tex]\Delta S_{lake} = \int_{A'}^{B'} dQ_{lake}/T_{lake}[/tex]

[tex]\Delta S_{total} = \Delta S_{bag} + \Delta S_{lake}[/tex]


In order to determine the change in entropy, you have to find the reversible path from state A to state B for the bag. This would be a Carnot heat engine operating between the bag and the lake in which the work output would be stored and not converted to heat (say by lifting a weight). The total heat transferred to the cold reservoir (the bag of water) is the amount needed to raise the 1 kg of water by 10 degrees: 10 kcal.

But in this process, the amount of heat lost by the lake would be greater than the heat gained by the water in the bag (since some of that heat was used to do work, which is stored in lifting the weight). So this cannot be the reversible process for the surroundings in going from state A' to B'. In going from A' to B', the reversible path, again, is a Carnot heat engine between the lake and the bag in which the heat lost by the lake is 10 kcal.

Conclusion: when you determine the change in entropy of the system and the change in entropy of the surroundings resulting from a non-reversible process, the paths for calculating the entropy changes of the system and surroundings are different.
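The numbers along this reversible path can be checked directly. A minimal sketch in Python, assuming the specific heat of water is a constant 1 kcal/(kg K):

```python
import math

# Reversible (Carnot) path for warming the bag from 0 C to 10 C against
# a 10 C lake. Assumed values: c = 1 kcal/(kg K), m = 1 kg.
c, m = 1.0, 1.0             # kcal/(kg K), kg
T_i, T_lake = 273.0, 283.0  # initial bag temperature, lake temperature (K)

# Heat delivered to the bag (the cold reservoir):
Q_bag = c * m * (T_lake - T_i)   # 10 kcal

# Reversibility requires dQ_lake / T_lake = dQ_bag / T_bag at each step,
# so the lake gives up dQ_lake = c*m*(T_lake / T) dT, integrated over the warming:
Q_lake = c * m * T_lake * math.log(T_lake / T_i)   # about 10.18 kcal

# The difference is the work stored by the engine (e.g. in the lifted weight):
W = Q_lake - Q_bag   # about 0.18 kcal

# Along this path the total entropy change of bag + lake is zero:
dS_total = c * m * math.log(T_lake / T_i) - Q_lake / T_lake
```

As expected, the lake loses more heat (about 10.18 kcal) than the bag gains (10 kcal), which is why the reversible paths for system and surroundings cannot coincide with the actual irreversible process.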

AM
 
  • #30
Andrew Mason said:
Can you provide a reference for the last statement?

Well, any book on classical thermodynamics would do. A recent book I read is "Principles of Thermodynamics" by Myron Kaufman. I quote from page 75 of the book, the second law of thermodynamics, the last paragraph:

"In order to usefully apply the 2nd law, it will be necessary to be able to calculate dS, the entropy change in the system of interest, and dSsurr, the entropy change of the surroundings. The mathematical form of our 2nd law becomes:

dSuni = dS + dSsurr ≥ 0

Although dSuni, the entropy change of the universe, sounds rather formidable, in practice we will only be concerned with a few parts of the universe that interact with our system during the process. In this equation, greater than zero applies to any real process and equal to zero applies to a reversible process. In the latter, change occurs infinitely slowly, eliminating energy degradation due to friction and turbulence."

Andrew Mason said:
The 'surroundings' is just what is external to the 'system'. Why would the laws of thermodynamics depend on what we arbitrarily call the system and what we call the surroundings?

That's a good question, which I've asked myself several times before, and I still have no logical answer. Unfortunately, this is not given importance in thermodynamics books. (If you know a good reference, please do tell.) It seems that in certain books the entropy of the surroundings is calculated on the basis of dS = dQ/T.
However, with this definition the surroundings is regarded as just another system, and its entropy should be a state function; but if that is true, the entropy of the universe would be a state function too, which is obviously false from the equation posted above.


Andrew Mason said:
Well, the universe is not in thermal equilibrium. So, it does not really have a particular thermodynamic state.

Let's make it simple and consider the "system" and its "surroundings" which begin in particular thermodynamic equilibrium states (A and A' for the system and surroundings respectively) and, after a thermodynamic process, end up in different thermodynamic equilibrium states (B and B' for the system and surroundings respectively).

In order to determine the change in entropy, you compute the integral of dQ/T on the reversible path from A to B for the system and the integral of dQ/T on the reversible path from A' to B' for the surroundings. If the sum of those two integrals is greater than 0, you know that the path of the actual process was not the reversible path.

Example:

Consider 1 kg of water at 0C contained in a thin plastic bag and a lake at 10C (state A). I place the bag in the lake. Heat flows into the bag of water from the lake until the bag of water is 10C, the lake is still at 10C (state B). What is the change in entropy?

The change in entropy of the system (the bag of water) + the surroundings (the lake) is:

[tex]\Delta S_{bag} = \int_{A}^{B} dQ_{bag}/T_{bag}[/tex]

[tex]\Delta S_{lake} = \int_{A'}^{B'} dQ_{lake}/T_{lake}[/tex]

[tex]\Delta S_{total} = \Delta S_{bag} + \Delta S_{lake}[/tex]


In order to determine the change in entropy, you have to find the reversible path from state A to state B for the bag. This would be a Carnot heat engine operating between the bag and the lake in which the work output would be stored and not converted to heat (say by lifting a weight). The total heat transferred to the cold reservoir (the bag of water) is the amount needed to raise the 1 kg of water by 10 degrees: 10 kcal.

But in this process, the amount of heat lost by the lake would be greater than the heat gained by the water in the bag (since some of that heat was used to do work, which is stored in lifting the weight). So this cannot be the reversible process for the surroundings in going from state A' to B'. In going from A' to B', the reversible path, again, is a Carnot heat engine between the lake and the bag in which the heat lost by the lake is 10 kcal.

Conclusion: when you determine the change in entropy of the system and the change in entropy of the surroundings resulting from a non-reversible process, the paths for calculating the entropy changes of the system and surroundings are different.

AM

Ok, but how does this prove that the entropy of the system + the surroundings is a state function? You just proved that |dSsys| does not equal |dSsurr| for an irreversible heat-transfer process from the lake to the plastic bag, and more generally that their sum is always greater than zero for any real spontaneous process. Now assume the same process occurs reversibly; in this case you cannot argue that the reversible path A-->B differs from the reversible path A'-->B': both produce the same entropy change but opposite in sign. That is, for a reversible change, dSsys = -dSsurr, so the total entropy change for the bag and the lake vanishes, in accordance with the 2nd law of thermodynamics.
 
  • #31
Andrew Mason said:
<snip>

Example:

Consider 1 kg of water at 0C contained in a thin plastic bag and a lake at 10C (state A). I place the bag in the lake. Heat flows into the bag of water from the lake until the bag of water is 10C, the lake is still at 10C (state B). What is the change in entropy?

The change in entropy of the system (the bag of water) + the surroundings (the lake) is:

[tex]\Delta S_{bag} = \int_{A}^{B} dQ_{bag}/T_{bag}[/tex]

[tex]\Delta S_{lake} = \int_{A'}^{B'} dQ_{lake}/T_{lake}[/tex]

[tex]\Delta S_{total} = \Delta S_{bag} + \Delta S_{lake}[/tex]


In order to determine the change in entropy, you have to find the reversible path from state A to state B for the bag. This would be a Carnot heat engine operating between the bag and the lake in which the work output would be stored and not converted to heat (say by lifting a weight). The total heat transferred to the cold reservoir (the bag of water) is the amount needed to raise the 1 kg of water by 10 degrees: 10 kcal.

But in this process, the amount of heat lost by the lake would be greater than the heat gained by the water in the bag (since some of that heat was used to do work, which is stored in lifting the weight). So this cannot be the reversible process for the surroundings in going from state A' to B'. In going from A' to B', the reversible path, again, is a Carnot heat engine between the lake and the bag in which the heat lost by the lake is 10 kcal.

Conclusion: when you determine the change in entropy of the system and the change in entropy of the surroundings resulting from a non-reversible process, the paths for calculating the entropy changes of the system and surroundings are different.

AM

This is a great example, as it highlights a lot of the shortcomings in the way thermodynamics is traditionally taught.

First, there are some assumptions that have not been spelled out. The lake is a reservoir, for one: neither the temperature nor the volume of the lake is allowed to change (actually they can, but the analysis then becomes too complicated to justify the increased model accuracy). Second, the bag is present to prevent diffusion of the cold water into the warmer water; without it there would be mass transfer from the thermodynamic system to the reservoir, which is also more difficult to model in the traditional picture.

The approach used above then invokes Carnot cycles and machines, neither of which has anything to do with the actual physics of heat and mass diffusion. Reversible processes are never observed in this situation.

It's much easier to simply evaluate the integrals. The heat flow Q into the bag depends on the temperature difference and the specific heat of the enclosed water: Q = c m (T_lake - T_i), or dQ = c m dT_bag. The entropy integral for the bag is then ΔS_bag = c m ln(T_lake / T_i), where T_i is the initial temperature of the bag. If you like, the specific heat can be taken as a function of temperature and the integral evaluated accordingly.

Keeping c constant, this integral evaluates to 35.9 cal/K. Positive, as the heat flows into the bag.

Now, for the lake. Again, the integral can be evaluated very straightforwardly: ΔS_lake = -Q/T_lake = -10 kcal / 283 K = -35.3 cal/K. We have a potential problem, as the entropy of the lake is *decreased* because the heat flowed out of the lake into the bag.

However, the entropy change of the two processes together is positive: 0.63 cal/K, which is as it should be. Also note that time is not part of any of this analysis; in that way the concept of 'equilibrium' is retained: the temperature of the bag is assumed to reach a new value and not change from that.
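The arithmetic above is easy to reproduce. A minimal sketch in Python, with the same assumed values (constant specific heat, lake treated as an ideal reservoir):

```python
import math

# Entropy bookkeeping for the irreversible bag-in-lake process.
# Assumed: c = 1000 cal/(kg K), m = 1 kg, bag warms 273 K -> 283 K,
# lake stays fixed at 283 K.
c, m = 1000.0, 1.0
T_i, T_lake = 273.0, 283.0

dS_bag = c * m * math.log(T_lake / T_i)   # about +35.97 cal/K
Q = c * m * (T_lake - T_i)                # 10 kcal leaves the lake
dS_lake = -Q / T_lake                     # about -35.34 cal/K (ideal reservoir)
dS_total = dS_bag + dS_lake               # about +0.6 cal/K, positive as required
```

The small positive total is the entropy produced by the irreversible heat flow; no time variable appears anywhere in the calculation.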

My point is that traditional textbook presentations of thermodynamics try so hard to force reversible processes and equilibrium onto the student that the essential physics is obscured and overcomplicated. I claim it is far better to start by acknowledging that most processes are irreversible, and then introduce equilibrium as a way to restrict the number of variables. In this way the Gibbs free energy is the natural starting point (also since the Gibbs free energy change is what is measured in calorimetry experiments).
 
  • #32
My point is that traditional textbook presentations of thermodynamics try so hard to force reversible processes and equilibrium onto the student that the essential physics is obscured and overcomplicated. I claim it is far better to start by acknowledging that most processes are irreversible, and then introduce equilibrium as a way to restrict the number of variables. In this way the Gibbs free energy is the natural starting point (also since the Gibbs free energy change is what is measured in calorimetry experiments).

The problem is that if you don't teach what equilibrium is, then you cannot even begin to teach what entropy and the other thermodynamic state variables really are. This then leads to misunderstandings that are hard to correct later on.

In fact, I would say that the average person's understanding of thermodynamics, even after studying physics at some elementary level, is very poor compared with the knowledge people have about other branches of physics. Until last year almost all the Wikipedia thermodynamics pages would have told you that:

dE <= T dS - P dV

See e.g:

http://en.wikipedia.org/w/index.php?title=Fundamental_thermodynamic_relation&oldid=206545149

Wolfram's world of science will still tell you this:

http://scienceworld.wolfram.com/physics/CombinedLawofThermodynamics.html

Obviously Wolfram has copied it from Wikipedia and then some time later the old Wikipedia page decided to use Wolfram's page as a source. :smile:

So, how could Wikipedia editors, who certainly are not complete lay persons, come up with this stupid error that was repeated on many pages?

Their reasoning was that while we always have that:

dW = P dV (work done by the system)

we don't always have that:

dS = dQ/T

In general we have:

dS >= dQ/T

So, in general

dE = dQ - dW <= T dS - P dV


Which is, of course, wrong despite the fact that the entropy change is indeed in general larger than dQ/T.
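That dE = T dS - P dV holds as an equality between state variables can be verified concretely. A sketch for 1 mol of a monatomic ideal gas, using the standard textbook expressions for E and S (written up to an additive constant), checking the relation to first order along an arbitrary small step in (T, V):

```python
import math

R = 8.314  # gas constant, J/(mol K)

def E(T, V):
    # Internal energy of 1 mol of monatomic ideal gas (independent of V)
    return 1.5 * R * T

def S(T, V):
    # Entropy up to an additive constant
    return R * math.log(V) + 1.5 * R * math.log(T)

def P(T, V):
    # Ideal gas law
    return R * T / V

# Take a small but otherwise arbitrary step in (T, V) and compare
# dE with T dS - P dV; they agree to first order regardless of whether
# the step belongs to a "reversible" process.
T0, V0 = 300.0, 0.02
dT, dV = 1e-6, 1e-9

dE = E(T0 + dT, V0 + dV) - E(T0, V0)
dS = S(T0 + dT, V0 + dV) - S(T0, V0)
residual = dE - (T0 * dS - P(T0, V0) * dV)   # ~ 0 up to second-order terms
```

The residual vanishes up to second-order terms in the step, which is exactly the statement that the relation connects neighbouring states, not processes.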

This flawed type of reasoning was used on the Helmholtz free energy page:

http://en.wikipedia.org/w/index.php?title=Helmholtz_free_energy&oldid=212028025

To come up with a "derivation" of the fact that the free energy of a system held at constant temperature and volume can only decrease and that in equilibrium it will attain a minimum value.

These errors were present for many years, yet no one even raised the problem on the talk pages. So the conclusion has to be that people know very well that processes are in general irreversible, but they think that some thermodynamic identities are not valid in general and that replacing the equalities by inequalities yields the general result.

This suggests to me that thermodynamics has to be taught in a far more rigorous way in schools.
 
  • #33
Count Iblis said:
The problem is that if you don't teach what equilibrium is, then you cannot even begin to teach what entropy and the other thermodynamic state variables really are. This then leads to misunderstandings that are hard to correct later on.

<snip>
This suggests to me that thermodynamics has to be taught in a far more rigorous way in schools.

I completely agree with you: thermodynamics needs to be taught in a rigorous way. I never said we should abandon the concept of 'equilibrium', however. I claim we should be more rigorous about presenting equilibrium, which means framing the concept in terms of "how long do we want to wait?"
 

FAQ: Is Entropy a State Function for Isolated Systems?

What is Entropy?

Entropy is a measure of the disorder or randomness of a system. It is a thermodynamic property that describes the distribution of energy within a system.

What is a State Function?

A state function is a property of a system that depends only on its current state and not on how it reached that state. In other words, it is independent of the path taken to reach that state.
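A quick numerical illustration of path independence, using an ideal gas as an assumed example: the heat absorbed depends on the path taken between two states, but the entropy change does not.

```python
import math

# 1 mol of monatomic ideal gas taken from (T1, V1) to (T2, V2) along two
# different reversible paths. Assumed values; R and Cv in J/(mol K).
R, Cv = 8.314, 1.5 * 8.314
T1, V1 = 300.0, 1.0
T2, V2 = 400.0, 2.0

# Path A: isothermal expansion at T1, then constant-volume heating to T2.
Q_A = R * T1 * math.log(V2 / V1) + Cv * (T2 - T1)
dS_A = R * math.log(V2 / V1) + Cv * math.log(T2 / T1)

# Path B: constant-volume heating to T2 first, then isothermal expansion at T2.
Q_B = Cv * (T2 - T1) + R * T2 * math.log(V2 / V1)
dS_B = Cv * math.log(T2 / T1) + R * math.log(V2 / V1)

# Q_A != Q_B (heat is path dependent), yet dS_A == dS_B (entropy is not).
```

This is precisely what makes entropy a state function while heat is not: the same endpoints always give the same ΔS, whatever the route.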

Why is Entropy considered a State Function for Isolated Systems?

Entropy is considered a state function for isolated systems because its value depends only on the current state of the system, not on the process that produced it. In an isolated system there is no exchange of energy or matter with the surroundings, so the entropy stays constant during reversible processes and increases during irreversible ones; in either case, each equilibrium state has a definite entropy.

How is Entropy related to the Second Law of Thermodynamics?

The Second Law of Thermodynamics states that the total entropy of an isolated system (one that exchanges neither energy nor matter with its surroundings) can never decrease over time. This is because energy tends to spread out and become more evenly distributed, leading to an increase in disorder, or entropy.
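Free expansion of a gas is the standard illustration, sketched here with assumed values: the isolated system absorbs no heat, yet its entropy increases.

```python
import math

# Free expansion: 1 mol of ideal gas doubles its volume into a vacuum.
# The system is isolated (Q = 0, W = 0, so T is unchanged), yet entropy rises.
R = 8.314          # gas constant, J/(mol K)
V1, V2 = 1.0, 2.0  # assumed volumes (arbitrary units; only the ratio matters)

# The entropy change is evaluated along the reversible isothermal path
# connecting the same initial and final states:
dS = R * math.log(V2 / V1)   # about +5.76 J/K, strictly positive
```

Since entropy is a state function, the reversible path gives the entropy change of the irreversible expansion as well.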

Can Entropy be created or destroyed?

Entropy cannot be destroyed, but it can be created: irreversible processes generate entropy, which is why the total entropy of an isolated system never decreases. Only in the idealized limit of a fully reversible process is entropy merely transferred between systems without being produced.
