Physics: Reconciling Entropy & Energy

  • #1
dm4b
I'm having a hard time reconciling two ideas in physics.

One is that systems tend towards the maximum amount of disorder, or "entropy always increases". And, when two systems are brought together they are likely to be found at the energy E* which maximizes the number of states of the combined system. However, it would appear that E* is not necessarily the minimum energy of the combined system. This is where I am hung up a bit.

In other areas of physics, systems seem to always tend towards the least energy state. An electron cascading back to the ground state of a hydrogen atom, which lies at -13.6 eV, the state with the least energy (the largest negative bound energy). Or, the shape a soap film takes. Etc.

Perhaps there is no conflict here, I guess it just struck me as strange.
 
  • #2
Things spontaneously evolve toward higher entropy. It only seems like they evolve toward lower energy because the energy that escapes can increase the entropy elsewhere. Since you don't know where the energy goes, the entropy increase is very high. In an open system, you should use an appropriate "free energy" to calculate whether a reaction will be spontaneous.

In a closed system, it makes no sense to evolve toward minimum energy, because energy is conserved.
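As a quick numerical illustration of the free-energy point, here is a Python sketch using the standard textbook values for melting ice (roughly ΔH_fus ≈ 6.01 kJ/mol and ΔS_fus ≈ 22 J/(mol K)); these numbers are my own addition, not figures from the thread:

[code]
# Spontaneity at constant T and P is decided by the Gibbs free energy change,
# dG = dH - T*dS, not by dH alone. Values are the usual numbers for melting ice.
dH = 6010.0   # J/mol, enthalpy of fusion of water (endothermic: energy goes UP)
dS = 22.0     # J/(mol K), entropy of fusion

for T in (260.0, 273.15, 300.0):   # below, at, and above the melting point
    dG = dH - T * dS
    print(T, round(dG, 1), "spontaneous" if dG < 0 else "not spontaneous")

# dG changes sign near 273 K: melting raises the system's energy, but above the
# melting point the total entropy change (dS - dH/T) is positive, so it happens.
[/code]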
 
  • #3
Roughly speaking, large systems have more degrees of freedom than small ones, so they have greater heat capacity. The universe is unfathomably large compared to any experimental system, so when energy escapes the system, the increase in entropy of the universe greatly outweighs the decrease in entropy of the system.

When two systems are in contact, the temperature moves toward equilibrium between them. The universe is pretty cold (~2.7K), so if the system is warmer than that, energy will flow out of the system to increase entropy. Since the heat capacity of the universe is unfathomably large, this has no noticeable effect on the temperature of the universe.
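To put rough numbers on that bookkeeping, here is a Python sketch with illustrative values I have chosen myself, not figures from the thread:

[code]
# Heat q leaks from a warm system into much colder, effectively infinite
# surroundings. Illustrative numbers only.
q = 1000.0      # J of heat leaving the system
T_sys = 300.0   # K, temperature of the system
T_surr = 2.7    # K, temperature of the deep-space surroundings

dS_sys = -q / T_sys     # system loses entropy: about -3.3 J/K
dS_surr = +q / T_surr   # surroundings gain entropy: about +370 J/K
print(dS_sys + dS_surr) # net change is large and positive, so the flow is spontaneous

# Because the surroundings' heat capacity is effectively infinite, T_surr barely
# changes, so the same argument keeps working as more heat flows out.
[/code]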
 
  • #4
Thanks Kashishi, that helps a bit to look at some of the systems from a larger perspective.
 
  • #5
What is your interest in thermodynamics?
You seem to be mixing up a number of different ideas.

For instance

dm4b said:
In other areas of physics, systems seem to always tend towards the least energy state.

The energy of an isolated system is constant.

Kashishi said:
Things spontaneously evolve toward higher entropy.

Some systems remain at constant entropy; however, the energy may be minimised.
 
  • #6
Studiot said:
The energy of an isolated system is constant.

I wasn't necessarily talking about isolated systems, but let's go with that anyhow and maybe we'll see where I am really mixing things up ;-)

Let's take two separate systems for comparison that will be put inside a "box" (containing only a vacuum) that effectively isolates the systems from the external environment.

System 1

This is as outlined on page 11 of David Tong's Statistical Physics Lecture Notes

http://www.damtp.cam.ac.uk/user/tong/statphys.html

... a system of N non-interacting particles. Each particle is fixed in position and
can sit in one of two possible states which, for convenience, we will call spin up
and spin down

E_spindown = 0; E_spinup = epsilon.


Again, these N interacting particles are isolated within the box.

As you read through his example, he eventually shows that the entropy is maximized at the energy

E = N*epsilon/2, which is not the minimum energy.
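To see this concretely, here is a small Python sketch (my own addition, not from Tong's notes; N and epsilon are arbitrary illustration values) that counts the microstates of the two-state spins and locates the entropy maximum:

[code]
# Entropy of N non-interacting two-state spins as a function of total energy
# E = n_up * epsilon, with S(E) = ln Omega(E) (taking k_B = 1).
from math import lgamma, log

def log_omega(N, n_up):
    # ln of the binomial coefficient C(N, n_up) = number of microstates with n_up spins up
    return lgamma(N + 1) - lgamma(n_up + 1) - lgamma(N - n_up + 1)

N = 100
epsilon = 1.0
entropies = [(n_up * epsilon, log_omega(N, n_up)) for n_up in range(N + 1)]
E_star, S_max = max(entropies, key=lambda pair: pair[1])

print(E_star)      # 50.0 = N*epsilon/2: entropy peaks with half the spins up
print(S_max)       # about 66.8
print(N * log(2))  # about 69.3: S_max approaches N*ln(2) for large N
[/code]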

**Would this system, when isolated, eventually be found at this energy level which maximizes entropy? I'm assuming ... no.

**Presumably, energy conservation keeps it at whatever energy level the N particles were in upon isolation within the box. Then, why wouldn't some of the particles (which would seemingly prefer to be in the spin down state, where E=0) flip spins?

**If the system needs to be in contact with the external environment (i.e. The Universe) to do anything, what specific interactions with the environment help it achieve the state of maximum entropy calculated in Tong's paper? Tong seems to imply any interaction that raises the temperature will do the trick.

System 2

A single hydrogen atom in the first excited state placed within the box - that's it. (We're assuming we're so fast we can get it into the box before it cascades to the ground state!)

The electron will cascade back to the ground state emitting a photon.

**Although the hydrogen atom will tend towards the least energy state (the ground state) does the emission of the photon (the only one found in the box) increase the overall entropy? (Since the photon energy equals the energy difference between the ground state and first excited state, energy is presumably conserved here)

Hopefully, these examples will help elucidate some of the behavior I am trying to clear up for myself.
 
  • #7
I will have time to read Tong's paper tomorrow, if I can navigate to page 11, which I can't seem to do at the moment.

Meanwhile your quote says

a system of N non-interacting particles.

which you change to

these N interacting particles

If they are non-interacting, where does the energy go if a particle changes state from spin up to spin down?

Edit
I have now found the original and I assume my comment above refers to a typo since Tong's particles are non-interacting.

However Tong's system can never maximise or minimise energy since he talks about adding (and by implication subtracting) energy.

/Edit

Moving on to your second example

So the hydrogen atom tends to minimum energy.

So what? You did not define it to be the whole system, which still includes the expelled photon, which has energy. Does the energy of the entire system tend to a minimum?

The important thing is to be clear on some fundamentals.

1) You have to define the system, and you can't change horses part way through the analysis.

2) You have to define the system process.

3) You have to define the system boundary, which can be difficult for an infinite system.

With these definitions you can categorise open, closed, and isolated systems and calibrate minimum and maximum entropy and energy.

Many difficulties come about because of poor choice in (1), (2) or (3) and disappear once a sound definition is made.

edit
Quote by Studiot
The energy of an isolated system is constant.

I wasn't necessarily talking about isolated systems,


Yes, but you were talking about minimising system energy vs. maximising system entropy.
What does this mean for a non-isolated system?

/edit
 
  • #8
Studiot said:
The important thing is to be clear on some fundamentals.

1) You have to define the system, and you can't change horses part way through the analysis.

2) You have to define the system process.

3) You have to define the system boundary, which can be difficult for an infinite system.

With these definitions you can categorise open, closed, and isolated systems and calibrate minimum and maximum entropy and energy.

Many difficulties come about because of poor choice in (1), (2) or (3) and disappear once a sound definition is made.
First, I thought I was fairly clear on the systems - that they are closed/isolated. And, I never said they were infinite. That's what the "box" was for - to clearly delineate the boundary and effectively block the external environment.

If there is a better way to set these up, I'm all ears, though.

Studiot said:
I will have time to read Tong's paper tomorrow, if I can navigate to page 11, which I can't seem to do at the moment.

Meanwhile your quote says "a system of N non-interacting particles," which you change to "these N interacting particles." If they are non-interacting, where does the energy go if a particle changes state from spin up to spin down?

Edit
I have now found the original and I assume my comment above refers to a typo since Tong's particles are non-interacting.

However Tong's system can never maximise or minimise energy since he talks about adding (and by implication subtracting) energy.

/Edit

Yes that was a typo on my part.

But, it seems like you didn't read my post very carefully.

I said I assumed the energy would not, or rather could not, change. I highlighted the real question I had there, which was:

Why would the individual particles that were found in the spin up state (E=epsilon) not tend towards a minimum energy configuration of spin down (E=0) for this setup? I was seeking clarification on that.

I also specified that Tong seems to indicate that, if the system were not isolated or closed, any change in the ambient temperature would change the energy and entropy of the system. By this, however, he seems to imply that the system would not evolve on its own (while isolated and closed). I was seeking clarification on this as well.

Studiot said:
Moving on to your second example

So the hydrogen atom tends to minimum energy.

So what? You did not define it to be the whole system, which still includes the expelled photon, which has energy. Does the energy of the entire system tend to a minimum?
Again, it seems like you did not read my post very carefully, or perhaps I'm not explaining things well.

I stated that energy seems to be conserved under the emission of the photon and state transition. Is that correct? If so, my question was does that state transition (with accompanying photon emission) raise the entropy of this closed and isolated system?
 
  • #9
I don't know whether this is helping or not, but at least I tried.

Say you have a small bar (1 kg) of hot copper (100 °C) and a large amount of cool water (100 kg at 25 °C), and you bring them together. You know m[itex]_{1}[/itex]Cp[itex]_{1}[/itex]ΔT[itex]_{1}[/itex] = m[itex]_{2}[/itex]Cp[itex]_{2}[/itex]ΔT[itex]_{2}[/itex], and you can look up each Cp.

Assuming there is no heat loss, you calculate the final temperature, then the entropy change of the copper and of the water separately (the copper's should be negative and the water's positive), and then you sum them: you will get a positive number.

There was no heat loss, which means the total energy is conserved, yet the total entropy change is positive. That means that wherever there is heat transfer there can be a positive total entropy change, and if there is, the process is irreversible.
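Here is a quick Python sketch of that calculation (assuming the usual handbook heat capacities, about 385 J/(kg K) for copper and 4186 J/(kg K) for water; those values are my addition, not given in the post above):

[code]
# 1 kg of copper at 100 C dropped into 100 kg of water at 25 C, no heat loss.
from math import log

m_cu, cp_cu, T_cu = 1.0, 385.0, 373.15      # copper: mass (kg), Cp (J/kg/K), T (K)
m_w,  cp_w,  T_w  = 100.0, 4186.0, 298.15   # water

# Energy balance: heat lost by the copper equals heat gained by the water.
T_f = (m_cu * cp_cu * T_cu + m_w * cp_w * T_w) / (m_cu * cp_cu + m_w * cp_w)

# Entropy change of each along a reversible path: integral of dQ/T = m*Cp*ln(Tf/Ti)
dS_cu = m_cu * cp_cu * log(T_f / T_cu)   # negative, the copper cools
dS_w  = m_w  * cp_w  * log(T_f / T_w)    # positive, the water warms

print(T_f - 273.15)     # final temperature: barely above 25 C
print(dS_cu, dS_w)      # roughly -86 J/K and +97 J/K with these Cp values
print(dS_cu + dS_w)     # total is about +10 J/K > 0: the transfer is irreversible
[/code]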

That is the practical side. Philosophically it is harder for me to put into English; think of it as a randomization of motion, maybe: if molecules are heated, their movement becomes even more random, so the entropy moves toward a higher value.

I hope this helps. ^^
 
  • #10
I think I just got the answer on what I was trying to get at with all this. It comes from the Quantum Field Theory book I am studying - Peskin and Schroeder, page 364-365, Chapter 11.

At zero temperature the thermodynamic ground state is the state of lowest energy, but at non-zero temperature we still have a geometrical picture of the preferred thermodynamic state: It is the state that minimizes the Gibbs free energy ... The thermodynamically most stable state is the minimum of G

"Nature is thrifty", as someone once said, and seems to operate under these over-arching principles like Least Action, etc. I was trying to figure out if there was something like that here too, and I believe this is what I was looking for.

My stupid example systems probably weren't helping.

If anybody has any gems of wisdom related to all this, would be interesting to hear.
 
  • #11
I note that in Tong's paper he recommends the classic

Classical Thermodynamics by Pippard.

This really is a thoughtful book. In it the author examines the interesting question:

Consider the irreversible expansion of gas held in a chamber at V1 into a larger adjacent chamber containing a vacuum so it then occupies a volume V2.
It possesses entropy S1

Following the expansion the gas now occupies the enlarged volume V2.

As a result the entropy has increased to S2

Now the molecules of gas move about, separate and clump together, forming local increases and decreases in concentration.

There is a statistical chance, however small, that all the original molecules will find themselves in a clump in the original volume.

What is the new entropy of this situation?
 
  • #12
Studiot said:
I note that in Tong's paper he recommends the classic

Classical Thermodynamics by Pippard.

This really is a thoughtful book.

Thanks for the recommendation. I was already wondering where to go next after Tong's lecture notes. I'll have to check this one out.

Studiot said:
In it the author examines the interesting question:
Consider the irreversible expansion of gas held in a chamber at V1 into a larger adjacent chamber containing a vacuum so it then occupies a volume V2.
It possesses entropy S1

Following the expansion the gas now occupies the enlarged volume V2.

As a result the entropy has increased to S2

Now the molecules of gas move about, separate and clump together, forming local increases and decreases in concentration.

There is a statistical chance, however small, that all the original molecules will find themselves in a clump in the original volume.

What is the new entropy of this situation?

I'm glad you brought this up, because I was just thinking about this very thing and had some questions on it.

Tong mentions how volume and entropy are extensive quantities and therefore scale as:

S(λE,λV,λN) = λS(E,V,N)

So it makes sense that, as you stated, V1 would have S1 and V2 would have S2.

Since the configuration with all the molecules clumped in the original volume is still only constrained by V2, I was guessing that the average entropy (<S2>) wouldn't actually change.

This would just be a (very, very unlikely) fluctuation and the molecules would quickly go back to a configuration that maximizes the number of states for volume V2, with entropy S2.

I was wondering if it would be correct to say the average entropy <S2> stays the same, but the system will experience fluctuations around that value. Really, really unlikely fluctuations like the one above would be exceedingly rare - like happening on time scales longer than the life of the Universe.

Not sure on any of that, though. Just guessing ;-)
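As a rough numerical check of that guess, here is a Python sketch with an arbitrary volume ratio and particle numbers of my own choosing, not values from Pippard or Tong:

[code]
# Free expansion of an ideal gas into vacuum: entropy increase and the chance
# of a spontaneous fluctuation that puts all N molecules back into V1.
from math import log

k_B = 1.380649e-23   # J/K
ratio = 2.0          # assume V2 = 2 * V1 for illustration

for N in (10, 100, 1000, 6.022e23):          # up to roughly a mole of molecules
    dS = N * k_B * log(ratio)                # Delta S = N k_B ln(V2/V1)
    log10_p = -N * log(ratio) / log(10)      # P(all back in V1) = (V1/V2)^N
    print(N, dS, "P ~ 10^%.3g" % log10_p)

# For a mole of gas, P ~ 10^(-1.8e23): the average entropy stays S2, and the
# recurrence is so improbable that it never matters in practice.
[/code]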
 
  • #13
Bump.

So, nobody knows the answer to Studiot's question in post #11?
 
  • #14
Yes your thinking is pretty much on the button.

The entropy in the second situation is still S2.

Now how about this situation.

Consider two isolated chambers at different temperatures. The chambers are separated by a removable adiabatic wall that does not allow the passage of work or matter.

The wall is temporarily removed and heat, q, flows from the hotter chamber at T1 to the cooler at T2.

Then the wall is replaced.

So you again have two isolated chambers.

In one the entropy has increased by q/T2 and in the other decreased by q/T1.

So the hotter chamber is an isolated system in which the entropy has been reduced.

Comments?
 
  • #15
Studiot said:
Yes your thinking is pretty much on the button.

The entropy in the second situation is still S2.

Now how about this situation.

Consider two isolated chambers at different temperatures. The chambers are separated by a removable adiabatic wall that does not allow the passage of work or matter.

The wall is temporarily removed and heat, q, flows from the hotter chamber at T1 to the cooler at T2.

Then the wall is replaced.

So you again have two isolated chambers.

In one the entropy has increased by q/T2 and in the other decreased by q/T1.

So the hotter chamber is an isolated system in which the entropy has been reduced.

Comments?

Well, here's my best guess again.

Once the wall is removed, the two systems are in contact and will tend towards an equilibrium where eventually we'll have T1 = T2. The number of states of the combined system will be maximized, i.e. the entropy will increase.

Now, if we put the wall back in after equilibrium is achieved, no biggy.

If we put the wall back in before equilibrium is achieved, that's still okay. We basically interrupted the equilibration process before it was done. The "hotter chamber" system, which is now isolated again, is not the same system it was before. The two systems would also have mixed somewhat while the wall was out, since they were in the middle of reaching chemical equilibrium as well. And the overall entropy (of both systems combined) has still increased.
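A Python sketch of the bookkeeping in that scenario (T1, T2 and q are illustrative values of my own; the ±q/T figures assume q is small enough that neither temperature changes appreciably):

[code]
# Two chambers briefly in thermal contact, then re-isolated (small-q approximation).
T1, T2 = 400.0, 300.0   # K: hotter and cooler chamber temperatures (illustrative)
q = 50.0                # J of heat that flowed before the wall went back in

dS_hot  = -q / T1       # the re-isolated hot chamber ends up with LESS entropy
dS_cold = +q / T2       # the cold chamber gains more entropy than the hot one lost

print(dS_hot, dS_cold)      # -0.125 J/K and about +0.167 J/K
print(dS_hot + dS_cold)     # about +0.042 J/K > 0: the second law constrains the
                            # whole, not each subsystem separately
[/code]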
 
  • #16
Again your thinking is good.

The object of these examples is to highlight the complications that can arise, how far-reaching the conventional statements of the Laws are, and how difficult it is to produce alternatives that cover as much ground.

There was a thread here about a situation where one cannot determine equilibrium from maximum entropy. In that case another criterion, minimum energy, is required.

Post#7 here
https://www.physicsforums.com/showthread.php?t=647707&highlight=maximum+entropy&page=2
 
  • #17
Studiot said:
Again your thinking is good.

The object of these examples is to highlight the complications that can arise, how far-reaching the conventional statements of the Laws are, and how difficult it is to produce alternatives that cover as much ground.

There was a thread here about a situation where one cannot determine equilibrium from maximum entropy. In that case another criterion, minimum energy, is required.

Post#7 here
https://www.physicsforums.com/showthread.php?t=647707&highlight=maximum+entropy&page=2


They make fun thought problems too!

I've been reading QFT and QM and some GR rather regularly for a while, but I haven't looked at much Statistical Mechanics since school, about 15 years ago. I've been enjoying reading it again.


I'll check out that thread later, thanks. The title sounds interesting too, as I have been having some questions about the free energy as well.
 
  • #18
Here's another simple way to think about it. An open system doesn't actually evolve to minimum energy; rather, it evolves toward a temperature that matches the temperature of the surroundings. In deep space, the temperature of the surroundings is about 2.7 K, which is very cold. Most systems we care about are at a much higher temperature, so heat leaks out. That is why it looks like the system evolves towards minimum energy.

For comparison, an excited hydrogen atom (n=2) sits 10.2 eV above the ground state, which corresponds to a temperature of about 118,000 K. On Earth, the hydrogen atom is constantly colliding with other atoms, so it will evolve toward thermal equilibrium at somewhere around 300 K, and almost all the hydrogen atoms will end up in the ground state (ignoring the obvious possibility of forming molecules).
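A quick Python sketch of that comparison (my own back-of-envelope numbers using the Boltzmann factor; degeneracies and molecule formation are ignored, as above):

[code]
# Rough Boltzmann-factor estimate of the n=2 population of hydrogen at room temperature.
from math import exp

k_B = 8.617e-5    # eV/K
dE = 10.2         # eV, energy gap from n=1 to n=2

print(dE / k_B)               # about 1.2e5 K, the "temperature equivalent" quoted above
print(exp(-dE / (k_B * 300))) # about 1e-172: at 300 K the excited state is essentially empty
[/code]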
 

FAQ: Physics: Reconciling Entropy & Energy

1. How do entropy and energy relate to each other in physics?

Entropy and energy are closely related concepts in physics. Entropy is a measure of the disorder or randomness in a system, while energy is the capacity to do work. In most processes, an increase in total entropy corresponds to a decrease in the energy available to do useful work (the free energy), and vice versa.

2. Why is the concept of entropy important in understanding thermodynamics?

In thermodynamics, entropy is a fundamental concept that helps us understand the direction of energy transfer and the efficiency of energy conversion. It allows us to predict the behavior of systems and determine the most likely outcomes.

3. Can energy be created or destroyed?

According to the law of conservation of energy, energy cannot be created or destroyed, but it can be transformed from one form to another. This means that the total amount of energy in an isolated system remains constant.

4. How does the second law of thermodynamics relate to entropy and energy?

The second law of thermodynamics states that the total entropy of an isolated system never decreases over time. This means that usable energy is constantly being dispersed, leading to a decrease in the system's ability to do work.

5. Can entropy be reversed or decreased?

It is possible to decrease the entropy of a local system, but only with an input of energy from an external source and a larger accompanying entropy increase elsewhere. For an isolated system, the total entropy never decreases over time, so the increase cannot be undone overall.
