Is Entropy the inexorable conversion of potential to kinetic energy?

In summary, the conversation discusses the concept of entropy and its relationship to potential and kinetic energy. Two important formulations of entropy, thermodynamic and statistical, are briefly explained. The conversation also addresses the idea of entropy as the "heat death of the universe" and its origins in Lord Kelvin's proposal. Ultimately, counterexamples raised in the thread show that while the conversion of potential to kinetic energy often accompanies an increase in entropy, entropy is not itself that conversion.
  • #1
Tertius
I know the math behind these, and I'm happy to use more precise language if needed, I just wanted to get some input on this sweeping generalization that entropy is the conversion of potential to kinetic energy.
A brief summary of two important branches of entropy:
1) thermodynamic - the total heat released by any isolated process is always positive
2) statistical - the total number of ways to arrange any isolated system always increases

Thermodynamically, there is no internal source of energy for any isolated system other than its own internal mass/potential energy. Mass decays and becomes partly kinetic, and forces pull and push in such a way as to maximize kinetic over potential energy. There is no version of any physical process where the final potential energy of an isolated system is greater than when it started.

Statistically, this is just a different view of the same concept. The number of ways you can configure a system depends on the energy states available to the constituents of the system. That means, as you add kinetic energy (as in any real, physical process), you will inevitably increase the number of energy states available to the particles and have a much larger value for the total number of possible states.

These two definitions seem to strongly suggest that potential energy => kinetic energy is the process of entropy. If any isolated system of any kind were to emerge from a real physical process with greater potential energy than it started with, it would violate entropy.
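The state-counting argument above can be made concrete with a toy model. As an illustrative sketch (the Einstein solid is my choice of example, not something from the thread), the number of ways to distribute q identical energy quanta among N oscillators is the stars-and-bars count C(q + N - 1, q), which grows explosively as energy is added:

```python
from math import comb

def multiplicity(quanta, oscillators):
    """Number of ways to distribute `quanta` identical energy quanta
    among `oscillators` distinguishable oscillators (Einstein solid):
    the stars-and-bars count C(q + N - 1, q)."""
    return comb(quanta + oscillators - 1, quanta)

# More kinetic energy (more quanta) -> vastly more microstates,
# hence higher statistical entropy k*ln(multiplicity).
for q in (0, 10, 100):
    print(q, multiplicity(q, 50))
```

With zero quanta there is exactly one arrangement (entropy zero); each added quantum multiplies the count, which is the statistical picture of entropy increasing with available kinetic energy.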

This also suggests that any system that no longer has mass/potential energy to be converted into kinetic energy has reached its maximal state of entropy. I.e. a bag of photons can't change its entropy, because it is already at a maximum.

Thoughts?
 
  • #2
Tertius said:
entropy is the conversion of potential to kinetic energy.
Nope.

Now, generally speaking, the number of available kinetic-energy microstates is often very large, so for many systems converting potential energy into kinetic energy results in an increase in entropy. So entropy can explain the conversion of potential to kinetic energy, but you can't say that entropy is that conversion.
 
  • #3
Thanks for the reply.

There is no example of an isolated process that does not end with exactly as much increased kinetic energy as decreased potential energy (including mass). The entropy increase will be exactly proportional to the gain of the kinetic energy of the constituents of the system. This is true regardless of the starting state of the system, as long as the starting state has greater than 0 potential energy. Thus, any system without potential energy (including mass) can no longer increase in entropy (i.e. a box of photons).

I am not saying we need to change any definitions, just specifically if there is an example that goes against my statement above.
 
  • #4
@Tertius - An ideal swinging pendulum does not have an oscillating entropy, at least I don't think it can be said to. Would you argue otherwise?
 
  • #5
@Grinkle - I think you are right. I analyzed it briefly both ideally and non-ideally.

Ideal - there is no entropy production: the pendulum has a perfect string with no internal friction and no air resistance. So, yes, the entropy would not oscillate with the pendulum because it remains constant.

Non-ideal - the string has internal friction and there is air resistance. Some of the potential energy in each swing is diverted from becoming kinetic movement of the pendulum to becoming kinetic movement of the air or internal heat in the string. So, there would be a consistent increase of entropy on each swing, as expected. This would lead to a damping effect where the pendulum slows down, eventually stopping with no potential energy left. But there would be an internal potential/kinetic oscillation persisting as the entropy steadily increases, and the entropy would not oscillate with it. So, yes, I believe your example is what I was looking for.
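A quick numerical sketch of the non-ideal case (all parameters here are made up for illustration): integrate a damped pendulum, book-keep the heat shed by the damping term, and treat the surroundings as a bath at fixed temperature so each step adds dS = dQ/T. The mechanical energy oscillates between potential and kinetic, while the running entropy only ever goes up:

```python
import math

# Damped pendulum: theta'' = -(g/L) sin(theta) - c * theta'
# Hypothetical parameters; entropy dumped into surroundings at
# a fixed temperature T via dS = dQ/T.
g, L, m, c, T = 9.81, 1.0, 1.0, 0.2, 300.0
theta, omega, dt = 1.0, 0.0, 1e-3

E0 = m * g * L * (1 - math.cos(theta))   # initial mechanical energy (all potential)
S = 0.0
S_history = []
for _ in range(20000):                   # 20 s of simulated time
    alpha = -(g / L) * math.sin(theta) - c * omega
    omega += alpha * dt                  # semi-implicit Euler step
    theta += omega * dt
    dQ = c * m * L * L * omega * omega * dt   # heat dissipated this step
    S += dQ / T
    S_history.append(S)

E_final = 0.5 * m * (L * omega) ** 2 + m * g * L * (1 - math.cos(theta))
# Mechanical energy lost matches the entropy bookkeeping: Q = T * S.
print(E0 - E_final, T * S_history[-1])
```

The dissipation rate c·m·L²·ω² follows from differentiating the mechanical energy along the equation of motion, so the entropy record is monotone even though kinetic and potential energy trade places every half swing.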

Thanks for the insight
 
  • #7
Tertius said:
Thanks for the reply.

There is no example of an isolated process that does not end with exactly as much increased kinetic energy as decreased potential energy (including mass). The entropy increase will be exactly proportional to the gain of the kinetic energy of the constituents of the system. This is true regardless of the starting state of the system, as long as the starting state has greater than 0 potential energy. Thus, any system without potential energy (including mass) can no longer increase in entropy (i.e. a box of photons).

I am not saying we need to change any definitions, just specifically if there is an example that goes against my statement above.
I have a lot of examples:

1: A flywheel powered toy moves without friction uphill, until it stops.
(Macroscopic kinetic energy becomes macroscopic potential energy)

2: A flywheel powered toy moves, until stopped by friction.
(Macroscopic kinetic energy becomes microscopic kinetic energy and microscopic potential energy)

3: A gas spring powered toy moves uphill.
(Microscopic kinetic energy becomes macroscopic potential energy)

4: A gas spring powered toy moves, until stopped by friction.
(Microscopic kinetic energy becomes microscopic kinetic energy and microscopic potential energy)

5: A gas spring powered toy accelerates, then moves without friction.
(Microscopic kinetic energy becomes macroscopic kinetic energy)
If matter were just continuous matter no matter how far you zoomed into it, then all my examples mentioning microscopic energy would work very differently - more like according to your idea, maybe.
 
  • #8
These are clever examples. I think your final statement is correct.

However, who put in the work to get the flywheel started or the gas pressurized in each of these cases? In each case, we will find that the initial potential energy was always greater than the final potential energy. And that difference is proportional to the entropy change. Because all of the potential energy lost will have become kinetic (cue the heat death..)

In the case of the flywheel specifically, if we consider it to be kinetic energy, there was some type of potential energy in the background that got it started. For example, the chemical processes required to run our muscular system, or a battery and a motor, etc...
 
  • #9
What about this (a thought experiment):
You have an empty container that can be closed and completely isolated from the surroundings.
You inject a stream of particles into this container and then close off the container.
Now the system is isolated and the actual experiment starts, I would say.
During injection the system was not isolated.

The particles constitute an ideal gas, so no potential energy and no forces other than elastic collisions.
Initially the distribution of velocities is quite narrow, but due to collisions you end up with the Boltzmann distribution after a while and this process actually maximizes entropy.

So neither potential (doesn't exist) nor total kinetic energy of the system has changed, but entropy has increased.
 
  • #10
This is interesting. If the particles are injected into the chamber at their particular velocities, the total Q contained would remain constant, so thermodynamically the injected system would compute the same total heat (total kinetic energy of particles would be conserved), and thus the same entropy. Statistically, the entropy would appear to increase as the energy is spread more evenly through the particles, but this would really just be saying it isn't equilibrated yet. As it equilibrates, the number of available microstates comprising the macrostate would increase until it reaches a maximal value.
Thermodynamically, you would immediately compute dQ/T to be the maximal entropy value.
Statistically, the system would approach the same value as the thermodynamic answer once it were equilibrated.
 
  • #11
@Tertius I think considering entropy for a system (or some part of a system) that is not at equilibrium can be tricky.

Look here -

https://en.wikipedia.org/wiki/Non-equilibrium_thermodynamics

A snip -

"Another fundamental and very important difference is the difficulty or impossibility, in general, in defining entropy at an instant of time in macroscopic terms for systems not in thermodynamic equilibrium; it can be done, to useful approximation, only in carefully chosen special cases, namely those that are throughout in local thermodynamic equilibrium."
 
  • #12
Tertius said:
This is interesting. If the particles are injected into the chamber at their particular velocities, the total Q contained would remain constant, so thermodynamically the injected system would compute the same total heat (total kinetic energy of particles would be conserved), and thus the same entropy. Statistically, the entropy would appear to increase as the energy is spread more evenly through the particles, but this would really just be saying it isn't equilibrated yet. As it equilibrates, the number of available microstates comprising the macrostate would increase until it reaches a maximal value.
Thermodynamically, you would immediately compute dQ/T to be the maximal entropy value.
Statistically, the system would approach the same value as the thermodynamic answer once it were equilibrated.
I have to make the following comments: Q is heat transferred. It's not something contained in the gas and it's not a state function. You can see that if you compress an ideal gas reversibly at constant temperature. You do work on the gas and the gas gives off heat. The only thing that remains constant is the internal energy U.

What stays the same in my thought experiment is also the internal energy U, which is the total kinetic energy and proportional to the temperature for an ideal gas in equilibrium.
This does not mean that entropy stays the same!

dQ / T is the change of entropy of the gas when the heat dQ is transferred to the system at temperature T in a reversible process. It is not the maximal entropy value.

Now about entropy: I do realize that the gas is initially not in equilibrium and I would not try to calculate its entropy. Here I'm also answering Grinkle.
My argument is the following: I've defined the system as completely isolated straight after injecting the gas atoms. Now the gas atoms are left to themselves at constant internal energy.
This is the basis of the microcanonical derivation of statistical physics.
Eventually the atoms will arrive at the Maxwell-Boltzmann distribution of velocities (if the gas is sufficiently dilute and T is high enough), and this distribution is characterized by a maximum value for entropy (due to a maximum number of microstates), as you correctly point out and discuss.
Notice that I don't need to calculate entropy for the gas while it is not in equilibrium in order to know that entropy increases when the gas approaches equilibrium (since entropy has to reach a maximum in equilibrium).

I would say that both thermodynamic and statistical entropy increase in my example.
 
  • #13
Good point, I should have been more accurate in discussing Q. It is a path-dependent variable, not a state variable, as you said.
In your scenario (isolating the system after injection), there is no external work and there is no potential energy of any kind because they are perfectly elastic particles without any electrostatic interactions (and no internal radioactive decay). So, the only thing that can happen is the system will relax to its Boltzmann distribution of velocities.
Before it is at equilibrium, we would use Gibbs entropy to compute the statistical entropy. As it approaches equilibrium, the Gibbs entropy value will steadily increase until it is equal to the Boltzmann entropy (which is computed only at equilibrium). So statistically the entropy would appear to increase.
Thermodynamically, I don't know of a way to compute entropy not at equilibrium. So, to accurately determine the thermodynamic entropy of this system, we would need to follow its path through heating and injection, rather than determining it isolated at injection. We would then see that some amount of potential energy was used to heat and inject the particles, which is where they obtained their kinetic energy.

Now for the dangerous not-at-equilibrium discussion: this system has a fixed total internal energy that cannot change because there is no external work and no internal reactions happening to change it. Since the volume of the container is fixed, the change in internal energy is equal to q_V, the heat exchanged at constant volume. Since the internal energy does not change, q_V is zero, and thus the entropy change computed thermodynamically is also zero.
Basically, to compute the thermodynamic entropy of this system, this formula can be used at equilibrium:
$$dS = \frac{dE}{T} + \frac{p}{T}dV - \frac{\mu}{T}dN$$

None of the required changes (dE, dV, or dN) can occur in this system.

Do you know of a way to compute entropy thermodynamically for such a system before it reaches equilibrium? That would be very interesting.
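For the equilibrium end state, at least, the thermodynamic entropy of a dilute monatomic ideal gas can be written down explicitly via the Sackur-Tetrode equation. A small sketch (my illustration with helium-like numbers, not part of the thread; valid only once the gas has relaxed to equilibrium):

```python
import math

# Sackur-Tetrode entropy for a monatomic ideal gas in equilibrium:
# S = N k [ ln( (V/N) * (4 pi m U / (3 N h^2))^(3/2) ) + 5/2 ]
k = 1.380649e-23      # Boltzmann constant, J/K
h = 6.62607015e-34    # Planck constant, J s
N = 6.022e23          # number of particles (about one mole)
m = 6.646e-27         # particle mass, kg (helium atom)
V = 0.0224            # container volume, m^3
T = 273.15            # equilibrium temperature, K
U = 1.5 * N * k * T   # internal energy of a monatomic ideal gas

S = N * k * (math.log((V / N) * (4 * math.pi * m * U / (3 * N * h * h)) ** 1.5) + 2.5)
print(S / (N * k))    # entropy per particle, in units of k
```

For helium near standard conditions this gives roughly 15 k per particle, consistent with the tabulated molar entropy of helium, which is a useful sanity check that the statistical and thermodynamic answers agree at equilibrium.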
 
  • #14
Tertius said:
Thermodynamically, I don't know of a way to compute entropy not at equilibrium. So, to accurately determine the thermodynamic entropy of this system, we would need to follow its path through heating and injection, rather than determining it isolated at injection. We would then see that some amount of potential energy was used to heat and inject the particles, which is where they obtained their kinetic energy.

Now for the dangerous not-at-equilibrium discussion: this system has a fixed total internal energy that cannot change because there is no external work and no internal reactions happening to change it. Since the volume of the container is fixed, the change in internal energy is equal to q_V, the heat exchanged at constant volume. Since the internal energy does not change, q_V is zero, and thus the entropy change computed thermodynamically is also zero.
Basically, to compute the thermodynamic entropy of this system, this formula can be used at equilibrium:
$$dS = \frac{dE}{T} + \frac{p}{T}dV - \frac{\mu}{T}dN$$
None of the required changes (dE, dV, or dN) can occur in this system.

Do you know of a way to compute entropy thermodynamically for such a system before it reaches equilibrium? That would be very interesting.
I would normally only calculate entropy changes in thermodynamics and to do that I need an initial and a final equilibrium state. Then I also need some reversible path between the states. (note: The system doesn't have to take this path.)

The fundamental equation that you give will not be useful because T is not defined for non-equilibrium states, so you can't really conclude anything from it concerning this process.

My example was only intended to show that entropy can increase even if the total kinetic energy in an isolated system doesn't change. What happened before the system was isolated doesn't matter for the argument, I would say.
 
  • #15
I agree, that equation only holds at equilibrium. It seems it is not possible to compute the entropy of your example thermodynamically. Statistically, yes, I would agree it is an example of how the process of equilibration has an attendant increase in entropy.
But that situation would only happen if we decide the system is isolated after heating and injection.
I believe the reason it can only be computed statistically and not thermodynamically is because it is not a complete process. It is only the last half. Where the particles came from and what heated them would certainly give us a way to calculate the thermodynamic entropy which would be equal to the Boltzmann entropy.
 
  • #16
Tertius said:
I agree, that equation only holds at equilibrium. It seems it is not possible to compute the entropy of your example thermodynamically. Statistically, yes, I would agree it is an example of how the process of equilibration has an attendant increase in entropy.
But that situation would only happen if we decide the system is isolated after heating and injection.
I believe the reason it can only be computed statistically and not thermodynamically is because it is not a complete process. It is only the last half. Where the particles came from and what heated them would certainly give us a way to calculate the thermodynamic entropy which would be equal to the Boltzmann entropy.
It might be interesting to limit your original question to processes where the initial and final state are equilibrium states.
 
  • #17
Hi. I'm 74 and I've never been on a forum like this. Here goes...

What you are discussing is heat energy and, more generally, a theory of heat. Instead of dividing a system's energy into kinetic and potential energy, I would ask you to consider the possibility that heat energy exists in only two domains:
  • Heat of thermal motion (mass domain energy): think of the Maxwell-Boltzmann velocity distribution for an ideal gas at an equilibrium temperature. (Incidentally, it was one of Helmholtz's great contributions to physics that radiant heat energy decays to motion and that this is just another form of heat energy; ref. Planck's lectures, p. 106.)
  • Heat radiation (radiation domain energy): Planck's equilibrium blackbody spectrum, again at an equilibrium temperature.
I want you to understand that the movement of heat energy between these two domains, in either direction, is seamless as the system moves toward equilibrium. Consider this thought experiment. Heat two identical objects, one by exposure to sunlight, the other by friction. If the same heat energy content is added to both as closed systems, the final maximum-entropy state of each will be identical, with no indication of the method used to heat it. The First Law requires that the total energy content of each closed system is constant. The Second Law requires the evolution of both closed systems to a final maximum-entropy state in the mass domain and in the radiation domain within each closed system, and also a maximum-entropy state between these two domains. The final equilibrium state in both cases is identical, or at least their quasi-equilibrium states are always moving toward identical states.

Let me bring this back to your discussion of kinetic energy. Consider again a closed system. It contains mass domain kinetic energy and radiation domain blackbody energy. This system's mass domain energy is entirely kinetic. It is always in quasi-equilibrium with the heat energy in the radiation domain and its equilibrium blackbody spectral distribution. What I want you to see is how mass is an integral part of kinetic energy, whereas radiation domain energy is mass independent. It can exist in the vacuum of space, in the absence of the mass domain's thermal motion. Examples are the cosmic background radiation, or light traveling from a distant galaxy. If you want to discuss the entropy dynamics of this system, I'd be pleased to oblige.
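To attach numbers to the "radiation domain", here is a small sketch (my own, using standard constants, not something from the post) of the energy and entropy of equilibrium blackbody radiation in a cavity, using the textbook relations u = aT^4 and s = (4/3)aT^3:

```python
import math

# Radiation constant a = 8 pi^5 k^4 / (15 h^3 c^3)
k = 1.380649e-23      # Boltzmann constant, J/K
h = 6.62607015e-34    # Planck constant, J s
c = 2.99792458e8      # speed of light, m/s
a = 8 * math.pi ** 5 * k ** 4 / (15 * h ** 3 * c ** 3)

def radiation_energy(T, V=1.0):
    """Energy of equilibrium blackbody radiation in volume V (J): a T^4 V."""
    return a * T ** 4 * V

def radiation_entropy(T, V=1.0):
    """Entropy of equilibrium blackbody radiation in volume V (J/K): (4/3) a T^3 V."""
    return (4.0 / 3.0) * a * T ** 3 * V

# A photon gas carries entropy set entirely by T and V, with no mass at all.
print(radiation_energy(300.0), radiation_entropy(300.0))
```

Note the identity U = (3/4) T S for the photon gas; its entropy is fixed by temperature and volume alone, which illustrates the post's point that radiation-domain energy needs no mass-domain carrier.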

I'll try this once...
 

  • #18
I agree that heat comes in those two forms, the kinetic energy of massive particles and light. Would you say that light is not also kinetic energy? I would argue that it is, because it can exert momentum on its surroundings.

My main argument is simply: from cradle to grave, the universe is losing potential energy (mass, chemical potential, etc.) and becoming more kinetic (movement and light). This appears to be the main consequence of entropy in the universe. No system, followed through a complete transformation, exits with more potential energy (or mass) than it started with. This is the same process the universe as a whole is experiencing.

Hopefully my motivation is clear for defining it in those terms.
 

FAQ: Is Entropy the inexorable conversion of potential to kinetic energy?

What is entropy?

Entropy is a measure of the disorder or randomness in a system. It is a concept in thermodynamics that describes the amount of energy that is unavailable for work in a system.

How does entropy relate to potential and kinetic energy?

Entropy is not itself the conversion of potential energy to kinetic energy, but the two often go together: as energy is transferred and transformed within a system, potential energy typically decreases while kinetic energy increases, and this conversion is usually accompanied by an overall increase in entropy.

Is entropy always increasing?

According to the second law of thermodynamics, the total entropy of an isolated system never decreases over time. This means that the amount of disorder or randomness in such a system tends to increase, and in most real processes potential energy continues to be converted into kinetic energy.

How does entropy affect the universe?

The increase in entropy over time has a profound impact on the universe. As the universe ages, the amount of usable energy decreases, leading to a gradual decrease in the ability to do work. This is known as the heat death of the universe, where all energy will eventually be evenly distributed and no work can be done.

Can entropy be reversed?

While the overall trend of entropy is to increase, there are certain processes that can temporarily reverse it. For example, a refrigerator uses energy to decrease the entropy of its contents, creating a more ordered system. However, this ultimately results in an overall increase in entropy in the universe.
