Entropy Reversal: Proven by Physics and Science

  • Thread starter: antonima
  • Tags: Entropy
In summary, the conversation discusses the possibility of entropy reversal in a closed system, which is supported by the fluctuation theorem. However, some argue that this contradicts the second law of thermodynamics. The concept of entropy is also explained, including how ordered systems can emerge spontaneously when in contact with highly disordered systems. Examples such as a particle suspended in a liquid and rolling a loaded die are used to illustrate these ideas. Ultimately, the conversation highlights the need to consider the entire system when discussing entropy.
  • #1
antonima
I would like to make a stand regarding the topic of entropy reversal. Entropy CAN in fact be reduced in a closed system, and this happens spontaneously according to the fluctuation theorem. It was published in a well-known scientific journal over a decade ago, and had been established before that by many notable physicists.

Take for instance the case of a particle suspended in a liquid. The particle will rise and fall in the liquid column, thus gaining and losing energy, which comes from the thermal energy of Brownian motion. The system will cool when the particle rises and warm when the particle descends. No matter how large the number of particles, at equilibrium there is an equal chance that the system's entropy will momentarily increase or decrease, i.e. that the system will heat or cool.
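A minimal numerical sketch of this picture, assuming an overdamped Langevin model with made-up, illustrative parameters (not from any published experiment):

```python
import numpy as np

# Sketch: a Brownian particle sedimenting in a liquid at ~300 K.
# On any step where the particle moves up, the potential energy it
# gains is drawn from the thermal bath (a momentary, local entropy
# decrease); downward steps return energy to the bath as heat.
rng = np.random.default_rng(0)
kT = 4.1e-21          # thermal energy at ~300 K, joules
gamma = 1.0e-8        # drag coefficient, kg/s (hypothetical)
m_eff = 1.0e-17       # buoyant mass of the particle, kg (hypothetical)
g = 9.81              # gravitational acceleration, m/s^2
dt = 1.0e-4           # time step, s

D = kT / gamma                  # Einstein relation
v_drift = -m_eff * g / gamma    # steady sedimentation velocity

steps = 100_000
dz = v_drift * dt + np.sqrt(2 * D * dt) * rng.standard_normal(steps)
energy_from_bath = m_eff * g * dz   # > 0 when the bath lifts the particle

print("fraction of steps where the particle rises:", np.mean(dz > 0))
print("mean energy drawn from the bath per step (J):", energy_from_bath.mean())
```

Upward steps occur nearly half the time; it is only the tiny average bias toward descending that the second law speaks to.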
 
  • #2
The biggest problem I see is that your definition is flawed.

If your "system" does not include the particle, then you have 2 open systems--one being the particle, the other being its surrounding/the rest of the liquid. All they're doing is exchanging entropy back and forth.

If your system includes the particle + the rest of the liquid, then in this closed system the total entropy would not decrease.

In either case, the entropy of a closed system does not decrease. I doubt any well-known scientific journal or notable physicist would have overlooked such a contradiction in definitions.
 
  • #3
danmay said:
I doubt any well-known scientific journal or notable physicist would have overlooked such a contradiction in definitions.

How about Jean Perrin? The man who won a Nobel Prize for experimentally validating Einstein's work? It was over 100 years ago that he used essentially this very example in his book on the molecular nature of heat, only a few years after Brownian motion had been discovered by a botanist.

danmay said:
In either case, the entropy of a closed system does not decrease.

It does, though; that is what the fluctuation theorem describes.


danmay said:
If your system includes the particle + the rest of the liquid, then in this closed system the total entropy would not decrease.

Yes, the particle + the rest of the liquid. The entropy does decrease, because high-entropy energy (the thermal energy of the liquid) is transformed into low-entropy energy (the gravitational potential energy of the particle).
 
  • #4
Before trudging into muddy waters, I'd suggest reading Feynman's Lectures on Physics, chapters 44, 45 and 46.
 
  • #5
antonima said:
I would like to make a stand regarding the topic of entropy reversal. Entropy CAN in fact be reduced in a closed system, and this happens spontaneously according to the fluctuation theorem.
Yes. The 2nd law is statistical in nature and so there can be small, local fluctuations where entropy momentarily decreases.
http://en.wikipedia.org/wiki/Fluctuation_theorem

So what? Why do you need to "make a stand" over this? What is your point? It isn't your intention to restart an argument you created 8 months ago, is it?
 
  • #6
There are even simpler examples that are quite central to thermodynamics. For example, put a two-level atom, with energy difference E, in contact with a thermal reservoir at temperature T. The atom has an e^(E/kT) higher probability of being in the lower level than the upper level, because the reservoir has that same factor more states available if it keeps the energy E rather than gives it up. So that is what sets the probability of exciting the atom: the fact that the reservoir has a higher probability of being in a state of higher entropy. That's really all the second law ever meant.
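To spell out the state counting behind that factor (a standard reconstruction using the thermodynamic definition of temperature, 1/T = ∂S/∂E, not written out in the post):

$$\frac{P_\text{lower}}{P_\text{upper}} = \frac{\Omega_\text{res}(E_\text{tot})}{\Omega_\text{res}(E_\text{tot} - E)} = e^{[S_\text{res}(E_\text{tot}) - S_\text{res}(E_\text{tot} - E)]/k} \approx e^{E/kT}$$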
 
  • #7
A friend of mine always argued that "we" are evidence of a significant decrease in entropy. The fact that humans, animals, plants, etc. can arise from the universe seemingly spontaneously, while the universe around us is largely empty, can be interpreted this way (we are composed of very organized structures).

Another notion (not my own) along these lines is that if the entropy in one part of the universe decreases (say, on Earth), the entropy in another part must increase.
This argument is even more controversial, as the implication is that the total entropy of the universe is roughly constant: when entropy increases in one part of the universe, it must decrease in another. This would mean that there must be much more life out there than most of us think.

Interesting notions.

Cheers,
 
  • #8
antonima said:
Yes, the particle + the rest of the liquid. The entropy does decrease, because high-entropy energy (the thermal energy of the liquid) is transformed into low-entropy energy (the gravitational potential energy of the particle).
Here is an example of the fluctuation theorem: two otherwise isolated bodies are in contact; one being warmer. There is some probability, depending on how long it lasts, that energy would flow from the cooler side to the warmer side. In other words, the closed system transitions from a more entropic occupation of microstates (i.e. distribution of energy) into a less entropic distribution of microstates.
 
  • #9
FeX32 said:
A friend of mine always argued that "we" are evidence of a significant decrease in entropy. The fact that humans, animals, plants, etc. can arise from the universe seemingly spontaneously, while the universe around us is largely empty, can be interpreted this way (we are composed of very organized structures).
Your friend is making the single most common error in thermodynamics: failing to account for the entire system. They do not understand entropy. It is completely consistent with thermodynamics that ordered systems emerge spontaneously; all that is required is that the system be in contact with other systems that become very highly disordered. This principle plays out constantly, even in applications that have nothing at all to do with life or humans.

I'll give you a simple example: rolling a loaded die. A loaded die has a weight in it, so it tends to always come up the same way, because that lets the weight sit lower (releasing gravitational energy as heat into the environment). Coming up the same way every time is a highly ordered, low-entropy result, but it happens spontaneously (if, say, an earthquake shook the die) because the rest of the system gains entropy (in the form of heat). This has nothing to do with life; it holds on a lifeless planet as much as it does on ours, but life is analogous to the "loaded die."
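A toy simulation of that die, assuming (hypothetically) that each face's probability carries a Boltzmann weight set by how low the embedded weight ends up:

```python
import numpy as np

# Toy model: outcomes weighted by exp(-E/kT), where E is the potential
# energy of the embedded weight for each face (illustrative units only).
rng = np.random.default_rng(1)
kT = 1.0
E = np.array([0.0, 3.0, 3.0, 3.0, 3.0, 6.0])  # face 1 leaves the weight lowest

w = np.exp(-E / kT)
p = w / w.sum()

rolls = rng.choice(6, size=10_000, p=p) + 1
print("empirical face frequencies:",
      np.bincount(rolls, minlength=7)[1:] / 10_000)
# Face 1 dominates: an ordered, low-entropy outcome, paid for by the
# heat released into the surroundings as the weight settles low.
```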
 
  • #10
Gordianus said:
Before trudging into muddy waters I'd suggest reading Feynman's Lectures on Physics, chapters 44,45 and 46.

Unfortunately I do not own the Feynman Lectures. Does this by any chance have to do with Feynman's ratchet? If I understand right, Feynman proposed a Brownian ratchet and then showed that it cannot work, because the pawl itself jiggles thermally and is inherently too 'fuzzy' to rectify the motion.

russ_watters said:
So what? Why do you need to "make a stand" over this? What is your point? It isn't your intention to restart an argument you created 8 months ago, is it?

It seems that a lot of people are not aware of the fluctuation theorem, which is a shame. It is always useful to be able to contradict a big law of physics with concrete evidence. :)

Ken G said:
There are even simpler examples that are quite central to thermodynamics. For example, put a two-level atom, with energy difference E, in contact with a thermal reservoir at temperature T. The atom has an e^(E/kT) higher probability of being in the lower level than the upper level, because the reservoir has that same factor more states available if it keeps the energy E rather than gives it up. So that is what sets the probability of exciting the atom: the fact that the reservoir has a higher probability of being in a state of higher entropy. That's really all the second law ever meant.

I am not sure I understand. I know a little bit about energy levels in atoms, and it is true that lower energy levels are inherently more populated than higher energy levels. The equation you give shows the probability of the atom being at either energy state. What do you mean by 'the reservoir has that same factor more states available if it keeps the energy E rather than gives it up.'?

danmay said:
Here is an example of the fluctuation theorem: two otherwise isolated bodies are in contact; one being warmer. There is some probability, depending on how long it lasts, that energy would flow from the cooler side to the warmer side. In other words, the closed system transitions from a more entropic occupation of microstates (i.e. distribution of energy) into a less entropic distribution of microstates.

Yes. It doesn't happen often, but it does now and then. The question is whether or not the system can be monitored and stopped at the lower entropy state.
 
  • #11
antonima said:
Yes. It doesn't happen often, but it does now and then. The question is whether or not the system can be monitored and stopped at the lower entropy state.

Yes, but monitoring the system would require a machine that gathers vast amounts of data. This data would eventually need to be deleted, and the process of deleting it would generate heat, restoring the entropy balance. Thus your argument is flawed: all computing machines generate heat when deleting bits of data, and hence entropy is always increasing.
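For reference, the minimum heat cost of erasing one bit, which this argument implicitly invokes, is Landauer's bound:

$$E_\text{min} = k_B T \ln 2 \approx (1.38 \times 10^{-23}\,\mathrm{J/K})(300\,\mathrm{K})(0.693) \approx 2.9 \times 10^{-21}\,\mathrm{J\ per\ bit}$$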

What you are describing is some type of perpetual motion system.
 
  • #12
jonlg_uk said:
Yes, but monitoring the system would require a machine that gathers vast amounts of data. This data would eventually need to be deleted, and the process of deleting it would generate heat, restoring the entropy balance. Thus your argument is flawed: all computing machines generate heat when deleting bits of data, and hence entropy is always increasing.

Entropy of information has been used to 'defeat' Maxwell's demon, but I think it is unimaginative to consider digital storage as the only method of monitoring. Data comparisons can be made and actions taken without storing any information at all, using physical systems. For instance, if we assume that the gas Maxwell's demon is sorting is an ionic gas/plasma, we know that the particles will exert local electric and magnetic fields. These could be used in conjunction with a nano-diode that would bias the passage of ions in one direction only, without ever storing a bit of information!
 
  • #13
antonima said:
It seems that a lot of people are not aware of the fluctuation theorem, which is a shame. It is always useful to be able to contradict a big law of physics with concrete evidence. :)

Hmmm.

The Fluctuation Theorem does much more than merely prove that in large systems observed for long periods of time, the Second Law is overwhelmingly likely to be valid. The Fluctuation Theorem quantifies the probability of observing Second Law violations in small systems observed for a short time.

- D.J. Evans & D.J. Searles (2002), "The Fluctuation Theorem," Advances in Physics, Vol. 51, No. 7, pp. 1529-1585.
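In the Evans-Searles form, that quantification reads (with $\bar{\Sigma}_t$ the entropy production rate averaged over an observation time $t$):

$$\frac{P(\bar{\Sigma}_t = A)}{P(\bar{\Sigma}_t = -A)} = e^{At}$$

so second-law-violating trajectories become exponentially rare as the system gets larger and the observation time gets longer.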

I would say that the fluctuation theorem clarifies the second law of thermodynamics, rather than contradicts it. Even Denis Evans - you know, that Denis Evans - has this gem on his professional page:

We are known for deriving and experimentally confirming the Fluctuation Theorem. This Theorem gives an elegant extension of the Second Law of Thermodynamics, so that it applies to finite systems observed for finite times. It also provides the first proof of the Second Law of Thermodynamics - it ceases to be a "Law".

:cool:
 
  • #14
antonima said:
I am not sure I understand. I know a little bit about energy levels in atoms, and it is true that lower energy levels are inherently more populated than higher energy levels. The equation you give shows the probability of the atom being at either energy state. What do you mean by 'the reservoir has that same factor more states available if it keeps the energy E rather than gives it up.'?
That last statement is the reason the atom is more likely to be in the lower level. It is purely a matter of counting states in the reservoir, and noting by what factor that number is lowered if the energy E leaves the reservoir to excite the atom. This is the meaning of temperature, in fact. My point is that the atom has only one state either way, so its entropy is zero either way, but the reservoir has higher entropy if it keeps the energy E - yet it will sometimes be found to give up that energy. This is because the second law is only a general rule of thumb that only graduates to a "law" when you are contrasting vastly different numbers of states, not just the e^(E/kT) ratio that appears when you remove a tiny energy E from the reservoir. In the latter type of situation, i.e. using a vast reservoir to excite a single atom, no one in their right mind would expect the second law to apply. So we have to understand that law in order to use it properly.
antonima said:
Yes. It doesn't happen often, but it does now and then. The question is whether or not the system can be monitored and stopped at the lower entropy state.
But this is all very elementary; people who really understand thermodynamics are way past this kind of issue. It does bear repeating from time to time, to be sure, but it sounds like you are overinterpreting it, and putting too much stock in a relatively trivial application of what statistical mechanics is all about.
 
  • #15
antonima said:
<snip>

Yes. It doesn't happen often, but it does now and then. The question is whether or not the system can be monitored and stopped at the lower entropy state.

AFAIK, it cannot. Measurement and transmission of information is associated with a flow of entropy (or negentropy if you prefer). See, for example, "Maxwell's demon" and the solution to the Gibbs paradox.
 
  • #16
Mike H said:
I would say that the fluctuation theorem clarifies the second law of thermodynamics, rather than contradicts it. Even Denis Evans - you know, that Denis Evans - has this gem on his professional page:



:cool:

Yes!

I know that his experiment (the one with a bead in a laser beam) doesn't show an overall decrease in entropy, but rather momentary decreases in entropy of a macro-scale system. I don't think an overall decrease is impossible, though, and I have a hunch that he could demonstrate that as well if he really wanted to, although it might not get published.

Ken G said:
That last statement is the reason the atom is more likely to be in the lower level. It is purely a matter of counting states in the reservoir, and noting by what factor that number is lowered if the energy E leaves the reservoir to excite the atom. This is the meaning of temperature, in fact. My point is that the atom has only one state either way, so its entropy is zero either way, but the reservoir has higher entropy if it keeps the energy E - yet it will sometimes be found to give up that energy. This is because the second law is only a general rule of thumb that only graduates to a "law" when you are contrasting vastly different numbers of states, not just the e^(E/kT) ratio that appears when you remove a tiny energy E from the reservoir. In the latter type of situation, i.e. using a vast reservoir to excite a single atom, no one in their right mind would expect the second law to apply. So we have to understand that law in order to use it properly.

Okay, I think I see what you mean. That is exactly the point: on small time and energy scales there can be reversals. It is much like a liquid and the partial pressure of its gas phase. That pressure is only constant because evaporation and condensation are in equilibrium, so small-scale 'fluctuations' still occur whenever more molecules happen to evaporate than condense at the same time. It just doesn't affect the macro-scale properties of the gas.
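A quick numerical illustration of why those fluctuations vanish at the macro scale, using a toy model with equal evaporation/condensation odds per molecule:

```python
import numpy as np

# At equilibrium each of N molecules is (in this toy model) equally
# likely to be found in the vapor or the liquid; the relative size of
# the fluctuations in the vapor count shrinks like 1/sqrt(N).
rng = np.random.default_rng(2)
for N in (100, 10_000, 1_000_000):
    n_vapor = rng.binomial(N, 0.5, size=10_000)
    print(f"N={N:>9}: relative fluctuation = {n_vapor.std() / n_vapor.mean():.4f}")
```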

Ken G said:
But this is all very elementary, people who really understand thermodynamics are way past this kind of issue.

Still, no one has made a machine that sucks up thermal energy and creates electricity. It seems that the fluctuation theorem would lend credence to its feasibility. Wouldn't you agree?

Andy Resnick said:
AFAIK, it cannot. Measurement and transmission of information is associated with a flow of entropy (or negentropy if you prefer). See, for example, "Maxwell's demon" and the solution to the Gibbs paradox.

I just don't buy into the 'entropy of information' idea. Sure, it is a real concept when working with digital systems. But if one thinks, for instance, of a scale tipping, the scale does not need any information about what it is holding in order to tip one way or the other. It simply responds to the forces exerted by gravity. This is an example of a system that operates without any information transfer, unless I do not wholly understand the concept.
 
  • #17
danmay said:
Here is an example of the fluctuation theorem: two otherwise isolated bodies are in contact; one being warmer. There is some probability, depending on how long it lasts, that energy would flow from the cooler side to the warmer side. In other words, the closed system transitions from a more entropic occupation of microstates (i.e. distribution of energy) into a less entropic distribution of microstates.

To clarify, it's more that a part of the cool side may be warmer than a part of the warm side; kinetic energy then flows from the part of the cool side that is actually warmer to the part of the warm side that is actually cooler. This only applies to parts of the system. Considering the entire system, or any system in its entirety, kinetic energy does not flow from cooler to warmer.

It may still be possible for entropy to decrease, but it doesn't last long. Most of the time, entropy is rising until it plateaus at a maximum. An alternative scenario would be zero entropy from beginning to end. Anyway, my original post was confusing and not 100% correct.
 
  • #18
antonima said:
I just don't buy into the 'entropy of information' idea. Sure, it is a real concept when working with digital systems. But if one thinks, for instance, of a scale tipping, the scale does not need any information about what it is holding in order to tip one way or the other. It simply responds to the forces exerted by gravity. This is an example of a system that operates without any information transfer, unless I do not wholly understand the concept.

Entropy is the fulfillment/usage/occupation of degrees of freedom. It literally is information.
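The standard bridge between the two notions is the Gibbs-Shannon formula, under which thermodynamic entropy is Shannon information expressed in physical units:

$$S = -k_B \sum_i p_i \ln p_i = (k_B \ln 2)\, H_\text{bits}$$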

A balance tips until it exhausts information (minimizes potential energy; maximizes kinetic energy), at which point it no longer works like a balance should and starts to oscillate microscopically around equilibrium. This is entropy taking its course. Whatever information you gain from the balance, the balance and its surroundings must lose. Same thing with a spring scale, in which case potential and kinetic energy keep converting into one another. As long as the scale is working, it must release entropy one way or another. The minute it can no longer release entropy, it will get stuck, and its information will no longer be updated.

You might say: well, if time were to stand still, then I would still have some information. But the discussion is moot if time stands still, because stopping time by definition stops everything. So yes, if time were to stop, then physics might be different.
 
  • #19
antonima said:
<snip>

I just don't buy into the 'entropy of information' idea. <snip>

You don't have to - science does not conform to your opinion.
 
  • #20
danmay said:
It may still be possible for entropy to decrease, but it doesn't last long. Most of the time, entropy is rising until it plateaus at a maximum.

Yes, that is correct in just about all cases. But once at the plateau there is an equal probability of momentary entropy increase and decrease.

danmay said:
Entropy is the fulfillment/usage/occupation of degrees of freedom. It literally is information.

Are degrees of freedom literally information? I do not see it that way. Degrees of freedom count the number of possible configurations of a system. Information is a little harder to define.
I say information is a man-made concept used to denote man-made objects which store knowledge. To equate a man-made concept with a physical quantity, such as entropy, is surely false.

danmay said:
A balance tips until it exhausts information (minimizes potential energy; maximizes kinetic energy), at which point it no longer works like a balance should and starts to oscillate microscopically around equilibrium. This is entropy taking its course. Whatever information you gain from the balance, the balance and its surroundings must lose. Same thing with a spring scale, in which case potential and kinetic energy keep converting into one another. As long as the scale is working, it must release entropy one way or another. The minute it can no longer release entropy, it will get stuck, and its information will no longer be updated.

Even if entropy does increase once the balance tips, no information is required for it to 'decide' which way to tip. If we have two objects of equal weight on opposite sides of the balance, it will 'know' not to tip without expending any entropy.
Andy Resnick said:
You don't have to - science does not conform to your opinion.

Entropy of information is a part of information theory, not physical science. Please take a look at the wiki:

The entropy rate of English text is between 1.0 and 1.5 bits per letter,[6] or as low as 0.6 to 1.3 bits per letter, according to estimates by Shannon based on human experiments
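For what that estimate measures, here is the zeroth-order (single-letter) version of the calculation; Shannon's lower figures come from accounting for longer-range context:

```python
from collections import Counter
from math import log2

def letter_entropy(text: str) -> float:
    """Single-letter Shannon entropy, in bits per letter."""
    letters = [c for c in text.lower() if c.isalpha()]
    counts = Counter(letters)
    n = len(letters)
    return -sum((k / n) * log2(k / n) for k in counts.values())

sample = "the entropy rate of english text depends on its statistical structure"
print(f"{letter_entropy(sample):.2f} bits per letter")  # ~4 bits for English
```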

I REALLY don't think physicists or chemists would be measuring standard entropies based on human experiments.
I declare that there are long-lived, macro-scale exceptions to the second law of thermodynamics.
 
  • #21
antonima said:
Okay, I think I see what you mean. That is exactly the point: on small time and energy scales there can be reversals. It is much like a liquid and the partial pressure of its gas phase. That pressure is only constant because evaporation and condensation are in equilibrium, so small-scale 'fluctuations' still occur whenever more molecules happen to evaporate than condense at the same time. It just doesn't affect the macro-scale properties of the gas.
Right, the second law is very much a macroscopic law. I see that you understand this. The problem is that there are already so many people invested in violating the second law on the macro scale (with machines that create energy and so on), and so many people invested in claiming that evolution violates the second law (to fit religious convictions), that it is important to stress the second law is not a blanket statement about how things are: it applies to the time-averaged behavior of systems that are both closed and macroscopic. This shouldn't be framed as a hole in the second law; it's just the right way to understand the second law.
antonima said:
Still, no one has made a machine that sucks up thermal energy and creates electricity. It seems that the fluctuation theorem would lend credence to its feasibility. Wouldn't you agree?
No, that is exactly what does not follow from the fluctuation theorem, because on scales that would be appropriate for generating electricity, the fluctuation theorem has to be coupled with the law of large numbers, and when you do that, you get the second law. That's also where you get the impossibility of doing what you are suggesting. Some probabilities are so small that they work like zero, and science is about what works.
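A back-of-envelope number for "work like zero": drawing just 1 J of heat out of a 300 K reservoir against the gradient lowers its entropy by

$$\frac{\Delta S}{k_B} = \frac{(1\,\mathrm{J})/(300\,\mathrm{K})}{1.38 \times 10^{-23}\,\mathrm{J/K}} \approx 2.4 \times 10^{20}$$

so the odds of observing it are of order $e^{-2.4 \times 10^{20}}$, which is zero for every practical purpose.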
 
  • #22
antonima said:
I declare that there are long-lived, macro-scale exceptions to the second law of thermodynamics.

We don't do perpetual motion machines here, so I declare this thread closed.
 

FAQ: Entropy Reversal: Proven by Physics and Science

What is entropy reversal?

Entropy reversal is a phenomenon in which a system, previously in a state of high disorder or entropy, becomes more ordered over time. This runs counter to the common understanding that entropy always increases, but it has been shown to occur, briefly and on small scales, in certain physical processes.

How is entropy reversal proven by physics and science?

Entropy reversal has been observed and demonstrated through various experiments and mathematical models in physics and other scientific fields, including studies in thermodynamics, quantum mechanics, and information theory.

Can entropy reversal occur naturally?

Yes, there are natural processes in which entropy reversal can occur. For example, living organisms are able to maintain a high level of order and complexity, despite the tendency for entropy to increase. This is due to the constant input of energy and the ability to dissipate waste products, so that the total entropy of organism plus surroundings still increases.

Can entropy reversal be harnessed for practical use?

While the concept of entropy reversal is still being explored and understood, there are potential practical applications that could harness this phenomenon. One example is in the field of quantum computing, where entropy reversal could help improve the efficiency and speed of calculations.

Is entropy reversal a universally accepted concept?

The idea of entropy reversal is still a topic of debate and further research in the scientific community. While many studies have shown evidence of this phenomenon, there are still differing opinions and ongoing research on the extent and implications of entropy reversal.
