Is there a generalized second law of thermodynamics?

In summary, there are different kinds of entropies, including the free entropy introduced by Voiculescu. The second law of thermodynamics states that total entropy cannot decrease with time, but it is unclear whether this applies to all of these entropies and whether they share a single notion of "time". Voiculescu also wrote articles about Alain Connes' factors, and it is an open question whether his free entropy obeys a second law with respect to the one-parameter group Connes discovered. Shannon entropy and thermodynamic entropy have similar mathematical formulas, but they have different meanings and different relationships with time. In thermodynamics, entropy is a measure of the number of possible ways a system can exist at the molecular level while still exhibiting the same macroscopic state.
  • #1
Heidi
Hi Pfs,
There are different kinds of entropies.
I came across the notion of free entropy:
https://arxiv.org/pdf/math/0304341.pdf
The second law says that the total entropy cannot decrease as time goes by.
Is it always the same "time" for the different entropies?
The author, Voiculescu, wrote articles about Alain Connes' factors.
Does his entropy obey a second law of thermodynamics for the one-parameter group Connes discovered?
 
  • #2
Shannon's "entropy" and thermodynamic "entropy" are mathematically similar but they have quite different meanings and very different relationships with time.

In thermodynamics, one is just comparing different equilibrium states. The time lapse between them is not material so long as one follows from the other as the result of some process. What is important in comparing two states is the order in which they occurred. The second law, which in the real world requires entropy to increase overall in successive states of the universe, implies a single direction for time, i.e. time flows in the direction of increasing entropy of the universe.

In Shannon entropy, messages contain information about something that existed or has occurred. Entropy is a measure of the usefulness of the information contained in a message. But there is no requirement that entropy must increase with successive messages.

It is important to distinguish thermodynamic entropy from information entropy. They are completely different in nature.
  • In thermodynamics, entropy is a statistical concept applying to large numbers of molecules in thermodynamic equilibrium. It is a measure of the number of possible ways for a system to exist at the molecular level and still exhibit the same macroscopic state. The concept of equilibrium does not apply to a single atom or molecule, or even a small number of molecules. It is meaningless to talk about the entropy of a molecule, for example.
  • Shannon entropy deals with information involving countable units of data. In information theory, entropy is a measure of the information content of a message. A message communicating the result of a single coin flip contains less information than a message communicating the results of 10 coin flips. So the latter message is assigned a higher entropy value. If a two-headed coin is flipped, we know that it will come up heads, so the result of such a flip conveys no information: entropy = 0.

As far as the superficial mathematical similarities are concerned:
  • Shannon entropy varies as the logarithm of the number of possible ways to compose the message to convey different information. So if you send a message conveying the results of 10 head/tail coin flips, there are 2^10 = 1024 possible ways that message could be configured to convey different information. So such a message has an entropy that is 10 times greater than a message conveying the results of one flip (2^1 = 2 possibilities); see the short sketch after this list.
  • In statistical thermodynamics, entropy varies as the logarithm of the number of possible states (position and velocity) of the molecules of a system that would result in the same macroscopic equilibrium state for that system.
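For concreteness, here is a minimal Python sketch of the counting above (an illustration added here, not part of the original post): for a message whose configurations are equally likely, the Shannon entropy is just the base-2 logarithm of the number of possible configurations.

import math

def shannon_entropy(probabilities):
    """Shannon entropy in bits: H = -sum(p * log2(p)), taken over outcomes with p > 0."""
    return 0.0 - sum(p * math.log2(p) for p in probabilities if p > 0)

# One fair coin flip: 2^1 = 2 equally likely messages -> 1 bit.
print(shannon_entropy([0.5, 0.5]))          # 1.0

# Ten fair coin flips: 2^10 = 1024 equally likely messages -> 10 bits.
n = 10
print(shannon_entropy([1 / 2**n] * 2**n))   # 10.0

# A two-headed coin always comes up heads, so the result carries no information.
print(shannon_entropy([1.0]))               # 0.0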

AM
 
  • #3
Landauer's principle relates those two kinds of entropy. According to this principle, when you erase information entropy, you produce thermodynamic entropy by an amount equal to or larger than the erased information entropy.
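In quantitative form (the standard statement of the bound, added here for reference, not part of the original post): erasing one bit of information in an environment at temperature T increases the thermodynamic entropy by at least k_B ln 2, i.e. dissipates heat of at least
$$Q \ge k_B T \ln 2 \approx 3 \times 10^{-21}\,\text{J at room temperature}.$$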
 
  • #4
Demystifier said:
Landauer's principle relates those two kinds of entropy. According to this principle, when you erase information entropy, you produce thermodynamic entropy by an amount equal to or larger than the erased information entropy.

As I understand the principle, it says that regardless of how small the physical means of carrying information is, changing the smallest amount of information will still require a non-zero expenditure of energy. This will result in some increase in thermodynamic entropy. But this is more about thermodynamics than information theory. I really don't see how it relates to Shannon entropy.

Feynman, in his Lectures (no. 46), alludes to a similar principle. He suggests using a microscopic ratchet and pawl mechanism to select only molecules with high energy to pass through an aperture in a partition, separating a gas into hot and cold compartments without using energy and thereby evading the second law of thermodynamics. But then he shows that without the expenditure of a minimum amount of energy in the form of mechanical work, the ratchet and pawl would not function. As a result, there would be no separation of the faster and slower molecules. The second law prevails.

AM
 
  • #5
Andrew Mason said:
As I understand the principle, it says that regardless of how small the physical means of carrying information is, changing the smallest amount of information will still require a non-zero expenditure of energy.
Erasing, not necessarily changing. Reversible logical gates don't necessarily need to produce thermodynamic entropy.

Andrew Mason said:
This will result in some increase in thermodynamic entropy. But this is more about thermodynamics than information theory. I really don't see how it relates to Shannon entropy.
It's more about thermodynamic entropy, I agree, but that doesn't mean it's not about Shannon entropy at all. A physical carrier of Shannon entropy is related to abstract Shannon entropy, just as a physical computer is related to abstract computing.
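To illustrate the reversible/irreversible distinction with a toy example (a sketch added here, not from the posts above): an AND gate maps two input bits to one output bit, so distinct inputs become indistinguishable and a bit of information is erased, which is where Landauer's bound bites; a Toffoli (CCNOT) gate is a bijection on three bits, so nothing is erased and no minimum dissipation is forced.

from itertools import product

def and_gate(a, b):
    # Irreversible: maps 2 input bits to 1 output bit, so the inputs cannot be recovered.
    return a & b

def toffoli(a, b, c):
    # Reversible (CCNOT): flips c only when a = b = 1; a bijection on 3-bit states.
    return a, b, c ^ (a & b)

# AND merges distinct inputs onto the same output: 4 inputs, only 2 possible outputs.
print(len({and_gate(a, b) for a, b in product((0, 1), repeat=2)}))      # 2

# Toffoli permutes the 8 possible 3-bit states: 8 inputs, 8 distinct outputs.
print(len({toffoli(a, b, c) for a, b, c in product((0, 1), repeat=3)})) # 8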
 
  • #6
The information content in physics is presumably encoded in the distinguishable microstates of matter; it's just a generalization of binary code. But in the general case, the information can even be encoded in several different microstructures (or probability spaces), not just one, because the microstate of an observer can of course also encode information about motion. Different observers may not agree on what is distinguishable to start with, which leads to the well-known issues of unitarity in the foundations of QM and QG. These spaces may not even commute or be independent. If you add not just stationary states but also states of motion, then the entropic principles are effectively elevated to a kind of action principle, as the "entropy measure" is generalized to an "action measure" or an information divergence/relative entropy, like the Kullback-Leibler divergence. It is given different names, but the abstractions are IMO quite similar and have common roots. These things are what come to my mind when one starts to ask for generalized second laws, and whether "time" (the arrow of time?) is the same for different intrinsic statistical flows.
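For reference (an illustration added here, not part of the post), the Kullback-Leibler divergence mentioned above is the expected log-ratio of two probability distributions; a minimal sketch:

import math

def kl_divergence(p, q):
    """Relative entropy D(P || Q) = sum_i p_i * log(p_i / q_i), in nats.
    Assumes q_i > 0 wherever p_i > 0."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

# How far a biased coin is from a fair coin, informationally.
print(kl_divergence([0.9, 0.1], [0.5, 0.5]))   # about 0.368 nats
# A distribution carries no information relative to itself.
print(kl_divergence([0.5, 0.5], [0.5, 0.5]))   # 0.0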

I think entropy is as relative a notion as probability, but it depends on the perspective and "interpretation" of probability as well, especially in the foundational context of QM where one wants to include gravity. The only way "entropy" is special is that it's just a log measure of probability, to make multiplicative combinations appear additive. All the other philosophical questions about entropy are the same as with probabilities, or "transition probabilities" or "actions".

/Fredrik
 
  • #7
I have a different question about entropy and the second law.
The entropy cannot decrease: at a time t > 0 it is greater than or equal to what it was at t = 0.
Would a CPT observer also have a second law in its reversed time?
 
  • #8
Heidi said:
I have a different question about entropy and the second law.
The entropy cannot decrease: at a time t > 0 it is greater than or equal to what it was at t = 0.
Would a CPT observer also have a second law in its reversed time?
CPT invariance is a property of the microscopic laws. The second law, on the other hand, is only a macroscopic law. At the microscopic level, it is not valid. In fact, from a microscopic point of view, the second "law" is not a law but a property of a specific solution (of the equations of motion) in which our universe happens to be. The second "law" is in fact a consequence of special initial conditions, namely small initial entropy. So if you apply the CPT transformation to this solution, you get another solution in which entropy decreases with time.
 
  • #9
This detailed account looks relevant -
https://quantum-journal.org/papers/q-2021-08-09-520/

The arrow of time in operational formulations of quantum theory
“The operational formulations of quantum theory are drastically time oriented. However, to the best of our knowledge, microscopic physics is time-symmetric. We address this tension by showing that the asymmetry of the operational formulations does not reflect a fundamental time-orientation of physics…”
 
  • #10
I think there is also a perspective complementary to Rovelli's that turns the argument around: the time-symmetric physics may be the result of the inferential perspective of collecting statistics over repeats of subatomic processes from a massive lab frame. It may not be a coincidence that such an inference scheme produces timeless laws, as that is implicitly what you seek when doing statistics. But whether these inferred frozen laws are the true fundamentals is not obvious, I think, and the more you introduce gravity and cosmological models, the less obvious it gets IMHO.

/Fredrik
 
  • #11
I am not sure I understand these lines in Rovelli's paper:
Decoherence requires information loss and an increase in entropy. Hence RQM is a time-symmetric formulation of quantum theory, but the dynamics of relative facts is time symmetric while the dynamics of stable facts is time oriented.

Does this time symmetry in the decoherence process imply that entropy increases in both of the two time-symmetric measurements or processes?
 
  • #12
In thermodynamics, the arrow of time has to do with changes from one equilibrium state (of a closed, isolated system) to another. The equilibrium state is time symmetric itself. That is to say, a change from one particular microstate of an equilibrium state to another microstate of that same equilibrium state is time symmetric - there is no requirement that one follow the other. Similarly, changes between quantum states are time symmetric. It is only when the quantum states decohere into a macroscopic state that time reversal is not possible.

AM
 
  • #13
Rovelli writes:
the dynamics of stable facts is time oriented.
Could you give examples of such stable facts?
 
  • #14
If I understand it, examples of stable facts can be systems that may be manipulated and measured macroscopically.
 
  • #15
I came across a more recent paper that might update some finer details -

“Information is Physical: Cross-Perspective Links in Relational Quantum Mechanics“

https://arxiv.org/pdf/2203.13342.pdf
 
  • #16
Further, along similar lines, this paper considers possible such inferences (though they may not apply in this case):

https://arxiv.org/abs/1910.02474

Neither Presentism nor Eternalism

Carlo Rovelli
Is reality three-dimensional and becoming real (Presentism), or is reality four-dimensional and becoming illusory (Eternalism)? Both options raise difficulties. I argue that we do not need to be trapped by this dilemma. There is a third possibility: reality has a more complex temporal structure than either of these two naive options. Fundamental becoming is real, but local and unoriented. A notion of present is well defined, but only locally and in the context of approximations.
 

FAQ: Is there a generalized second law of thermodynamics?

What is the second law of thermodynamics?

The second law of thermodynamics states that the total entropy of an isolated system will always increase over time, or remain constant in ideal cases where the system is in a steady state or undergoing a reversible process.
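In symbols (standard form of the statement, added for reference): for an isolated system,
$$\Delta S \ge 0,$$
with equality holding only for idealized reversible processes.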

What is the generalized second law of thermodynamics?

The generalized second law of thermodynamics is an extension of the second law that includes the effects of gravity and the formation of black holes. It states that the total entropy of an isolated system, including the entropy attributed to any black-hole horizons, never decreases over time.
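In Bekenstein's formulation (the standard statement, added here for concreteness), each black hole contributes an entropy proportional to its horizon area A, and the generalized entropy never decreases:
$$S_{\text{gen}} = S_{\text{outside}} + \frac{k_B c^3 A}{4 G \hbar}, \qquad \Delta S_{\text{gen}} \ge 0 .$$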

Is the generalized second law of thermodynamics universally accepted?

The generalized second law of thermodynamics is widely accepted by the scientific community, but there is ongoing research and debate about its implications and possible exceptions.

What evidence supports the generalized second law of thermodynamics?

Analyses of black-hole thermodynamics and of the expanding universe support the validity of the generalized second law of thermodynamics, and many calculations in thermodynamics and cosmology are consistent with its predictions.

Are there any exceptions to the generalized second law of thermodynamics?

While the generalized second law of thermodynamics is generally accepted, there are some proposed scenarios where it may not hold true, such as in certain quantum systems or in the early universe. However, these exceptions are still being studied and debated, and the law remains a fundamental principle in our understanding of the universe.
