Is Entropy Truly Undefined in Physical Systems?

  • Thread starter lukephysics
  • Tags
    Entropy
In summary, the conversation discusses the concept of entropy in different contexts, such as in physical systems and chemical reactions. The definition of entropy as a measure of uncertainty for an observer is mentioned, as well as the different treatments of entropy in thermodynamics and statistical physics. The conversation also touches on the topic of relativistic thermodynamics and its relation to entropy. Overall, the main point is that entropy is a measure of missing information in a system, relative to complete information.
  • #1
lukephysics
TL;DR Summary
Why do they say things have entropy, such as 'the early universe has low entropy', when they don't specify who the observer is and what they are predicting?
I always got a bit confused when listening to podcasts about the arrow of time and entropy in the universe, so I was reading more about information theory. I learned today that entropy is not defined for a physical system by itself. It only measures how much uncertainty an observer has when making a prediction about some property of particular interest to them.
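To make that reading concrete, here is a minimal sketch of the information-theoretic (Shannon) entropy of a probability distribution, in nats; the example probabilities are made up:

```python
import math

def shannon_entropy(probs):
    """H = -sum p ln p: the average uncertainty (in nats) an observer
    has about the outcome of a random experiment with these odds."""
    return -sum(p * math.log(p) for p in probs if p > 0)

print(shannon_entropy([0.5, 0.5]))   # fair coin: ln 2 ≈ 0.693
print(shannon_entropy([0.9, 0.1]))   # biased coin: ≈ 0.325, less uncertain
print(shannon_entropy([1.0]))        # certain outcome: zero uncertainty
```

The point is that the number depends entirely on the probabilities the observer assigns, not on the coin itself.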

So why do they say things have entropy, such as 'the early universe has low entropy', when they don't say who the observer is and what they are predicting?

Another example is entropy in chemical reactions. Is that a different definition of entropy? Or is it fundamentally the same?
 
  • #2
Can you please quote exactly the specific statements that you're examining?
 
  • #3
Interesting question - maybe @vanhees71 has some insights here, being the resident expert on relativistic hydrodynamics. Planck proved in this paper that entropy is a Lorentz-invariant scalar, but regarded the measured temperature as transforming as ##T \rightarrow T(1-v^2)^{1/2}## (in units with ##c = 1##) between observers, so a moving body appears cooler. On the other hand, it seems more natural to say that temperature is only defined in the rest frame of the body?
 
  • #4
The relativistic treatment of thermodynamics before van Kampen is a mess, though I'm not sure whether van Kampen was really the first to introduce our modern view. A kind of review is

N. G. van Kampen, Relativistic thermodynamics of moving systems, Phys. Rev. 173, 295 (1968), https://doi.org/10.1103/PhysRev.173.295

Today we use the definition, given in Sect. 9 of this paper, that the thermodynamic quantities are defined in the (local) rest frame of the medium. Entropy is then a scalar quantity. The paper also discusses the two historical treatments by Ott and Planck.

Another approach is of course statistical physics. There the key point is that the phase-space distribution function is a scalar quantity. For a manifestly covariant treatment of elementary relativistic transport theory, see

https://itp.uni-frankfurt.de/~hees/publ/kolkata.pdf

Concerning the more general questions of the OP: entropy in the information-theoretic sense (which seems to be the best and most comprehensive approach we have) is always a measure of the missing information, given some information about the system, relative to the case of complete information. More intuitively, it's a measure of the "surprise" you get from a specific outcome of a random experiment. E.g., in quantum statistical physics the entropy is always relative to the preparation of a pure state, i.e., any pure state has entropy 0. This leads to the von Neumann-Shannon-Jaynes definition of entropy,
$$S=-k_{\text{B}} \mathrm{Tr} (\hat{\rho} \ln \hat{\rho} ),$$
where ##\hat{\rho}## is the statistical operator, describing the state of the system.
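As a minimal numerical sketch of that formula (in units where ##k_{\text{B}} = 1##, using the eigenvalues of ##\hat{\rho}##; the two example states are the standard textbook ones):

```python
import numpy as np

def von_neumann_entropy(rho):
    """S = -Tr(rho ln rho) in units of k_B, computed from the
    eigenvalues of the statistical operator rho (Hermitian, trace 1)."""
    p = np.linalg.eigvalsh(rho)
    p = p[p > 1e-12]                                 # convention: 0 ln 0 = 0
    return float(max(0.0, -np.sum(p * np.log(p))))  # clamp away -0.0

pure = np.diag([1.0, 0.0])    # pure state: complete information
mixed = np.eye(2) / 2         # maximally mixed qubit

print(von_neumann_entropy(pure))    # 0.0
print(von_neumann_entropy(mixed))   # ln 2 ≈ 0.693
```

A pure state gives exactly zero, matching the statement above that entropy is measured relative to the case of complete information.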
 

FAQ: Is Entropy Truly Undefined in Physical Systems?

What is entropy?

Entropy is a scientific concept that quantifies the disorder or randomness in a system. It is often used to explain the direction of energy flow and the tendency of isolated systems to become more disordered over time.

Is there really no such thing as entropy?

There is definitely such a thing as entropy. It is a fundamental concept in thermodynamics and has been studied systematically since the mid-19th century, when Clausius introduced the term.

How does entropy relate to the second law of thermodynamics?

The second law of thermodynamics states that the total entropy of an isolated system never decreases over time. This means that the disorder or randomness in such a system tends to increase, and that tendency is quantified by the entropy.
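A standard worked example of the second law is the free expansion of an ideal gas into a vacuum inside an isolated container, where the entropy change is ΔS = nR ln(V2/V1); the numbers below are illustrative:

```python
import math

R = 8.314  # molar gas constant, J/(mol K)

def free_expansion_entropy(n_mol, V1, V2):
    """Entropy change Delta S = n R ln(V2 / V1) for the free expansion
    of an ideal gas. Positive whenever the gas expands, so the entropy
    of the isolated gas-plus-vacuum system rises, never falls."""
    return n_mol * R * math.log(V2 / V1)

# 1 mol of gas doubling its volume
print(free_expansion_entropy(1.0, 1.0, 2.0))  # ≈ 5.76 J/K
```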

Can entropy be reversed?

While it is possible to decrease the entropy of a specific subsystem, doing so always increases the entropy of its surroundings by at least as much, so the total entropy of an isolated system does not decrease. This is the content of the second law of thermodynamics.

How do scientists measure entropy?

Entropy is typically measured in joules per kelvin (J/K). For a reversible process it can be calculated from the heat exchanged and the temperature, via dS = δQ_rev / T. For more complex systems, statistical mechanics offers another route, computing the entropy from the probabilities of microstates.
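For instance, heating a substance reversibly at constant pressure gives ΔS = ∫ δQ/T = m c ln(T2/T1). A quick sketch (the water heat capacity and temperatures are illustrative numbers):

```python
import math

def delta_S_heating(m_kg, c_J_per_kgK, T1_K, T2_K):
    """Entropy change of a substance heated reversibly at constant
    pressure: dS = delta Q / T = m c dT / T, which integrates to
    Delta S = m c ln(T2 / T1), in J/K."""
    return m_kg * c_J_per_kgK * math.log(T2_K / T1_K)

# 1 kg of water (c ≈ 4186 J/(kg K)) heated from 300 K to 350 K
print(delta_S_heating(1.0, 4186.0, 300.0, 350.0))  # ≈ 645 J/K
```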
