Refuting the idea of entropy equalling "disorder"

In summary, the information-theoretic entropy of a message is related to how random the bit stream is and can be quantified using either the Shannon entropy or the Kolmogorov complexity. Thermodynamic entropy, in turn, is a function of state variables such as volume, energy, and particle number, although in practice we usually infer those quantities rather than measure them directly.
  • #1
sshai45
Hi.

I have heard that entropy is often called "disorder" but _isn't really so_. And I am even more puzzled by the connection and the difference between entropy from the "information theory" point of view and from the "thermodynamic" point of view. I see stuff like this:

http://arstechnica.com/civis/viewtopic.php?f=2&t=3122

See the posts by "kmellis": he says that entropy in thermodynamics is NOT informational disorder, while simultaneously advocating an "information theoretic" basis for physics. How do you do that? And if thermodynamic entropy != informational entropy, how do they relate, or fail to relate, in such a framework?

I'm curious. How can one prove mathematically that the entropy change in converting an "ordered" stack of identical objects into a "messy"-looking pile -- a simplified version of an often-given and apparently invalid example -- is identically zero, if all other variables (temperature, etc.) are held constant so that literally nothing but the rearrangement is going on? I know this is highly idealized, but that's the point: to isolate the "disordering" in the common-sense meaning and show that it has absolutely zero effect on the entropy of the whole system of objects. What I am wondering is this: why couldn't there be some immeasurably small but nonzero entropy change, since after all you are rearranging the matter in the system, just not by a very great degree when you think of things on a "microscopic" scale?
 
  • #2
Research macrostates and microstates. The macrostate (classical thermodynamic) interpretation of entropy does not involve disorder, while the microstate (statistical) interpretation does, via the Boltzmann equation. Both interpretations are equally correct and important, each applying to its respective part of thermodynamics (i.e. macroscopic thermodynamics and statistical mechanics).
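
To connect this to the stack-of-objects question above, here is the standard Boltzmann counting written out for that idealized case (nothing beyond the textbook relation is assumed):

$$ S = k_B \ln \Omega $$

If the objects are identical and the rearrangement leaves the constraints (energy, volume, particle number) untouched, then exactly the same set of microstates is accessible before and after, so the microstate count Ω is unchanged and

$$ \Delta S = k_B \ln \Omega_{\text{after}} - k_B \ln \Omega_{\text{before}} = 0 $$

identically, not just immeasurably small.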
 
  • #3
sshai45 said:
Hi.

I have heard this, that entropy is often called "disorder" but _isn't really so_. And I am even more puzzled by the connection and difference between entropy from "information theory" pov and from "thermodynamic" pov. <snip>

Part of the difficulty is that there are multiple ways to define the information content of a system. One way, Shannon entropy, is more suited to communications: how much information is required to digitally communicate (possibly including 'measure') the microstate of a system. In this sense, entropy is related to how random the bit stream is: if you can predict the value of an incoming bit before receiving it, the entropy associated with that bit is zero. Thus, in the Shannon context, 'negentropy' is usually more useful than 'entropy'. Data compression is especially well suited to this context.
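
As a quick illustration of the "predictable bits carry no entropy" point, here is a rough sketch in Python (the function name and the two example streams are made up for this post, not taken from any particular library):

Code:
import math
import random
from collections import Counter

def shannon_entropy_bits(stream):
    """Empirical Shannon entropy, in bits per symbol, of a sequence of symbols."""
    counts = Counter(stream)
    total = len(stream)
    # H = sum over symbols of p * log2(1/p), with p = n/total
    return sum((n / total) * math.log2(total / n) for n in counts.values())

# A perfectly predictable stream: every bit is 0, so H = 0 bits per symbol.
print(shannon_entropy_bits([0] * 1000))

# A fair-coin stream is maximally unpredictable: H is close to 1 bit per symbol.
rng = random.Random(0)
print(shannon_entropy_bits([rng.randint(0, 1) for _ in range(1000)]))

This is the empirical (plug-in) entropy of the observed bit frequencies, which is also the quantity that bounds how far a lossless compressor can shrink the stream.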

Another way to define information content is how much information is needed to create the state: the Kolmogorov complexity. In this context, one can pose questions about how much information is required to build a factory that makes things (including other factories). Assigning numbers to the Kolmogorov complexity is not trivial, AFAIK.
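
Indeed, the Kolmogorov complexity is uncomputable in general, but the compressed size of a description gives a crude, computable upper bound on it. A small sketch of that idea (the helper name and the example strings are mine, just for illustration):

Code:
import random
import zlib

def compressed_size(data: bytes) -> int:
    """Length of the zlib-compressed data: a rough, computable upper bound on
    how short a description of the data can be (Kolmogorov complexity itself
    is uncomputable)."""
    return len(zlib.compress(data, 9))

rng = random.Random(0)
ordered = b"01" * 5000                                    # a highly regular 10,000-byte string
messy = bytes(rng.getrandbits(8) for _ in range(10000))   # 10,000 random-looking bytes

print(compressed_size(ordered))   # small: the pattern has a very short description
print(compressed_size(messy))     # close to 10,000: zlib finds no shorter description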

There are other measures of information, but I am only familiar with those two. Does this help?
 
  • #4
I guess it depends on how fundamental you want to go. If you are talking about the entropy of a gas or an engine, then you can use a chemistry definition and it's all very objective and we can all agree on what the entropy is. And this is consistent with the laws of thermodynamics. But then you get into statistical mechanics and macrostates and it all sounds very subjective, since a macrostate seems to depend on how much information you have. (I am grouping the statistical mechanics definition and information theoretical definition together, since they seem to be compatible to me.)

For familiar systems like ideal gases, we can use the convention that a macrostate refers to a set of states with known volume, energy, and particle number. And then entropy is just a state variable, S(V,U,N), and the two notions agree. But if we don't know the volume, does that mean we don't know the entropy? Or does it mean the entropy is greater (because we are referring to a larger set of states with known energy and particle number)?
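
For that ideal-gas convention there is an explicit formula for the state function: the Sackur-Tetrode equation for a monatomic ideal gas (a standard textbook result, quoted here only as an example of entropy depending on nothing but the macrostate variables),

$$ S(U,V,N) = N k_B \left[ \ln\!\left( \frac{V}{N} \left( \frac{4\pi m U}{3 N h^2} \right)^{3/2} \right) + \frac{5}{2} \right] $$

Once the macrostate convention (known U, V, N) is fixed, the entropy is just this number; the apparent subjectivity only enters through which variables you decide specify the macrostate.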

Practically speaking, we don't measure the energy. We measure the temperature and infer the energy. But temperature is also defined in terms of the entropy.
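
The circularity mentioned here is just the standard thermodynamic definition of temperature in terms of the entropy,

$$ \frac{1}{T} = \left( \frac{\partial S}{\partial U} \right)_{V,N} $$

so inferring U from a temperature measurement already presupposes the entropy function S(U,V,N).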

It seems like the chemistry definition is more useful, but the information theoretical definition is more extensible to more exotic systems which have different state variables than volume, energy, and particle number. For example, in a big bang nucleosynthesis experiment, particle number might not be a useful state variable. When we report a value for entropy for an exotic system, we have to define what we consider a macrostate to be.
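
The grouping of the statistical-mechanics and information-theoretic definitions mentioned at the start of this post is usually made precise through the Gibbs form of the entropy,

$$ S = -k_B \sum_i p_i \ln p_i $$

where the p_i are the probabilities of the microstates compatible with whatever you have chosen to call the macrostate. Shannon's H = -Σ p_i log₂ p_i is the same expression apart from the factor k_B and the base of the logarithm, which is why the two notions can be treated as compatible.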
 

FAQ: Refuting the idea of entropy equalling "disorder"

What is entropy and how is it related to disorder?

Entropy is a measure of the number of microscopic ways a system's macroscopic state can be realized, which is often loosely described as randomness or disorder. It is central to the Second Law of Thermodynamics, which states that the total entropy of an isolated system never decreases over time. This means that as energy is transferred and transformed within the system, the energy spreads over more of the accessible microstates.

Can entropy ever decrease or be reversed?

In an isolated system, the total entropy never decreases. However, in an open system where energy and matter can be exchanged with the surroundings, local decreases in entropy can occur. Energy can be used to lower the entropy in one region, but doing so exports at least as much entropy to the surroundings, so the entropy of the system plus its surroundings still does not decrease.
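
In equation form, "local decrease but overall increase" is the usual statement of the second law for a system in contact with its surroundings:

$$ \Delta S_{\text{system}} + \Delta S_{\text{surroundings}} \ge 0 $$

so the system's entropy change may be negative as long as the surroundings gain at least as much entropy.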

How does the concept of entropy apply to living organisms?

Living organisms are open systems that constantly exchange energy and matter with their surroundings. Entropy within living organisms can be decreased in small localized areas, such as when cells use energy to maintain order and carry out specific functions. However, the overall entropy of the organism and its surroundings will still increase over time.

Is entropy the same as disorder?

No. While entropy is often associated with disorder, it is not the same thing. Entropy is a measure of the number of possible microscopic arrangements consistent with a system's macroscopic state, while disorder is a subjective concept that refers to a lack of organization or predictability. In some cases an increase in entropy can accompany what looks like a more ordered state, such as when gas molecules spread out to fill a larger volume and end up in a smoother, more uniform arrangement.
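
For that gas example, the entropy change of an ideal gas expanding freely from volume V₁ to V₂ at fixed energy and particle number is the standard result

$$ \Delta S = N k_B \ln\frac{V_2}{V_1} $$

which is positive even though the final, uniform state can look "tidier" than a clumped one.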

Can entropy be measured and quantified?

Yes, entropy can be calculated and quantified. The SI unit of entropy is the joule per kelvin (J/K), and its value depends on the number of microscopic states accessible to the system. However, the concept is subtle, and there is ongoing discussion and research about how best to define and measure it in different systems.
