Entropy as a measure of ignorance

  • #1
Monsterboy
http://qr.ae/TUpoRU

"Entropy is a measure of our ignorance about a system " Is that accurate ?
 
  • #2
"Entropy, Its Language, and Interpretation" by Harvey S. Leff (https://www.researchgate.net/publication/227218780_Entropy_Its_Language_and_Interpretation)

Abstract
The language of entropy is examined for consistency with its mathematics and physics, and for its efficacy as a guide to what entropy means. Do common descriptors such as disorder, missing information, and multiplicity help or hinder understanding? Can the language of entropy be helpful in cases where entropy is not well defined? We argue in favor of the descriptor spreading, which entails space, time, and energy in a fundamental way. This includes spreading of energy spatially during processes and temporal spreading over accessible microstates in thermodynamic equilibrium. Various examples illustrate the value of the spreading metaphor. To provide further support for this metaphor's utility, it is shown how a set of reasonable spreading properties can be used to derive the entropy function. A main conclusion is that it is appropriate to view entropy's symbol S as shorthand for spreading.

"Entropy Is Simple — If We Avoid The Briar Patches!" by Frank L. Lambert (http://entropysimple.oxy.edu/content.htm#increase)

"The second law of thermodynamics says that energy of all kinds in our material world disperses or spreads out if it is not hindered from doing so. Entropy is the quantitative measure of that kind of spontaneous process: how much energy has flowed from being localized to becoming more widely spread out (at a specific temperature)."
 
  • #3
I have often heard people say "entropy depends on the observer." It is one of the reasons why the bouncing universe theory cannot be completely ruled out. I remember discussing this with (the late) marcus, but I am unable to find the thread. Is the statement inside the quote accurate?
 
  • #4
There are several slightly different definitions of entropy, but one of them is Boltzmann's:

A system of many particles in classical physics is completely described by giving its location in "phase space". If you give the position and momentum of every single particle, then that gives the phase-space location. That's a point in a 6N-dimensional space if there are N particles, because you have to specify:

  1. ##x_1, y_1, z_1, p_{x1}, p_{y1}, p_{z1}##
  2. ##x_2, y_2, z_2, p_{x2}, p_{y2}, p_{z2}##
  3. etc.
where ##x_j, y_j, z_j, p_{xj}, p_{yj}, p_{zj}## are the components of the position and momentum of particle number j.

Now, if you don't know precisely all 6N values giving the system's location in phase space, you can quantify your ignorance by giving a "volume" in phase space, meaning that the system is located somewhere in that volume. Boltzmann defined the entropy of a system as ##S = k \log(W)##, where ##k## is Boltzmann's constant, ##W## is the volume of the system in phase space, and ##\log## means the natural log. The bigger that number, the more uncertain you are about the precise location of the system in phase space.
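To make the formula concrete, here is a minimal Python sketch (my own illustration, not part of Boltzmann's argument; the phase-space volumes are made-up numbers, assumed to be expressed in units that make ##W## dimensionless):

```python
import numpy as np

k_B = 1.380649e-23  # Boltzmann's constant, J/K

def boltzmann_entropy(W):
    """S = k * ln(W) for a phase-space volume W (taken as dimensionless)."""
    return k_B * np.log(W)

# Doubling the phase-space volume you assign to the system -- i.e. doubling
# your uncertainty about where it sits in phase space -- adds exactly k*ln(2):
print(boltzmann_entropy(2.0e30) - boltzmann_entropy(1.0e30))  # ~9.57e-24 J/K
```

Only the ratio of volumes matters for entropy differences, which is why the arbitrary choice of units for ##W## merely shifts ##S## by a constant.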

This notion of entropy is subjective, because different people might have different amounts of information about the system, and might use a different volume in phase space.
 
  • #5
Monsterboy said:
I have often heard people say "entropy depends on the observer."

Why should entropy depend on the observer?

"The entropy of a substance, its entropy change from 0 K to any T, is a measure of the energy that can be dispersed within the substance at T: integration from 0 K to T ofCp/T dT (+ q/T for any phase change)." (Frank L. Lambert, "Entropy Is Simple, Qualitatively", J. Chem. Educ., 2002, 79 (10), p 1241)
 
  • #6
Lord Jestocost said:
Why should entropy depend on the observer?

"The entropy of a substance, its entropy change from 0 K to any T, is a measure of the energy that can be dispersed within the substance at T: integration from 0 K to T ofCp/T dT (+ q/T for any phase change)." (Frank L. Lambert, "Entropy Is Simple, Qualitatively", J. Chem. Educ., 2002, 79 (10), p 1241)

Well, that definition of entropy is a little circular, because ##T## is in turn defined via ##1/T = \frac{\partial S}{\partial U}|_{V}##.
 
  • #7
Entropy is linked to energy through its original definition by Clausius, dS = dQ/T, where "d" connotes a very small change.
 
  • #8
Lord Jestocost said:
Entropy is linked to energy through its original definition by Clausius, dS = dQ/T, where "d" connotes a very small change.

The question is: how is ##T## defined?

In statistical mechanics, entropy is the primary quantity, and temperature is defined in terms of how entropy changes when you add a small amount of energy.
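As a toy sketch of that definition (my own example; the two-level model and its numbers are assumed, not taken from the thread): count the microstates of ##N## two-level units, take ##S = k \ln W##, and estimate ##1/T = \partial S/\partial E## with a finite difference:

```python
from math import lgamma

k_B = 1.380649e-23   # Boltzmann's constant, J/K
eps = 1.0e-21        # J, assumed energy gap of a single two-level unit
N = 1_000_000        # number of two-level units

def S(n):
    """S = k ln W, with W = C(N, n) microstates when n units are excited
    (total energy E = n * eps)."""
    lnW = lgamma(N + 1) - lgamma(n + 1) - lgamma(N - n + 1)
    return k_B * lnW

# Temperature from 1/T = dS/dE: adding one excitation adds dE = eps of energy.
n = 200_000
one_over_T = (S(n + 1) - S(n)) / eps
print("T =", 1.0 / one_over_T, "K")   # roughly 52 K for these made-up numbers
```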
 
  • #9
The question is: Does entropy depend on the observer?

When transferring a system from state 0 to state 1 (both characterized by a set of selected macroscopic observables), you can in principle think of any reversible process to define the entropy in state 1:

##S_1 = S_0 + \int_0^1 \frac{\delta Q_{\text{rev}}}{T}##

The "subjective" part is merely the definition of the macroscopic observables you want to keep track of for the given system (temperature, pressure, volume, number of particles etc.).
 
  • #10
Lord Jestocost said:
The question is: Does entropy depend on the observer?

When transferring a system from state 0 to state 1 (both characterized by a set of selected macroscopic observables), you can in principle think of any reversible process to define the entropy in state 1:

##S_1 = S_0 + \int_0^1 \frac{\delta Q_{\text{rev}}}{T}##

The "subjective" part is merely the definition of the macroscopic observables you want to keep track of for the given system (temperature, pressure, volume, number of particles etc.).

Point taken, but there's another issue even after you've chosen the macroscopic variables. Given macroscopic variables ##E, V, N## (total energy, volume and number of particles), there are many (infinitely many in the classical case, and astronomically many in the quantum case) microstates consistent with that macrostate. But are they all equally likely? If not, what's the probability distribution?

You can just define "equilibrium" so that equal-likelihood is part of the definition, I suppose. Then your claims about entropy are objectively true for a system in equilibrium.
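For what it's worth, here is a small sketch of that point using the Gibbs form ##S = -k\sum_i p_i \ln p_i## (my own illustration; the thread doesn't spell this formula out): with the equal-likelihood assumption it reduces to ##S = k \ln W##, and any other distribution over the same microstates gives a smaller value.

```python
import numpy as np

k_B = 1.380649e-23  # Boltzmann's constant, J/K

def gibbs_entropy(p):
    """S = -k * sum_i p_i ln(p_i) over the microstate probabilities p_i."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]                                 # convention: 0 * ln(0) = 0
    return -k_B * np.sum(p * np.log(p))

W = 1000
uniform = np.full(W, 1.0 / W)                    # every microstate equally likely
other = np.random.dirichlet(np.ones(W))          # some other distribution over the same W states

print(np.isclose(gibbs_entropy(uniform), k_B * np.log(W)))   # True: reduces to S = k ln W
print(gibbs_entropy(other) <= gibbs_entropy(uniform))        # True: uniform maximizes S
```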
 
  • #11
I agree. Nonmechanical thermodynamic variables such as temperature and entropy are combined with "mechanical" considerations from statistical mechanics on the basis of the concept of thermal equilibrium.
 

FAQ: Entropy as a measure of ignorance

1. What is entropy?

Entropy is a measure of disorder or randomness in a system. It is commonly used in physics, information theory, and statistics to describe the uncertainty or lack of information about a system.

2. How is entropy related to ignorance?

Entropy can be seen as a measure of ignorance because it reflects the amount of uncertainty or lack of knowledge about a system. A high entropy value indicates a high level of ignorance, while a low entropy value indicates a high level of knowledge.

3. How is entropy calculated?

The formula for calculating entropy depends on the specific context in which it is being used. In information theory, it is calculated as the negative sum of the probabilities of all possible outcomes multiplied by their logarithms. In thermodynamics, it is calculated using Boltzmann's entropy formula.
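A minimal sketch of the information-theoretic formula (my own example; the probabilities are made up):

```python
import math

def shannon_entropy(probs, base=2):
    """H = -sum_i p_i * log(p_i); base 2 gives the answer in bits."""
    return -sum(p * math.log(p, base) for p in probs if p > 0)

print(shannon_entropy([0.5, 0.5]))     # 1.0 bit: a fair coin, maximal uncertainty
print(shannon_entropy([0.99, 0.01]))   # ~0.08 bits: the outcome is almost certain
print(shannon_entropy([0.25] * 4))     # 2.0 bits: four equally likely outcomes
```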

4. Can entropy be reduced or eliminated?

In certain contexts, entropy can be reduced through additional information or constraints. For example, in information theory, knowing more about a source lowers its entropy, and coding techniques exploit that structure to increase the efficiency of data transmission. In thermodynamics, however, the second law states that the entropy of an isolated system never decreases, so it cannot be eliminated.
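As a small illustration of the coding point (my own example; the symbol probabilities and the prefix code are made up):

```python
import math

# Assumed toy source: four symbols with known, unequal probabilities.
probs = {"a": 0.5, "b": 0.25, "c": 0.125, "d": 0.125}
H = -sum(p * math.log2(p) for p in probs.values())       # 1.75 bits/symbol

# A prefix code matched to those probabilities (shorter codewords for likelier symbols):
code = {"a": "0", "b": "10", "c": "110", "d": "111"}
avg_len = sum(probs[s] * len(code[s]) for s in probs)    # 1.75 bits/symbol

print(H, avg_len)   # both 1.75 -- versus 2 bits/symbol for a fixed-length code
```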

5. What are some real-world applications of entropy as a measure of ignorance?

Entropy has various applications in different fields such as physics, chemistry, biology, information theory, and statistics. In physics, it is used to describe the randomness and disorder in a system. In information theory, it is used to measure the uncertainty of data. In biology, it is used to describe the complexity and diversity of ecosystems. In statistics, it is used to measure the amount of variability in a dataset.
