Shannon Entropy vs Entropy in chemistry

In summary, entropy in information theory (programming) and entropy in science are defined by the same formula but differ in interpretation. In information theory, it measures the unpredictability of an outcome given the initial probabilities, while in science it has a physical meaning and is used, for example, to define temperature. In both cases, entropy describes how spread out a probability distribution is. Shannon entropy, named after the scientific entropy, is measured in bits and is lower for more predictable events. In biology, life produces low-entropy states through evolution, creating order despite the natural tendency towards disorder.
  • #1
despues357
I'm wondering about the exact usage of the term entropy in programming vs. entropy in science.
 
  • #2
This is a common question. The information-theory (programming) entropy and the von Neumann (science) entropy have essentially the same formula,
$$S \propto -\sum_i p_i \ln p_i $$
however, there is a difference in interpretation. In information theory, the entropy represents how much new information you get when a specific outcome occurs, given the initial probability of each outcome. This doesn't exactly apply to the scientific entropy, because it's not particularly meaningful to ask, "if I have a probability distribution for micro-states and then learn the system is definitely in a particular micro-state, how much Shannon information have I acquired?"
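
As a rough sketch (the function name and example distributions are mine, not from the thread), here is the formula above in Python with log base 2, so the result comes out in bits:

```python
import math

def shannon_entropy(probs, base=2):
    """H = -sum_i p_i * log(p_i) for a discrete probability distribution."""
    return -sum(p * math.log(p, base) for p in probs if p > 0)

print(shannon_entropy([0.5, 0.5]))   # fair coin: 1.0 bit, maximally unpredictable
print(shannon_entropy([0.9, 0.1]))   # biased coin: ~0.47 bits, more predictable
```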

Instead, we have a physical meaning for entropy. For example, it's used to define temperature:
$$ \frac{1}{T}= \left(\frac{\partial S}{\partial E}\right)_{N,V} $$
or
$$ dS = \frac{dQ}{T} $$

In thermal equilibrium, the entropy is maximized subject to some set of constraints. For example, in the Fermi or Bose gases, we determine the probability that a state is occupied by maximizing the entropy at a fixed average energy and number of particles.
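
As a toy illustration of that maximization (my own made-up example, for the simpler classical case rather than the Fermi or Bose gases mentioned above): maximizing S at a fixed average energy gives the Boltzmann weights p_i proportional to exp(-E_i / k_B T), with the constraint entering through the temperature.

```python
import math

def boltzmann(energies, T):
    """Maximum-entropy distribution at fixed average energy
    (the constraint enters through the temperature T; k_B = 1 here)."""
    weights = [math.exp(-e / T) for e in energies]
    Z = sum(weights)                      # partition function
    return [w / Z for w in weights]

def entropy(probs):
    return -sum(p * math.log(p) for p in probs if p > 0)   # in nats

levels = [0.0, 1.0, 2.0]                  # made-up energy levels
for T in (0.5, 1.0, 5.0):
    p = boltzmann(levels, T)
    avg_E = sum(pi * e for pi, e in zip(p, levels))
    print(f"T={T}: <E>={avg_E:.3f}  S={entropy(p):.3f}")
```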

More generally, I suppose that in both cases entropy can be seen as a way of describing how spread out the probability distribution is.
 
  • #3
It's becoming clearer now.

Wikipedia has an explanation in terms of the predictability of a string of answers to questions, with entropy measured in bits per question. The formulas have the same relationships between the quantities, but I guess there isn't as much of a relationship between what they stand for as I thought there would be.

The first paragraph describes entropy as average unpredictability: a function of what remains unknown after some number of questions have been answered, out of the range of possibilities the problem allows. The second paragraph describes it as the diminishing average unpredictability as more and more of the subsequent questions are answered within the framework of the problem. So what I'll take from this is that Shannon entropy is named after scientific entropy because of the shared form of the two ideas, and obviously not because of their individual constituent ideas. Maybe you could describe it in terms of scientific entropy, where the number of unknown micro-states shrinks relative to what is known within a particular system as you start answering what each individual micro-state is, out of the total range of possible values... you know, this reminds me of when I learned about degrees of freedom in statistics.

...

Wikipedia:

"Now consider the example of a coin toss. Assuming the probability of heads is the same as the probability of tails, then the entropy of the coin toss is as high as it could be. This is because there is no way to predict the outcome of the coin toss ahead of time: if we have to choose, the best we can do is predict that the coin will come up heads, and this prediction will be correct with probability 1/2. Such a coin toss has one bit of entropy since there are two possible outcomes that occur with equal probability, and learning the actual outcome contains one bit of information. In contrast, a coin toss using a coin that has two heads and no tails has zero entropy since the coin will always come up heads, and the outcome can be predicted perfectly. Analogously, one binary-outcome with equiprobable values has a Shannon entropy of {\displaystyle \log _{2}2=1}
3a63d03d23e9d47761a4f1cfa1d0097919c395f1
bit. Similarly, one trit with equiprobable values contains {\displaystyle \log _{2}3}
f889b70067a012339b8baa7f0f9d17a0e6889c8f
(about 1.58496) bits of information because it can have one of three values.

English text, treated as a string of characters, has fairly low entropy, i.e., is fairly predictable. Even if we do not know exactly what is going to come next, we can be fairly certain that, for example, 'e' will be far more common than 'z', that the combination 'qu' will be much more common than any other combination with a 'q' in it, and that the combination 'th' will be more common than 'z', 'q', or 'qu'. After the first few letters one can often guess the rest of the word. English text has between 0.6 and 1.3 bits of entropy per character of the message."
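
For what it's worth, the numbers in that quote are easy to check. Here is a small Python sketch (the text sample and the crude single-character estimate are mine; the latter overshoots the quoted 0.6 to 1.3 bits per character because it ignores the context between letters):

```python
from collections import Counter
import math

def entropy_bits(probs):
    return -sum(p * math.log2(p) for p in probs if p > 0)

print(entropy_bits([0.5, 0.5]))         # fair coin toss: 1.0 bit
print(entropy_bits([1.0]))              # two-headed coin: 0.0 bits
print(entropy_bits([1/3, 1/3, 1/3]))    # fair trit: ~1.585 bits

# Crude per-character estimate from letter frequencies alone; real estimates
# of 0.6-1.3 bits/character also exploit the context between letters.
text = "the quick brown fox jumps over the lazy dog"
counts = Counter(text)
total = len(text)
print(entropy_bits([c / total for c in counts.values()]))
```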
 
  • #4
One interesting area of overlap is in biology. The Shannon information of an event is the log of the reciprocal of its probability, which means high-information events are less probable. So, for instance, seeing a 0 or a 1 (-log2(1/2) = 1 bit) is far more likely than seeing one specific million-bit number (odds of 1 in 2^1000000). In physical entropy, low-entropy states are possible, but it's improbable for a system to return to them. Even in open systems with energy pouring in, like Earth, it's more likely to find things in fairly high-entropy states.

The exception everyone points to when they hear physical entropy explained as disorder is life, which superficially seems to violate it by creating order. Life, through evolution, is in the business of producing information via natural selection and mutation in DNA, which is by the definition of information improbable, but it then makes the improbable probable through reproduction: evolution discovers the best forms, then copies them. In doing so it "Climbs Mount Improbable," in the words of biologist Richard Dawkins. These improbable forms correspond, interestingly, to the production of low-entropy states of matter, such as the free oxygen and sugars produced by plants in photosynthesis, the energy of which we harvest to live and create our own order. Life doesn't reverse or violate physical entropy (if the sun went black we'd all die), but in the context of an open system like our Earth, it provides a counter-force to what you'd otherwise expect, by way of information.
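
A quick numerical sketch of that "log reciprocal of probability" point (the numbers are my own toy examples):

```python
import math

def surprisal_bits(p):
    """Shannon information of an event: log2 of the reciprocal of its probability."""
    return -math.log2(p)

print(surprisal_bits(0.5))        # a particular coin-flip outcome: 1 bit
print(surprisal_bits(2.0**-20))   # one specific 20-bit pattern: 20 bits
# One specific million-bit pattern would carry 1,000,000 bits; its probability
# (2**-1000000) underflows an ordinary float, so we don't evaluate it directly.
```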
 

FAQ: Shannon Entropy vs Entropy in chemistry

What is the difference between Shannon Entropy and Entropy in chemistry?

Shannon Entropy, also known as information entropy, is a measure of the uncertainty or randomness in a system or message. It is commonly used in the field of information theory. On the other hand, entropy in chemistry is a measure of the disorder or randomness in a thermodynamic system. It is used to describe the spontaneity of a chemical reaction.

How is Shannon Entropy calculated?

Shannon Entropy is calculated using the formula
$$ H = -\sum_i p_i \log(p_i) $$
where p_i represents the probability of a certain event or symbol occurring in a message. It is measured in bits, nats, or bans depending on the base of the logarithm used.
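
A small sketch of how the choice of log base sets the unit (the distribution below is just an example of mine):

```python
import math

def entropy(probs, base):
    """H = -sum_i p_i * log(p_i); the choice of log base sets the unit."""
    return -sum(p * math.log(p, base) for p in probs if p > 0)

p = [0.5, 0.25, 0.25]
print(entropy(p, 2))         # bits (base 2):  1.5
print(entropy(p, math.e))    # nats (base e):  ~1.040
print(entropy(p, 10))        # bans (base 10): ~0.452
```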

What is the unit of measurement for entropy in chemistry?

The unit of measurement for entropy in chemistry is joules per kelvin (J/K). This reflects the definition dS = dQ/T: the entropy change is the heat transferred (in joules) divided by the absolute temperature (in kelvin) at which it is transferred.
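
As a standard worked example (textbook numbers, not from the thread): melting 1 g of ice reversibly at 273 K absorbs roughly 334 J, so
$$ \Delta S = \frac{Q}{T} \approx \frac{334\ \text{J}}{273\ \text{K}} \approx 1.2\ \text{J/K} $$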

Can Shannon Entropy be used in chemistry?

Yes, Shannon Entropy can be used in chemistry to measure the uncertainty or randomness of a chemical reaction. It can also be used to analyze the information content of a chemical compound or reaction.

How is entropy related to the second law of thermodynamics?

The second law of thermodynamics states that the total entropy of an isolated system never decreases over time. This means that as a system becomes more disordered, its entropy increases. Entropy in chemistry is closely related to this law because it measures the disorder or randomness of a system.
