If entropy increases, where does the information come from?

In summary: Now suppose that the book is open, and you know that there is exactly one letter "A" written somewhere on one of the pages. How much information does it contain now? The answer is "zero". If you know that the letter is there but you don't know exactly where, then the information was neither decreased nor increased; it remains the same as in the closed-book case. But suppose that the letter "A" is not alone: on the same page there is another letter "B". Now, how much information does the book contain? The answer is "zero" again. You know that there are two letters, but you don't know their order. The information was neither decreased nor increased.
  • #1
Wolfenstein3d
For example: if two particles close to each other require n bits of information to describe them, why does it take n bits to describe them when they are far apart? Shouldn't the information content be the same for the macrosystem?
 
  • #2
Wolfenstein3d said:
if two particles close to each other require n bits of information to describe them, why does it take n bits to describe them when they are far apart?

Can you give a specific example?
 
  • #3
You ask for a specific example a lot... What do you not understand? I laid the question out pretty concretely.
 
  • #4
For example: entropy increases if two particles spread out. Why does that mean information has increased, when the positions (x, y, z) take only 3 coordinates in both the former and latter cases?
 
  • #5
Wolfenstein3d said:
What do you not understand? I laid the question out pretty concretely

Your question is not as concrete as you believe it to be. You are being coached to fill in the blanks.

Google the definition of entropy, digest how one calculates a specific number for entropy of a specific system, and see if that helps you to fill in more of the blanks.
 
  • #6
My question is, since Shannon entropy is considered information, where does this new information come from?

[Mentors's note: some unnecessary and off-topic argumentation has been removed from this post]
 
  • #7
Wolfenstein3d said:
I laid the question out pretty concretely

You may think you did, but you didn't. And I strongly suspect that is because you do not have a good understanding of the topic you're asking about.

Wolfenstein3d said:
since Shannon entropy is considered information, where does this new information come from?

What do you think the Shannon entropy is of a system of two particles close together? What do you think the Shannon entropy is of a system of two particles far apart? Please show your work.
 
  • #8
Wolfenstein3d said:
My question is, since Shannon entropy is considered information, where does this new information come from?
I am still not clear on why you are saying that the Shannon entropy of a system with two nearby particles is different from the Shannon entropy when they are more widely separated. Can you show the calculation of the entropy under both conditions?
 
  • #9
If a gas expands, its entropy increases. So how would a simplification to two gas particles not follow the same entropy increase?
 
  • #10
Wolfenstein3d said:
If a gas expands, its entropy increases. So how would a simplification to two gas particles not follow the same entropy increase?

Because a system of two particles is not a gas.
 
  • #11
Wolfenstein3d said:
If a gas expands, its entropy increases. So how would a simplification to two gas particles not follow the same entropy increase?
Entropy is not a property of particles. Entropy is a property of a gas as a whole, when its detailed particle content is ignored. More generally, entropy is something you can say about a big system when you don't know all the details of the small constituents comprising it. If physicists were superbeings who knew all the details about everything, they would not need the concept of entropy. Metaphorically, if particles are made by God, then entropy is made by men.

That being said, now the answer to your question is easy. Two particles do not have an entropy in the way a gas does because, even if you can know every detail about 2 particles, you cannot know every detail about ##10^{23}## particles.

For more details, I highly recommend
https://www.amazon.com/dp/9813100125/?tag=pfamazon01-20
 
  • #12
Wolfenstein3d said:
So how would a simplification to two gas particles not follow the same entropy increase?

Consider a one-dimensional, two-particle bounded system (not a gas) at equilibrium that has 8 slots; each particle is a "1".

State 1: 00000011
State 2: 10000001

Do you consider State 1 to have a different entropy than State 2, given that going from State 1 to State 2 does not change the equilibrium of the system?

I hope I have properly posed the example.

Now let the system change and come to a new equilibrium.

State 3: 010000001

I added one slot, so the 'volume', if you will, of my 2-particle system is larger.

Do you consider State 3 to have a different entropy than States 1 or 2?
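One possible way to put numbers on the slot example above (my own sketch, not part of the original post): count the microstates with Python's `math.comb` and compare the Boltzmann entropy ##S = k \ln W## for the 8-slot and 9-slot systems. The point is that individual arrangements such as State 1 and State 2 are single microstates of the same macrostate, so they carry the same entropy, while adding a slot enlarges ##W##.

```python
from math import comb, log

# Count microstates for two identical particles in n slots, one per slot.
def microstates(slots, particles=2):
    return comb(slots, particles)

W8 = microstates(8)   # States 1 and 2 are two of these 28 microstates
W9 = microstates(9)   # State 3 lives in this larger set of 36 microstates

# Boltzmann entropy S = k ln W, here in units of k:
print(log(W8), log(W9))   # the 9-slot "volume" has the higher entropy
print(log(W9) - log(W8))  # entropy increase from adding one slot
```

On this counting, States 1 and 2 have equal entropy, and only State 3's enlarged system has more.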
 
  • #13
Wolfenstein3d said:
My question is, since Shannon entropy is considered information, where does this new information come from?
Suppose that someone gives you a closed book and tells you that this book contains only one letter, say the letter "A", written on one of the pages in the book. All the other pages are empty. How much information does that book contain?

At first sight, not much. However, it really contains more information than it seems at first. The letter "A" is written at some definite place on some definite page, and the information about the exact place of the letter is information too. If the book contains ##N## pages, then the Shannon information about the page on which the letter is written is ##\ln N##. So the bigger the book, the more information it contains, even when it contains only one letter.

Expansion of a gas with a fixed number of particles is similar to an "expansion" of a book (that is, increasing the number of pages) with a fixed number of letters.
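A quick numerical illustration of the book analogy (my own sketch, with made-up page counts): the information about which page holds the letter grows as ##\ln N##, and each doubling of the book adds exactly ##\ln 2##.

```python
import math

# Information (in nats) about the page holding the single letter "A",
# for books of N pages; each doubling of N adds ln 2.
for N in (100, 200, 400):
    print(N, math.log(N))

print(math.log(200) - math.log(100))  # = ln 2, the cost of one doubling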
 
  • #14
So if you have 2 hydrogen atoms bouncing around in an otherwise empty box, at a certain point when adding more H atoms the system becomes a gas?
What is the cutoff point, Mr Adonis? Sounds like a massive paradox to me.
 
  • #15
Demystifier said:
Entropy is not a property of particles. Entropy is a property of a gas as a whole, when its detailed particle content is ignored. More generally, entropy is something you can say about a big system when you don't know all the details of the small constituents comprising it. If physicists were superbeings who knew all the details about everything, they would not need the concept of entropy. Metaphorically, if particles are made by God, then entropy is made by men.

That being said, now the answer to your question is easy. Two particles do not have an entropy in the way a gas does because, even if you can know every detail about 2 particles, you cannot know every detail about ##10^{23}## particles.

For more details, I highly recommend
https://www.amazon.com/dp/9813100125/?tag=pfamazon01-20

Again, why is there a cutoff in not being able to know about n particles? Also, you should be able to know exactly as much about each particle in a ##10^{23}##-particle mass as you can know about each particle in a 2-particle mass. The HUP doesn't care how many particles you are looking at. And you can't know everything about a particle, again because of the HUP.
 
  • #16
Wolfenstein3d said:
So if you have 2 hydrogen atoms bouncing around in an otherwise empty box, at a certain point when adding more H atoms the system becomes a gas?

There is no hard cutoff (see below), but the extremes are certainly easily distinguished. A box of 1 liter of hydrogen gas at room temperature has about ##10^{22}## atoms. Your intuitions about how gases work are based on collections of that many atoms or more. Claiming that things should work exactly the same for a system of 2 atoms shows a huge failure to understand the issue.

Wolfenstein3d said:
What is the cutoff point, Mr Adonis? Sounds like a massive paradox to me.

Not a paradox, but a failure to realize that the term "gas", like most terms, does not have crisp, precise boundaries.

https://en.wikipedia.org/wiki/Sorites_paradox

To use the example given in that article, two grains of sand does not make a heap, but ##10^{22}## grains certainly would. Yet there is no precise number of grains of sand where you transition between "heap" and "not heap".
 
  • #17
Demystifier said:
Suppose that someone gives you a closed book and tells you that this book contains only one letter, say the letter "A", written on one of the pages in the book. All the other pages are empty. How much information does that book contain?

At first sight, not much. However, it really contains more information than it seems at first. The letter "A" is written at some definite place on some definite page, and the information about the exact place of the letter is information too. If the book contains ##N## pages, then the Shannon information about the page on which the letter is written is ##\ln N##. So the bigger the book, the more information it contains, even when it contains only one letter.

Expansion of a gas with a fixed number of particles is similar to an "expansion" of a book (that is, increasing the number of pages) with a fixed number of letters.

So is there a slight disconnect between the concept of a system's information content and its entropy? What I don't get is that if a system gains information, it should take more to describe it. But it seems like, apart from each particle's intrinsic values, each particle could be described by an xyz position that doesn't need more info to describe.

I have read a college professor's post saying that negative entanglement entropy cancels out the increase in standard entropy, which keeps the information content the same.

Is information, in the sense of an expanding gas, actually growing?
I think my concept of information might need a different definition than Shannon entropy.

Leonard Susskind believed that the idea of information being destroyed was an abomination. Likewise, I am sure he would say the creation of information from nothing is an equal abomination, because entropy decreases are feasible (although improbable), and an entropy decrease means information destruction if Shannon entropy truly means information.
 
  • #18
Wolfenstein3d said:
each particle could be described by an xyz position

This is a classical model, not a quantum model. In a quantum model, the particle is described by a state vector in a Hilbert space; the x, y, and z components of position are parameters that pick out which particular state vector it is. But the Hilbert space itself is not the 3-dimensional space of the x, y, z position vector.

Wolfenstein3d said:
Leonard suskind believed that the idea of information being destroyed was an abomination.

This is a somewhat different sense of the word "information" from the one you're using when you ask about the relationship between information and entropy. When Susskind talks about information not being destroyed, he is referring to quantum unitary evolution; basically he is claiming that unitary evolution can never be violated. But that just means that, as far as the quantum state of an entire system is concerned, its evolution is deterministic: if you know the state at one instant, you know it for all time. And if you know the system's exact state for all time, its entropy is always zero, by definition.

However, if the entire system contains multiple subsystems (such as multiple particles), then it might be possible to assign a nonzero entropy to the individual subsystems, because the subsystems might not have definite states due to entanglement. This is the sort of case the professor you mentioned was talking about. For example, suppose we have a two-electron system in the singlet spin state (i.e., total spin zero); for simplicity we'll ignore their positions (if it matters, consider them to be in some bound state like an atomic orbital with no transition possible). The total entropy of this system is zero, because we know its exact state. But each individual electron has a nonzero positive entropy, because it doesn't have a definite state; its spin could turn out to be in any direction when measured. However, there is also a negative entropy due to entanglement; this is because the electron spins must be opposite, so once we have measured one electron, we know the directions of both electrons' spins. So the total entropy is still zero for the system as a whole.
 
  • #19
Wolfenstein3d said:
each particle could be described by an xyz position that doesn't need more info to describe.

If one doubles the pages in @Demystifier's book then there are twice as many possible configurations for the book to be in. The letter is still on only a single page, but it now requires more information to say which page, because there are twice as many pages to consider.

Two particles being close or far does not necessarily change system entropy; the two microstates (particles close and particles farther apart) might both yield the same macro level thermodynamic state for the system.
 
  • #20
Grinkle said:
the two microstates (particles close and particles farther apart) might both yield the same macro level thermodynamic state for the system

It's not so much the macro level thermodynamic state (a system of two particles isn't usefully viewed using the thermodynamic approximation) as the fact that the set of possible two-particle position states is the same whether the particles are close together or far apart. So the "number of pages in the book" stays the same, all that changes is which of the "pages" each "letter" (particle) is on.
 
  • #21
PeterDonis said:
This is a classical model, not a quantum model. In a quantum model, the particle is described by a state vector in a Hilbert space; the x, y, and z components of position are parameters that pick out which particular state vector it is. But the Hilbert space itself is not the 3-dimensional space of the x, y, z position vector.
This is a somewhat different sense of the word "information" from the one you're using when you ask about the relationship between information and entropy. When Susskind talks about information not being destroyed, he is referring to quantum unitary evolution; basically he is claiming that unitary evolution can never be violated. But that just means that, as far as the quantum state of an entire system is concerned, its evolution is deterministic: if you know the state at one instant, you know it for all time. And if you know the system's exact state for all time, its entropy is always zero, by definition.

However, if the entire system contains multiple subsystems (such as multiple particles), then it might be possible to assign a nonzero entropy to the individual subsystems, because the subsystems might not have definite states due to entanglement. This is the sort of case the professor you mentioned was talking about. For example, suppose we have a two-electron system in the singlet spin state (i.e., total spin zero); for simplicity we'll ignore their positions (if it matters, consider them to be in some bound state like an atomic orbital with no transition possible). The total entropy of this system is zero, because we know its exact state. But each individual electron has a nonzero positive entropy, because it doesn't have a definite state; its spin could turn out to be in any direction when measured. However, there is also a negative entropy due to entanglement; this is because the electron spins must be opposite, so once we have measured one electron, we know the directions of both electrons' spins. So the total entropy is still zero for the system as a whole.
Interesting. Could you elaborate a bit more on why Susskind called it information?
 
  • #22
Wolfenstein3d said:
Could you elaborate a bit more on why Susskind called it information?

The main context in which I've seen him talk about quantum "information" not being created or destroyed is the black hole information paradox. I would assume that he used the term "information" because that's how that paradox is standardly described. I don't think he intended for it to mean exactly the same thing as Shannon information; more generally, I don't think he intended "information" to be a precise technical term in this context, just a generally descriptive term.
 
  • #23
Wolfenstein3d said:
Interesting. Could you elaborate a bit more on why Susskind called it information?
We'd have to ask him for a definitive answer... but it sure looks as if he needed a word, so he chose the word in the English language that came closest to what he meant.

It's not very close and other people have chosen the same word for other concepts, but that's not Susskind's fault. The problem isn't Susskind, it is that the English language isn't up to the task - which is why we use mathematical definitions instead of English when we want to be clear.
 
  • #24
The entropy of the two-particle system comes from its density matrix:

$$S = -{\rm Tr}(\rho \ln \rho)$$

where ##\rho## is the density matrix describing the two particles and ##S## is the (von Neumann) entropy, analogous to the Shannon information.
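To make the formula concrete, here is a hedged numerical sketch (my own, reusing the singlet example from post #18; it assumes NumPy is available): build the two-electron singlet state and evaluate ##S = -{\rm Tr}(\rho \ln \rho)## both for the pair and for one electron's reduced density matrix.

```python
import numpy as np

# Singlet (total spin zero) state of two spin-1/2 particles: (|01> - |10>)/sqrt(2)
psi = np.array([0.0, 1.0, -1.0, 0.0]) / np.sqrt(2.0)
rho = np.outer(psi, psi)  # density matrix of the pair (a pure state)

def von_neumann_entropy(r):
    # S = -Tr(r ln r), computed from the eigenvalues of r
    p = np.linalg.eigvalsh(r)
    return -sum(x * np.log(x) for x in p if x > 1e-12)

# Reduced density matrix of one particle: partial trace over the other
rho_A = np.einsum('ikjk->ij', rho.reshape(2, 2, 2, 2))

print(von_neumann_entropy(rho))    # ~0: the pair as a whole has a definite state
print(von_neumann_entropy(rho_A))  # ~ln 2: one particle alone is maximally mixed
```

This reproduces the earlier point: zero total entropy for the system, positive entropy for each subsystem.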
 
  • #25
PeterDonis said:
There is no hard cutoff (see below), but the extremes are certainly easily distinguished...

Not a paradox, but a failure to realize that the term "gas", like most terms, does not have crisp, precise boundaries.

https://en.wikipedia.org/wiki/Sorites_paradox

To use the example given in that article, two grains of sand does not make a heap, but ##10^{22}## grains certainly would. Yet there is no precise number of grains of sand where you transition between "heap" and "not heap".

To be more precise, a more defensible view of the sorites paradox is that of fuzzy logic. An empty surface (or if you are contemplating gases, an empty volume) is not much like a heap, and a single grain is only a little more like a heap. Two grains are a lot more like a heap, especially if one is on top of the other, and three still more. Whether thirty or thirty million are much like a heap might depend on whether they are in a flat layer or not touching, etc. You could extend the analogy to gases by contemplating He molecules in your space under consideration. The view becomes more complicated as the conditions become more crowded, say when your pile of sand starts to fuse under gravitational pressure, becoming a single grain that we might elect to call a planet, or sufficiently cold helium under sufficient pressure liquefies. Those two items are once again less like gases, rather than absolutely unlike gases.

And thus the sorites paradox evaporates under the influence of fuzzy logic.

You could of course apply particular definitions to say why each of my limiting cases is not a pile, or not a gas, but then your sorites paradox suddenly revives in full health and vigour. You can't have it both ways.
 
  • #26
Wolfenstein3d said:
For ex. if two particles close to each other require n bits of info to describe them, why does it take n bits to describe them when they are far apart?
Wolfenstein3d said:
So if you have 2 hydrogen atoms bouncing around in an otherwise empty box, at a certain point when adding more H atoms the system becomes a gas?
It is not to do with any particular number of particles. Rather, it is how the observer chooses to group microstates into macrostates.
Suppose you have a box 1000 cells in each direction, and two particles that can independently be in any of the ##10^9## cells.
In general, there are ##10^{18}## possible states (if the particles are distinct), but it takes less information to describe an arrangement in which the two particles are within 10 cells of each other, in each dimension, because you can choose to describe it in terms of that block of 1000 cells (##10^9## possibilities) and their relative positions within the block (##10^6## possibilities). [Yes, I know that's grossly overcounting - it's more like ##2\times 10^{12}## all up.]
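A toy check of the grouping idea above (my own sketch, shrunk to one dimension so the counting stays exact): compare the bits needed to specify an arbitrary pair of positions with the bits needed once you already know the particles lie within ##d## cells of each other.

```python
import math

L, d = 1000, 10   # cells along one dimension, and the "closeness" window

total_states = L * L                       # any ordered pair of cells
close_states = sum(                        # ordered pairs with |x1 - x2| <= d
    min(x + d, L - 1) - max(x - d, 0) + 1  # cells within d of particle 1 at x
    for x in range(L)
)

print(math.log2(total_states))  # bits for an unconstrained pair
print(math.log2(close_states))  # fewer bits once "close" is known
```

The macrostate "the particles are close" covers far fewer microstates, so pinning down one of them takes less information.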
 
  • #27
The original post got me thinking about the free expansion of an ideal gas. If an ideal gas of N atoms that is thermally isolated from the surroundings expands freely to twice the starting volume, the temperature stays the same. That means we can calculate the thermodynamic entropy change as if the expansion were reversible and isothermal, and we get ##N k \ln 2##. We can also get the same result for the entropy change using Boltzmann's formula for statistical entropy.
If we now assume that we can describe this gas (as suggested by Wolfenstein3d) by a list of 6 floating-point numbers per particle (for position and velocity), the information needed to describe the gas doesn't change when the gas expands by a factor of 2.
What is the conclusion from this?
A: Describing a gas in this classical way is not correct (because the particles are indistinguishable, quantum mechanical, non-classical ...).
B: There is no connection between statistical entropy and information.
Other suggestions?

Now let's assume a gas of particles that can be described by 6N numbers just as in classical mechanics. Think of super-heavy, hypothetical atoms or atom clusters consisting of millions of protons and neutrons. It seems to me that something has to change now: the information needed to describe this system is clearly independent of the volume. If that means that the entropy is also constant during free expansion, then the temperature cannot be constant. The free expansion would have to be equivalent to a reversible adiabatic expansion, and the temperature would decrease.
Are there any systems like this where a free expansion leads to a temperature decrease?
 
  • #28
Philip Koeck said:
we get ##N k \ln 2##. We can also get the same result for the entropy change using Boltzmann's formula for statistical entropy.
If we now assume that we can describe this gas (as suggested by Wolfenstein3d) by a list of 6 floating-point numbers per particle (for position and velocity), the information needed to describe the gas doesn't change when the gas expands by a factor of 2.
If you try to use "floating point numbers", you only need one to fully describe the whole Earth and still have plenty of room in there, because a "floating point number" contains an infinite amount of information.
You should use rational or integer numbers, or any other numbers with fixed resolution. In that case, for each particle, you're adding exactly 1 bit of information: is the particle in the original volume, or in the added volume? After that, the information required to describe the position, speed, vibrations etc. is the same as before. So you get ##N k \ln 2## of entropy increase ("unfortunately" entropy is not measured in bits).
The Sackur-Tetrode formula may be of interest.

I'm not sure what to take from the second part of your post. Internal energy cannot decrease if you're not taking energy from the gas, but temperature of a non-ideal gas can. See Joule-Thomson effect.
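For completeness, a hedged sketch of the Sackur-Tetrode check (my own numbers: roughly one mole of helium at 300 K; the constants and formula are standard, the scenario is made up): doubling V at fixed T should add exactly ##N k \ln 2##.

```python
import math

k = 1.380649e-23      # Boltzmann constant, J/K
h = 6.62607015e-34    # Planck constant, J s

def sackur_tetrode(N, V, T, m):
    # S = N k [ln(V / (N lambda^3)) + 5/2], lambda = thermal de Broglie wavelength
    lam = h / math.sqrt(2.0 * math.pi * m * k * T)
    return N * k * (math.log(V / (N * lam**3)) + 2.5)

N, T, m = 6.022e23, 300.0, 6.646e-27   # ~1 mole of helium atoms
V = 0.0245                              # m^3, roughly 1 mole at 300 K and 1 atm

dS = sackur_tetrode(N, 2.0 * V, T, m) - sackur_tetrode(N, V, T, m)
print(dS / (N * k))   # = ln 2: one extra bit's worth of entropy per particle
```

The constant and wavelength terms cancel in the difference, leaving only the ##\ln(2V/V)## piece, i.e. one bit per particle.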
 
  • #29
SlowThinker said:
a "floating point number" contains an infinite amount of information.

No, it doesn't. If it did, it couldn't be represented by a finite number of bits, which our computers do all the time.

A floating point number contains some finite number of bits describing the mantissa, and some finite number of bits describing the exponent. That's a finite amount of information. The fact that the exponent varies the "resolution" doesn't mean there is infinite information in the number.

What you might be trying to describe here is real numbers, which do have "infinite resolution"; but real numbers are not the same as floating point numbers.
 
  • #30
PeterDonis said:
No, it doesn't. If it did, it couldn't be represented by a finite number of bits, which our computers do all the time.

What I actually meant by floating point was 4 byte fp as it is stored in the computer. Obviously those numbers are always rounded and have finite resolution. My argument is that if you can describe each particle by six such numbers at a certain time you have all the information you need to predict the positions and velocities of all particles at a later time. Obviously this prediction will be inaccurate due to the rounding. To me it seems that this information doesn't change drastically when the volume is suddenly doubled. At least for not too distant times the prediction will be almost as accurate. Clearly this way of measuring information depends on the distinguishability and trackability of the particles. It looks like information (measured as described) doesn't change a lot whereas entropy does (according to experiment). The conclusion could be that describing gas particles by 6 numbers each doesn't make sense because they are indistinguishable.
 
  • #31
SlowThinker said:
I'm not sure what to take from the second part of your post. Internal energy cannot decrease if you're not taking energy from the gas, but temperature of a non-ideal gas can. See Joule-Thomson effect.

Yes, of course. I didn't see that. For an ideal gas the temperature has to be constant during free expansion since the internal energy is constant. That would mean that a free expansion with constant entropy would contradict the first law.
 
  • #32
SlowThinker said:
If you try to use "floating point numbers", you only need one to fully describe the whole Earth and still have plenty of room in there, because a "floating point number" contains infinite amount of information.
You should use rational or integer numbers or any other numbers with fixed resolution. In that case, for each particle, you're adding exactly 1 bit of information: is the particle in the original, or in the added volume? After that, the information required to describe the position, speed, vibrations etc. is the same as before. So you get N k ln 2 of entropy increase ("unfortunately" entropy is not measured in bits).
This might be the answer I'm looking for. If I understand correctly you are saying that whenever I've described the particles with a limited number of bits I will always need 1 more bit per particle when the volume is suddenly doubled since I need to state whether the particle is at some position in the original half of the container or in a corresponding position in the new half. So a doubling of the volume changes the information needed to describe the whole gas by N bits, in agreement with the increase of entropy. This should be true no matter whether particles are distinguishable or not.
 
  • #33
Philip Koeck said:
What I actually meant by floating point was 4 byte fp as it is stored in the computer. Obviously those numbers are always rounded and have finite resolution. My argument is that if you can describe each particle by six such numbers at a certain time you have all the information you need to predict the positions and velocities of all particles at a later time. Obviously this prediction will be inaccurate due to the rounding. To me it seems that this information doesn't change drastically when the volume is suddenly doubled. At least for not too distant times the prediction will be almost as accurate. Clearly this way of measuring information depends on the distinguishability and trackability of the particles. It looks like information (measured as described) doesn't change a lot whereas entropy does (according to experiment). The conclusion could be that describing gas particles by 6 numbers each doesn't make sense because they are indistinguishable.
I would say the second part (starting with "To me it seems ...") of my post quoted above is wrong. The extra information needed when doubling the volume should actually be 1 bit per particle.
 
  • #34
Wolfenstein3d said:
Like entropy increases if two particles spread out. Why does that mean information has increased when the positions (xyz) only take 3 coordinates in both former and latter cases?
I might have an entirely classical argument that the amount of information needed to describe a collection of particles does increase when the available volume increases. The important point is how you measure the information needed to describe the particles. In classical mechanics you would say that you need six numbers (three coordinates and three velocity components) to completely specify one particle. For N particles you obviously need 6N numbers and it doesn't matter how big N is! If only elastic collisions happen you have all the information needed to predict where every particle will be and how it will move for the rest of time assuming that your numbers have infinite accuracy. The latter means, of course, that you have infinite information.
Now let's make it more realistic and assume that every coordinate and velocity component has limited accuracy. If we want to specify for example the x-coordinate of a particle in a box of size L in the x-direction we can start by saying whether it's in the left or the right half. That requires 1 bit of information. To improve the accuracy we can add 1 bit which specifies whether the particle is in the left or the right half of the half specified by the first bit. We continue adding bits until we've reached the required accuracy. The number of bits we needed is the information. Note that every number we can store in a computer has an inbuilt accuracy.
Now comes the key thought: If we now double the volume by making the length of the box 2L we need one extra bit in order to specify the x-coordinate of one particle with the same accuracy. That means the total information needed to specify the gas (or collection of particles) increases by N bits when the volume is doubled.
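The bisection argument above can be sketched directly (my own toy implementation; the box length and accuracy are arbitrary choices):

```python
import math

def bits_needed(box_length, dx):
    # Halve the interval (left half? right half?) until it is below dx;
    # each halving costs one bit of information.
    bits, interval = 0, box_length
    while interval > dx:
        interval /= 2.0
        bits += 1
    return bits

L, dx = 1.0, 1e-6          # box length and required position accuracy
b1 = bits_needed(L, dx)
b2 = bits_needed(2.0 * L, dx)
print(b1, b2, b2 - b1)     # doubling the box costs exactly one extra bit
```

With N particles, doubling the volume therefore costs N extra bits, matching the ##N k \ln 2## entropy increase discussed earlier.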
 

