Small scale entropy and information loss

In summary: the second law of thermodynamics states that in any natural process, the total entropy of a closed system never decreases. Entropy can be viewed as a measure of the hidden information within a system, and it is strictly a macroscopic concept. In a collision between two particles, momentum conservation constrains only the sum of the initial momenta, so the specific values of p1 and p2 cannot be recovered from the observed p3 and p4. This looks like an increase in entropy, even though the overall quantity of information stays the same. The discussion also raises the question of reversibility and Loschmidt's paradox.
  • #1
anorlunda
I'm struggling to understand the implications and origins of the 2nd law.

Entropy is such a slippery subject. Wikipedia has many, many definitions of entropy. I've been studying Professor Susskind's physics lectures, so I'm most interested in his favorite definition: that information is conserved, but entropy is a measure of hidden information. I also know that entropy is only supposed to be a macro concept, but I can't help thinking about it on the micro level.

Suppose we have two particles that come together and collide. Particle 1 has momentum p1 and particle 2 has momentum p2. After the collision the products are particle 3 with momentum p3 and particle 4 with momentum p4. It should work with any particles, but I'm thinking of electrons and photons.

Let us say that the collision conserves mass, energy and momentum, but I'll only consider momentum.

Conservation of momentum gives us the vector equation p3 + p4 = p1 + p2. p3 and p4 are observable. However, there are infinitely many pairs (p1, p2) that satisfy p3 + p4 = p1 + p2. The information about which specific p1 and p2 we started with is lost, or at least hidden. Right?
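
To make that degeneracy concrete, here is a minimal sketch (my own construction, not from the original post; the momentum values are illustrative) that samples a few of the infinitely many pre-collision pairs consistent with the observed p3 and p4:

```python
import numpy as np

# Observed post-collision momenta (arbitrary units); values are illustrative.
p3 = np.array([1.0, 0.5, 0.0])
p4 = np.array([-0.2, 0.3, 0.1])
p_total = p3 + p4  # the only quantity that conservation actually pins down

# Any choice of p1 fixes p2 = p_total - p1, so the solution set is a
# continuum parameterized by p1. Sample a few members of that set.
rng = np.random.default_rng(0)
for _ in range(3):
    p1 = rng.normal(size=3)      # an arbitrary candidate for p1
    p2 = p_total - p1            # p2 is then forced by conservation
    assert np.allclose(p1 + p2, p3 + p4)
    print(f"p1 = {p1}, p2 = {p2}")
```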

Does the entropy of this system increase because of the collision? On one hand, the quantity of information seems the same, i.e., the values of p1 and p2 pre-collision versus the values of p3 and p4 post-collision. However, the specific values of p1 and p2 cannot be observed post-collision. In other words, the number of bits needed to describe the momenta seems conserved, but the message encoded by the pre-collision bits is destroyed.
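
One way to make "hidden information" quantitative is to count how many discretized initial states are compatible with what is observed afterwards. The sketch below (a toy model of my own, not part of the thread) does this for 1-D integer momenta:

```python
import math

# Toy model: 1-D momenta restricted to integers in [-N, N].
N = 100   # half-width of the momentum grid
P = 7     # observed conserved total momentum p3 + p4

# Count ordered pairs (p1, p2) with p1 + p2 = P and both on the grid.
W = sum(1 for p1 in range(-N, N + 1) if -N <= P - p1 <= N)
print(f"compatible microstates W = {W}")
print(f"hidden information = log2(W) = {math.log2(W):.2f} bits")
```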

The next logical step is to consider time reversal symmetry and the reversibility of the 2nd law, which led me to Loschmidt's paradox. I confess the explanations in that article are beyond my understanding, so I'll limit my question.

Is this way of thinking about particles, information and entropy valid?
 
  • #2
In general, that classical (!) collision of point-like particles (!) will conserve the information content. Afterwards you don't know p1 and p2, but before, you didn't know p3 and p4, so you have the same amount of knowledge either way.
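
A minimal sketch of that claim (my construction, assuming a 1-D elastic collision; the thread's collision is left unspecified): because the classical dynamics are deterministic and invertible, time-reversing the outgoing momenta and colliding again recovers the initial state, so the information is hidden rather than destroyed:

```python
def elastic_collision(p1, p2, m1=1.0, m2=1.0):
    """Post-collision momenta for a 1-D elastic collision of masses m1, m2."""
    v1, v2 = p1 / m1, p2 / m2
    u1 = ((m1 - m2) * v1 + 2 * m2 * v2) / (m1 + m2)
    u2 = ((m2 - m1) * v2 + 2 * m1 * v1) / (m1 + m2)
    return m1 * u1, m2 * u2

p1, p2 = 2.0, -1.0
p3, p4 = elastic_collision(p1, p2)

# Time reversal: flip the outgoing momenta, collide again, flip back.
q1, q2 = elastic_collision(-p3, -p4)
print((-q1, -q2))  # prints (2.0, -1.0): the initial momenta are recovered
```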
 

FAQ: Small scale entropy and information loss

What is small scale entropy?

Small scale entropy is a measure of the randomness or disorder within a specific system or process. It is often used to analyze and quantify the amount of information contained within a system.

Why is small scale entropy important?

Small scale entropy is important because it quantifies the complexity and unpredictability of a system. It can be used to identify patterns or anomalies and helps in understanding the system's overall behavior.

How is small scale entropy calculated?

Calculating small scale entropy involves determining the probability of each state or event within the system and then combining those probabilities, e.g., with the Shannon formula H = -Σ p_i log2(p_i). The probabilities can be obtained analytically or estimated from simulation data.
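
As a hedged illustration of that recipe (the state labels and data below are invented for the example), here is a Shannon-entropy estimate from empirical state probabilities:

```python
import math
from collections import Counter

# Invented observations of a system's states; replace with real data.
observations = ["A", "B", "A", "C", "A", "B", "B", "A"]

counts = Counter(observations)
total = len(observations)

# Shannon entropy: H = -sum(p_i * log2(p_i)) over the observed states.
H = -sum((n / total) * math.log2(n / total) for n in counts.values())
print(f"H = {H:.3f} bits")  # 0 for a deterministic system; log2(k) for k equally likely states
```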

What is information loss?

Information loss is the reduction or elimination of information within a system. This can occur due to various factors such as data compression, system noise, or error-prone communication channels.

How does small scale entropy relate to information loss?

Small scale entropy is often used to measure the amount of information contained within a system, so a decrease in small scale entropy can indicate that information has been lost. It is important to note, however, that entropy is not the only quantity relevant to information loss.
