nouveau_riche said: Can a particle hold its information at absolute zero?
Bill_K said: Temperature is a property of multiparticle systems. A single particle does not have a temperature.
Demystifier said: Your question cannot be answered without first defining information.
01030312 said: Let's assume information is everything we know about the system. Then cooling down to absolute zero will decrease the entropy, and information will increase. If you are curious about the link between information and thermodynamics, check out Landauer's principle: "information is physical".
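For reference, Landauer's bound says erasing one bit dissipates at least k_B·T·ln 2 of energy, so the cost shrinks linearly as T approaches absolute zero. A quick sketch (not from the thread):

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

def landauer_limit(temperature_kelvin):
    """Minimum energy (J) dissipated to erase one bit at temperature T."""
    return K_B * temperature_kelvin * math.log(2)

# At room temperature erasing a bit costs at least ~3e-21 J;
# the bound shrinks linearly as T approaches absolute zero.
for T in (300.0, 1.0, 0.001):
    print(f"T = {T:>7} K  ->  {landauer_limit(T):.3e} J per bit")
```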
Elemental Jen said: Since when has any particle achieved absolute zero?
Theoretically, if say an atom reached absolute zero, it would hold information. It wouldn't be very much information, though, as only the ground quantum state would be occupied and could be measured.
01030312 said: Then, as is usual in thermodynamics, this passage will lead to some information-containing objects being 'randomized', or lost. It is a basic behaviour of chaotic systems, like colliding spheres. Now, as you know, being at finite temperature, this gate increases the entropy of the system. So how do we connect this with information?
Here, something known as algorithmic information theory is used. This theory describes which numbers are specifiable by a human and which are not. Its application to thermodynamics is that in a chaotic system, like a box of molecules, a state will soon be reached that is not specifiable by humans. So your information will fly away, and many possible microstates will represent one macrostate. Thus the lack of knowledge, or entropy, of the system will increase.
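The many-microstates-per-macrostate idea can be illustrated with a toy model (a sketch, not something from the thread): treat the total number of heads among N coins as the macrostate; the entropy then measures how many microstates hide behind it.

```python
from math import comb, log2

# Toy illustration: the "macrostate" is the total number of heads among
# N coins; the microstates are the individual arrangements. The entropy
# is our lack of knowledge of which microstate the system is in.
N = 20
for heads in (0, 5, 10):
    microstates = comb(N, heads)
    print(f"{heads:>2} heads: {microstates:>6} microstates, "
          f"entropy = {log2(microstates):5.2f} bits")
```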
This should not be too casually compared with Shannon's theory of information, where a particular message is "neglected" in favour of the whole family of messages produced by a source, and this is defined as the information content.
Finally, it is difficult to say what effect algorithmic information theory will have at absolute zero in a quantum system, since a theory of quantum chaos is still absent.
What you are highlighting is the effect of interconnection between systems, which spreads to the universe.

01030312 said: It starts this way: consider all particles at an initial moment 0. Suppose their coordinates are collectively specified by a set of numbers. Choose one of the numbers, which denotes one of the particles. Algorithmic information theory says that almost all numbers are random, that is, they can't be specified by any conceivable machine to arbitrary accuracy. So you must specify an approximation of the number you chose; this necessity occurs with 'probability 1'.
Suppose you do it. Then the numbers differ at some decimal place; the difference may look like 0.000...(n zeros)...93743749... Now the particle starts colliding, and these collisions are chaotic, that is, nearby trajectories deviate exponentially over some time interval. This means the difference above gains one decimal place each moment, moving toward 1.000000... After n time intervals the difference will become 0.93743749... But the number I just wrote is uncomputable, hence random (you can see the contradiction if it were computable). Thus the path of the particle becomes unknown/random in a strict mathematical sense. This is what is meant by loss of information in a thermodynamic system. In the end you will be able to talk about macrostates only.
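The digit-loss argument can be sketched numerically with a standard chaotic toy model, the doubling map x → 2x mod 1 (substituted here for the colliding-spheres example): each iteration shifts the binary expansion of x one digit to the left, consuming one digit of the initial condition per step, so a tiny initial separation grows to order one.

```python
# Doubling map: each iteration shifts the binary expansion of x one
# digit to the left and drops the integer part, so one bit of the
# initial condition is consumed per step.
def doubling_map(x):
    return (2.0 * x) % 1.0

x, y = 0.3, 0.300001   # two trajectories differing in the sixth decimal
print(f"initial separation: {abs(x - y):.1e}")
for _ in range(19):
    x, y = doubling_map(x), doubling_map(y)
print(f"after 19 steps:     {abs(x - y):.3f}")
```

After 19 doublings the separation has grown by roughly 2^19, so the sixth-decimal difference has become macroscopic and the two trajectories are effectively unrelated.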
Another type of information loss is the following: assume a particle is coming towards you. Suddenly it enters a jar of gas in thermal equilibrium. Then the things described in the above paragraph will happen, and this information will be lost. Analogous things happen in computers, which is why a computer dissipates heat. This type of information loss was studied by Charles Bennett and is part of the principle that "information is physical" (the statement is due to Rolf Landauer). Charles Bennett then showed that information loss can be avoided, which is the basis of reversible computation.
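Bennett's point can be sketched with logic gates (a sketch, not from the thread): an AND gate maps two input bits to one output bit and so destroys information, while the Toffoli gate, the standard reversible building block, is a bijection on three bits and its own inverse.

```python
# Irreversible gate: AND maps two inputs to one output, so the inputs
# cannot be recovered from the output -- (0,1) and (1,0) both give 0.
def and_gate(a, b):
    return a & b

# Reversible gate: the Toffoli gate flips the third bit only when the
# first two are 1. It is a bijection on 3 bits and its own inverse,
# so no information is erased and no Landauer cost is forced.
def toffoli(a, b, c):
    return a, b, c ^ (a & b)

# Verify reversibility on all 8 possible inputs.
for bits in [(a, b, c) for a in (0, 1) for b in (0, 1) for c in (0, 1)]:
    assert toffoli(*toffoli(*bits)) == bits
print("Toffoli is reversible on all 8 inputs")
```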
Absolute zero is the lowest possible temperature that can be achieved, at which point the particles of matter have minimal kinetic energy and cannot be further cooled. It is equivalent to 0 Kelvin or -273.15 degrees Celsius.
At absolute zero, entropy (or disorder) is at its minimum, meaning there is essentially no randomness or uncertainty about the system's microstate: a perfect crystal at absolute zero sits in a single ground state. This makes it an important concept in information theory, as it represents the theoretical limit of information storage and processing.
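A minimal sketch of entropy vanishing at absolute zero, using a two-level quantum system with Boltzmann occupation probabilities (the energy gap below is an arbitrary illustrative value, not a physical constant):

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

def two_level_entropy(gap, temperature):
    """Gibbs entropy (J/K) of a two-level system with the given energy gap (J)."""
    if temperature == 0:
        return 0.0  # only the ground state is occupied: a single microstate
    e = math.exp(-gap / (K_B * temperature))  # Boltzmann factor, safe for large gaps
    p_excited = e / (1.0 + e)
    s = 0.0
    for p in (1.0 - p_excited, p_excited):
        if p > 0:
            s -= K_B * p * math.log(p)
    return s

GAP = 1e-22  # J, an arbitrary illustrative energy gap
for T in (10.0, 1.0, 0.1):
    print(f"T = {T:>4} K  ->  S = {two_level_entropy(GAP, T):.3e} J/K")
```

As T falls, the excited-state probability dies off exponentially and the entropy drops toward zero: only the ground state remains occupied.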
It is impossible to reach absolute zero in practice, as it would require removing all energy from a system, which is not physically feasible. However, scientists have been able to cool matter to within a fraction of a degree above absolute zero.
Near absolute zero, matter can take on unusual properties: some materials become superconducting, exhibiting zero electrical resistance, and certain dilute gases undergo a phase transition into a new state of matter, the Bose-Einstein condensate.
The third law of thermodynamics states that it is impossible to reach absolute zero in a finite number of steps. This is because, as a system approaches absolute zero, the work required to remove the remaining heat grows without bound. This law has significant implications for the behavior of matter at extremely low temperatures.
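The divergence can be made concrete with the textbook ideal-refrigerator (Carnot) bound, a sketch rather than anything from the thread: pumping one joule of heat out of a cold reservoir at T_cold into a room at T_hot requires at least W = (T_hot − T_cold)/T_cold joules of work, which blows up as T_cold → 0.

```python
# Why the last step to absolute zero is unreachable: for an ideal
# (Carnot) refrigerator rejecting heat to a room at T_HOT, the work
# needed per joule of heat removed from the cold side is
# (T_HOT - t_cold) / t_cold, which diverges as t_cold -> 0.
T_HOT = 300.0  # K, the environment temperature (illustrative choice)

def work_to_extract_one_joule(t_cold):
    return (T_HOT - t_cold) / t_cold  # joules of work per joule of heat

for t in (77.0, 4.2, 0.1, 0.001):
    print(f"T_cold = {t:>6} K  ->  {work_to_extract_one_joule(t):.3e} J of work per J removed")
```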