I was reading about entropy, the Poincaré recurrence theorem, and the arrow of time yesterday, and I got some ideas and questions I'd like to share here...
Let's think about a system that is a classical ideal gas of point particles confined in a cubic box. Suppose that at time ##t=0## all the particles are in the same half of the box. The gas will quickly spread to fill the whole box uniformly, consistent with the second law of thermodynamics. However, from the Poincaré recurrence theorem we know that after some huge time interval, the system will again temporarily be in approximately the initial situation, with all the gas occupying the same half of the container.
Or take an even simpler example: a set of uncoupled harmonic oscillators, each with one degree of freedom. The time evolution of the oscillators can be described by a set of sine functions of ##t##, say
##\sin (t), \sin (\alpha t), \sin (\alpha^2 t), \sin (\alpha^3 t), \ldots , \sin (\alpha^n t)##
where ##\alpha## is an irrational number larger than ##1##, chosen to prevent the phase-space trajectory of the system from being periodic. To keep this a purely mathematical toy, I'm not even giving the functions a dimensional amplitude.
At time ##t=0##, all the functions have the value zero. We can intuitively call this initial state a "state of low entropy", even though there really isn't such a thing as the entropy of a single microstate. As ##t## increases, the sine functions quickly take all kinds of values between ##-1## and ##1##, and you wouldn't immediately recognize any correlation between the values unless you knew the form of the functions above. The system has now gone to a "state of high entropy".
After some time, whose length depends on ##n##, the number of oscillators, the system will be almost back in the initial state, because of Poincaré recurrence.
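The toy system above can be written down in a few lines (##\alpha = \sqrt{2}## and ##n = 8## are just arbitrary choices for illustration):

```python
import numpy as np

# The n oscillators of the toy model: frequencies 1, alpha, alpha^2, ...
# with alpha = sqrt(2) as one arbitrary irrational choice.
alpha = np.sqrt(2.0)
n = 8
freqs = alpha ** np.arange(n)

def state(t):
    """The microstate at time t: the n oscillator values sin(alpha^k * t)."""
    return np.sin(freqs * t)

print(state(0.0))    # the special initial state: all values are zero
print(state(100.0))  # later: values scattered through [-1, 1]
```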
Now, I have two questions:
1. How could one define an entropy-like variable for a single microstate (i.e. the set of values of those sine functions at some ##t##) that would have intuitively right properties, such as a small value when all the values are equal and a high value when they appear pseudorandom?
2. Is it possible to assign the oscillators a set of phase velocities, initial phases, and amplitudes such that an accidental state of "low entropy" never occurs, at any value of ##t##? I.e. the values of the sine functions would be an apparently pseudorandom mix of values between ##-1## and ##1## at every moment of time. Or is the phase-space trajectory of such a system a "space-filling curve" that eventually explores all possible states with the same total energy, even low-entropy ones?
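To make question 1 concrete, here is the kind of thing I have in mind, sketched in Python: a binned Shannon entropy of the instantaneous values. The bin count and range are arbitrary choices of this toy definition, not anything canonical:

```python
import numpy as np

def microstate_entropy(values, bins=10):
    """Binned Shannon entropy of a set of instantaneous oscillator values.

    Zero when all values fall into one bin (e.g. all equal), and close to
    log(bins) when the values are spread evenly over [-1, 1].  The bin
    count is an arbitrary choice of this toy definition.
    """
    counts, _ = np.histogram(values, bins=bins, range=(-1.0, 1.0))
    p = counts[counts > 0] / counts.sum()
    return -np.sum(p * np.log(p))

print(microstate_entropy(np.zeros(100)))            # 0.0: all values equal
print(microstate_entropy(np.linspace(-1, 1, 100)))  # ~log(10), values spread out
```

This has the two intuitive properties I asked for, but it is insensitive to correlations between the oscillators, so I suspect it is not the whole story.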
This is just a thought experiment in which I'm trying to construct a system where nothing ever happens that would give the arrow of time a preferred direction...
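As a crude numerical way of poking at question 2, one could give the oscillators random initial phases and scan a finite time grid, tracking how much the values ever bunch together again. The frequencies, the random phases, and the grid here are all arbitrary assumptions, and a finite scan can of course never rule out a recurrence at some later time:

```python
import numpy as np

# Probe question 2: random initial phases (an arbitrary assumption), then
# scan a finite time grid and record how bunched the n values ever get.
rng = np.random.default_rng(0)
n = 8
freqs = np.sqrt(2.0) ** np.arange(n)
phases = rng.uniform(0.0, 2.0 * np.pi, n)

ts = np.linspace(0.0, 10000.0, 200001)
vals = np.sin(ts[:, None] * freqs + phases)   # shape (len(ts), n)
spread = np.ptp(vals, axis=1)                 # max minus min value at each time

# A minimum spread near 0 means the trajectory came close to a bunched,
# "low entropy" state even with these phases.
print(spread.min())
```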