sshai45
Hi.
I have heard that entropy is often called "disorder" but _isn't really so_, and I am even more puzzled by the connection and difference between entropy from the information-theory point of view and from the thermodynamic point of view. I see stuff like this:
http://arstechnica.com/civis/viewtopic.php?f=2&t=3122
See the posts by "kmellis": he says that entropy in thermodynamics is NOT informational disorder, while simultaneously advocating an "information theoretic" basis for physics. How the f--- do you do that? And if thermodynamic entropy != informational entropy, how do the two relate (or fail to relate) in such a framework?
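For concreteness, the two definitions I'm comparing (if I have them right) are the Shannon entropy and the Gibbs entropy over the same probabilities $p_i$:

$$ H = -\sum_i p_i \log_2 p_i, \qquad S = -k_B \sum_i p_i \ln p_i, $$

so $S = (k_B \ln 2)\, H$. As far as I can tell they differ only by a constant factor, which I assume is what an "information theoretic basis for physics" would lean on -- yet kmellis still insists the two entropies aren't the same thing.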
I'm curious: how can one prove mathematically that the entropy change in converting an "ordered" stack of identical objects -- a simplified version of an often-given and apparently invalid example -- into a "messy"-looking pile is identically zero, assuming all other variables (temperature, etc.) are held mathematically constant, so that literally nothing but the rearrangement is going on? I know this is highly idealized, but that's the point: to isolate the "disordering" in the common man's sense and show that it has absolutely zero effect on the entropy of the whole system of objects. What I am wondering is why there couldn't be some immeasurably small but nonzero entropy change -- after all, you are rearranging the matter in the system, just not by a very great degree when you think of things on a microscopic scale.
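Here is my rough attempt at the bookkeeping, just to make clear what I'm asking (the counts $\Omega_{\text{ordered}}$ and $\Omega_{\text{messy}}$ are my own notation, and I may well be setting this up wrong). Using Boltzmann's formula,

$$ S = k_B \ln \Omega, \qquad \Delta S = k_B \ln \frac{\Omega_{\text{messy}}}{\Omega_{\text{ordered}}}, $$

if the thermal microstates available to each object are unchanged by the rearrangement, then $\Omega_{\text{messy}} = \Omega_{\text{ordered}}$ and $\Delta S = 0$ exactly. Even if one insists on counting the macroscopic placements of $N$ objects as extra microstates, the contribution seems bounded by

$$ \Delta S \lesssim k_B \ln N! \approx k_B N (\ln N - 1), $$

which for, say, $N = 100$ objects is about $5 \times 10^{-21}$ J/K -- some twenty orders of magnitude below typical molar thermal entropies (tens of J/K). Is that the right way to think about the "immeasurably small but not zero" possibility, or is the configurational term exactly zero?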