I actually made a slight mistake in my characterization of Hurkyl's example. We shouldn't use a density operator to express our knowledge of the system post-measurement but pre-acknowledgment of the measured value. Rather, we should use a classical discrete pdf over the two "collapsed" modes. The reason is that an entropy calculation on the density operator would yield non-zero entropy, whereas having access to a record of the measurement outcome implies zero entropy. It should be even more closely parallel to the die case. In the case where we actually do use a density operator, we should be sure that the classical probabilities stem from correlation (entanglement) of the observable with variables in the entropy dump, which are thereby inaccessible. The classical probabilities in the density operator stem from tracing over the inaccessible parts of the environment.
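To make this concrete, here is a minimal numpy sketch of the distinction (the Bell-state setup and the helper names are my own illustration, not anything specific from Hurkyl's post): tracing over an inaccessible second qubit gives a mixed reduced density operator with entropy ln 2, while possessing a record of the outcome leaves a pure collapsed state with zero entropy.

```python
import numpy as np

def von_neumann_entropy(rho):
    """S(rho) = -Tr(rho ln rho), computed from eigenvalues."""
    evals = np.linalg.eigvalsh(rho)
    evals = evals[evals > 1e-12]  # drop numerical zeros
    return -np.sum(evals * np.log(evals))

# Bell state (|00> + |11>)/sqrt(2): observable entangled with the "entropy dump"
phi = np.array([1.0, 0.0, 0.0, 1.0]) / np.sqrt(2)
rho_full = np.outer(phi, phi)

# Trace over the inaccessible second qubit -> mixed state I/2
rho_sys = rho_full.reshape(2, 2, 2, 2).trace(axis1=1, axis2=3)
print(von_neumann_entropy(rho_sys))        # ln 2 ~ 0.693: non-zero entropy

# With a record of the outcome in hand, the state is one collapsed mode
rho_collapsed = np.diag([1.0, 0.0])
print(von_neumann_entropy(rho_collapsed))  # 0: pure state, zero entropy
```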
I believe this is just a technical qualification of the notation; we could choose to use the same mathematical object, the density operator, to represent both kinds of "classical" probabilities, but with the type distinction made explicit. One is dealing with the same kinds of entropy issues that arise in interpreting probabilities in a purely classical setting. Maxwell's demon is starting to rear his little pointy head.
I'll have to consider what this means in terms of my understanding of entropy. Hmmm...
I think essentially the CI must extend into the classical realm as well. We should understand entropy as expressing not a property of the system but a property of our knowledge about the system... said knowledge being, of course, linked via the empirical epistemology of physics to the physical constraints we impose in defining (physically setting up an example of) the system.
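A toy sketch of this epistemic reading, back in the die case (my own illustration): the die itself is unchanged between the two calls below; only our knowledge of it changes, and the entropy tracks the knowledge.

```python
import numpy as np

def shannon_entropy(p):
    """H(p) = -sum p ln p over the nonzero entries of a discrete pdf."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return -np.sum(p * np.log(p))

# Before we look: our knowledge of a fair die is the uniform pdf
print(shannon_entropy([1/6] * 6))           # ln 6 ~ 1.79

# After we look: same die, same physics, but our knowledge is sharp
print(shannon_entropy([1, 0, 0, 0, 0, 0]))  # 0
```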
Hmmm... this would make the second law of thermodynamics an abstract law about information. I'll meditate on this.