Why is Decoherence Needed in Quantum Mechanics?

In summary, decoherence is a natural part of quantum mechanics and is not unique to any specific interpretation. It explains how a macroscopic object can behave classically even though it is made up of quantum particles. While experiments in quantum optics may not be the best example for understanding decoherence, studying other systems such as an electric RLC circuit can provide a clearer picture. Decoherence can be modeled using a Hamiltonian and a Lindbladian, which accounts for the decay of the system.
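The Hamiltonian-plus-Lindbladian modeling mentioned above can be sketched numerically. This is a minimal illustration, not a model from the thread: it assumes a single qubit undergoing pure dephasing (jump operator ##\sigma_z##, rate ##\gamma##) and uses a crude Euler integrator; all parameter values are arbitrary choices for the example.

```python
import numpy as np

# Pauli-Z as the dephasing jump operator
sz = np.diag([1.0, -1.0]).astype(complex)

def lindblad_step(rho, H, L, gamma, dt):
    """One Euler step of drho/dt = -i[H, rho] + gamma*(L rho L^+ - (1/2){L^+L, rho})."""
    comm = -1j * (H @ rho - rho @ H)
    LdL = L.conj().T @ L
    diss = gamma * (L @ rho @ L.conj().T - 0.5 * (LdL @ rho + rho @ LdL))
    return rho + dt * (comm + diss)

# Qubit starting in the superposition (|0> + |1>)/sqrt(2)
psi = np.array([1.0, 1.0], dtype=complex) / np.sqrt(2)
rho = np.outer(psi, psi.conj())

H = 0.5 * 1.0 * sz                  # Hamiltonian (omega = 1, hbar = 1)
gamma, dt, steps = 0.2, 1e-3, 5000  # dephasing rate, step size, 5 time units

for _ in range(steps):
    rho = lindblad_step(rho, H, sz, gamma, dt)

# Populations are untouched; the coherences (off-diagonals) decay as exp(-2*gamma*t)
t = dt * steps
print(abs(rho[0, 1]))              # ~ 0.5 * exp(-2*gamma*t)
print(0.5 * np.exp(-2 * gamma * t))
```

The decaying off-diagonal element is exactly the "decay of the system" the summary refers to: the density matrix approaches a classical mixture in the ##\sigma_z## basis while the populations stay fixed.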
  • #71
zonde said:
As I understand it, wavefunction collapse originally referred to a single quantum system, and the outcome of collapse was a single eigenstate of an operator. In that case it mostly covers what I had in mind.
But in the context of decoherence, wave function collapse has somehow morphed into something else: in particular, how a superposition of states is converted into a density matrix. So it describes ensembles rather than individual systems.

I like to divide what you're talking about here into two parts.

First, there is the question of why an entirely deterministic theory only allows us to make probabilistic predictions of possible outcomes. From a MWI perspective, the answer is trivial: measurement involves splitting into separate worlds, and the probabilities of QM reflect that we find ourselves in only one of those.

Second, there is the question of why we see the part of the Hilbert space that we do as real and the rest as other possible futures. This is a fascinating discussion in its own right, but I'm not touching it on this forum. It causes too much upset and the forum isn't equipped to moderate such a discussion. Here, just accept it axiomatically.
 
Last edited:
  • #72
Nugatory said:
You're asking about a marble. A marble is not a point particle described by a wave function ##\psi(x)## such that the probability of finding it at position ##x## is given by ##|\psi(x)|^2##. Instead, a marble is made up of about ##5\times{10}^{22}## molecules of silicon dioxide (I get that number by assuming that the marble is solid glass, weighs about ten grams, and rounding off enough to do the arithmetic in my head) and its wave function is the product of the wave functions of each of these individual particles in the position basis.

I thought the preferred basis was defined over the whole classical object. Are you sure it's for each individual particle? But if it is, wouldn't the parts of the object jump into mismatched positions, for example the cat's eye ending up in its foot? Can anyone else confirm whether the preferred basis is for the whole macroscopic object or for individual particles?
What is the chance of all of these molecules randomly jumping in the same direction by the same amount at the same time? You are as likely to see a scrambled egg unscramble itself and separate back into white and yolk as you stir it.

This situation really isn't that different from the way that in classical mechanics the random movements of gas molecules average out in such a way that (for example) any reasonably-sized volume of gas obeys Boyle's Law. In this case, the randomness that you're considering is quantum mechanical in origin, but it still averages out the same way.
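The averaging being described can be illustrated with a toy simulation. This is not Nugatory's calculation; the molecule count and displacement scale below are arbitrary stand-ins (a real marble has ~##5\times10^{22}## molecules, far beyond what we can simulate, but the scaling is the same): the center-of-mass shift of ##N## independently jittering molecules shrinks like ##1/\sqrt{N}##.

```python
import numpy as np

rng = np.random.default_rng(0)

sigma = 1.0           # per-molecule random displacement scale (arbitrary units)
n_molecules = 10_000  # far fewer than a real marble's ~5e22, same scaling law
n_trials = 500

# Center-of-mass shift in each trial: the mean of N independent displacements
com_shifts = rng.normal(0.0, sigma, size=(n_trials, n_molecules)).mean(axis=1)

observed = com_shifts.std()
predicted = sigma / np.sqrt(n_molecules)  # central-limit scaling: sigma / sqrt(N)
print(observed, predicted)                # both ~ 0.01
```

Extrapolating the same ##1/\sqrt{N}## law to ##5\times10^{22}## molecules makes the collective jump about ##10^{11}## times smaller than a single molecule's jitter, which is why the marble never visibly moves.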

Thus stevendaryl's point that "in general, the only basis that is feasible to use is the one in which the objects have definite positions at all times (at least up to some trivial uncertainty)".

Better put a smiley next to that... or you'll find someone taking it seriously. :smile:

I think it's my sister taking some of my food to feed the cat :(
 
  • #73
craigi said:
Firstly, is the question of why an entirely deterministic theory only allows us to give probabilistic predictions of possible outcomes.
The answer is simple: the theory is not entirely deterministic. It contains the Born rule.
craigi said:
From a MWI perspective, the answer is trivial, in that measurement involves splitting into separate worlds and the probabilities of QM reflect that we find ourselves in only one of those.
MWI is a theory that is nonlocal (the split happens nonlocally), non-realistic (it lives in some sort of hyper-reality), and non-deterministic (our past does not form a causal chain), and it can't be falsified. So I don't really care about the answers it gives. Sorry.
 
  • #74
zonde said:
Answer is simple - because the theory is not entirely deterministic.It contains Born rule.

But there is a certain sense in which that rule is not understood, physically. In classical probability, a probability is interpreted either as uncertainty about the facts of the present (for example, we treat thermodynamics probabilistically because we don't know the precise positions and velocities of all [itex]10^{whatever}[/itex] particles) or as uncertainty about future choices (a stochastic process). Both options lead to problems for QM. The first option, that QM probability reflects lack of information about the detailed current state, is made awkward by Bell's theorem: there can't be such "hidden variables" unless there are nonlocal interactions. The second option, that QM probability reflects a stochastic process, is also difficult to make sense of, for two reasons:

  1. The "choice" made by such a stochastic process would necessarily be nonlocal (again, according to Bell's theorem).
  2. If you seriously consider stochastic evolution to be real, then that's in contradiction with evolution according to Schrodinger's equation. So that would mean two kinds of evolution. There is no theory of how that could work, i.e. what determines whether the stochastic process or the unitary, smooth evolution takes place. There are ad hoc suggestions, such as "consciousness collapses the wave function", but those are too fuzzy to actually be part of a proper scientific theory.
To me, there are inherent problems with interpreting the Born rule in light of smooth deterministic evolution of the wave function, regardless of whether you do that interpretation from within a MWI model or some other model.
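For concreteness, here is a minimal sketch of the two kinds of evolution being contrasted: deterministic unitary (Schrodinger) evolution of the state vector, followed by Born-rule sampling of outcomes with probabilities ##|c_i|^2##. The specific state and rotation below are arbitrary choices for illustration, not anything from the thread.

```python
import numpy as np

rng = np.random.default_rng(1)

# A qubit state written in some fixed measurement basis
state = np.array([np.sqrt(0.3), np.sqrt(0.7)], dtype=complex)

# Kind 1 -- unitary evolution: deterministic and norm-preserving
theta = 0.4
U = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]], dtype=complex)
evolved = U @ state                  # still a definite vector, no randomness yet

# Kind 2 -- the Born rule: probabilities are |amplitude|^2, outcomes are sampled
probs = np.abs(evolved) ** 2
assert np.isclose(probs.sum(), 1.0)  # unitarity keeps total probability at 1

outcomes = rng.choice(2, size=100_000, p=probs)
freq = np.bincount(outcomes, minlength=2) / outcomes.size
print(probs, freq)                   # observed frequencies match |amplitude|^2
```

The tension in the post is that step 1 is smooth and deterministic while step 2 is an irreducibly random jump, and nothing in the formalism itself says when to switch from one to the other.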
 
  • #75
stevendaryl said:
But there is a certain sense in which that rule is not understood, physically. In classical probability, a probability is interpreted either as uncertainty about the facts of the present (for example, we treat thermodynamics probabilistically because we don't know the precise positions and velocities of all [itex]10^{whatever}[/itex] particles) or as uncertainty about future choices (a stochastic process). Both options lead to problems for QM. The first option, that QM probability reflects lack of information about the detailed current state, is made awkward by Bell's theorem: there can't be such "hidden variables" unless there are nonlocal interactions. The second option, that QM probability reflects a stochastic process, is also difficult to make sense of, for two reasons:

  1. The "choice" made by such a stochastic process would necessarily be nonlocal (again, according to Bell's theorem).
  2. If you seriously consider stochastic evolution to be real, then that's in contradiction with evolution according to Schrodinger's equation. So that would mean two kinds of evolution. There is no theory of how that could work, i.e. what determines whether the stochastic process or the unitary, smooth evolution takes place. There are ad hoc suggestions, such as "consciousness collapses the wave function", but those are too fuzzy to actually be part of a proper scientific theory.
There is no way to get rid of nonlocality in any realistic model.
Unless, of course, you consider the possibility that QM has overlooked two different photon-matter interaction aspects that could create two different loopholes in photon Bell tests.

stevendaryl said:
To me, there are inherent problems with interpreting the Born rule in light of smooth deterministic evolution of the wave function, regardless of whether you do that interpretation from within a MWI model or some other model.
So, what is your approach to this problem?
 
  • #76
zonde said:
There is no way how to get rid of nonlocality in any realistic model.

And there is no very satisfactory "non-realistic" model, either.

So, what is your approach to this problem?

I don't have an answer. But I object to people picking on MWI, because the difficulties with MWI are difficulties with QM, PERIOD. All the alternatives are unpalatable for various reasons.
 
  • #77
stevendaryl said:
I don't have an answer. But I object to people picking on MWI, because the difficulties with MWI are difficulties with QM, PERIOD. All the alternatives are unpalatable for various reasons.

Here's an answer:
http://arxiv.org/pdf/1405.7907v3.pdf
I'd recommend reading the Tegmark and Aguirre paper first.

It may just be that quantum theory can't be a complete theory within our classical view of the cosmological horizon.

Alternatively, the Born rule can be derived under the Quantum Bayesian interpretation.
 
Last edited:
  • #78
craigi said:
Alternatively, the Born rule can be derived under the Quantum Bayesian interpretation.

Its fundamental basis is non-contextuality, so you can apply Gleason. Quantum Bayesianism won't allow you to bypass that - in fact it specifically makes that assumption. Proponents of it like Fuchs use arguments very similar to post 137 of the following thread:
https://www.physicsforums.com/threads/the-born-rule-in-many-worlds.763139/page-7

Thanks
Bill
 
  • #79
stevendaryl and the rest, there is a very obvious solution to the measurement problem; it was proposed many decades ago by Bohm. It basically states:

Explicate Order: Collapsed
Implicate Order: Uncollapsed (superposition)

This is different from Many Worlds, Copenhagen, Bohmian Mechanics, etc.

Why is this not taken seriously? Bohm said everything comes from the implicate order. When it unfolds into the explicate order, collapse occurs. Before collapse, everything is in the implicate order. This is a good physical picture for QM in general and wave function collapse in particular. What is the flaw in Bohm's view? What papers have been written about this? If none, why not? This is no more bizarre than Many Worlds, come to think of it.
 
  • #80
lucas_ said:
stevendaryl and the rest, there is a very obvious solution to the measurement problem; it was proposed many decades ago by Bohm. It basically states:

Explicate Order: Collapsed
Implicate Order: Uncollapsed (superposition)

This is different from Many Worlds, Copenhagen, Bohmian Mechanics, etc.

Why is this not taken seriously? Bohm said everything comes from the implicate order. When it unfolds into the explicate order, collapse occurs. Before collapse, everything is in the implicate order. This is a good physical picture for QM in general and wave function collapse in particular. What is the flaw in Bohm's view? What papers have been written about this? If none, why not? This is no more bizarre than Many Worlds, come to think of it.

There are two main reasons that it is unpalatable to physicists: first, it opposes reductionism; second, it assigns consciousness a special role.
 
  • #81
Still about decoherence. I am trying to sort out the individual-system versus ensemble view of coherence.

We could think that coherence is an ensemble property because, in order to observe interference, all clicks have to contribute to the same interference pattern. But this only says that the complex phase difference is fixed between the states in superposition. And if the state meaningfully describes a single particle, then any individual click can be viewed as physically independent, and the presence of an interference pattern says nothing about relations present in the ensemble.
But it seems to me that homodyne detection shows something more with respect to my question. A coherent state shows a shift of the bump in the Wigner function (http://www.iqst.ca/quantech/wiggalery.php). As the Wigner function is reconstructed from many measurements over two ensembles, there should be relations within the ensembles in order to see asymmetric changes in the picture. And so coherence should be about relations within the ensemble.

So does it make sense to say that the Wigner function demonstrates relations within an ensemble, and not just relations in a statistical collection of physically independent events?
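For reference, the "shift of the bump" has a simple closed form. In units with ##\hbar = 1##, the Wigner function of a coherent state is the vacuum Gaussian displaced to a point ##(x_0, p_0)## in phase space. Here is a quick numerical check of that statement; the grid extent and displacement are arbitrary choices for the example.

```python
import numpy as np

def coherent_wigner(x, p, x0, p0):
    """Wigner function of a coherent state (hbar = 1): a Gaussian bump
    displaced to (x0, p0) in phase space."""
    return np.exp(-(x - x0) ** 2 - (p - p0) ** 2) / np.pi

xs = np.linspace(-6, 8, 400)
ps = np.linspace(-6, 8, 400)
X, P = np.meshgrid(xs, ps)

W = coherent_wigner(X, P, x0=2.0, p0=1.0)  # vacuum bump displaced to (2, 1)

# The bump peaks at the displacement and integrates to 1 over phase space
i, j = np.unravel_index(W.argmax(), W.shape)
print(xs[j], ps[i])        # ~ (2.0, 1.0)
dx = xs[1] - xs[0]
print(W.sum() * dx * dx)   # ~ 1.0
```

Everywhere positive and normalized, this is the one quantum state whose Wigner function behaves exactly like a classical phase-space probability distribution, which is why the displaced bump in the tomography gallery looks so classical.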
 
  • #82
I have looked up two sources for a better understanding of decoherence.
The first was Ballentine's book. In chapter 9.3, "The Interpretation of a State Vector", he argues against the interpretation of the state vector as a description of an individual system.
If I understand correctly, "decoherence as a measurement" hypothesis is covered in his option (iii):
"The reduction (9.9) is caused by the environment, the “environment” being defined as the rest of the universe other than (I) and (II)."
Then he gives a very brief argument that a combination of the two previous arguments defeats this hypothesis as well. The two arguments are that:
- if we include the disturbance from the start, we can't create any difference from the standard equations;
- if we include the disturbance at any later time (when the observation is made), we arrive at unfruitful speculations about the choice of that later time.

An interesting thing to note was that Ballentine likewise believes that macroscopic superposition is in conflict with observation:
"Thus the interpretation of (9.8) as a description of an individual system is in conflict with observation."

The second source is 't Hooft's paper "Quantum Mechanics from Classical Logic", http://iopscience.iop.org/1742-6596/361/1/012024
For a start, 't Hooft seems to express a different viewpoint about the problem of superposition:
"Although quantum mechanics is generally considered to be fundamentally incompatible with classical logic, it is argued here that the gap is not as great as it seems."
Then, as I understand it, he argues that decoherence leads to a hidden variable theory if the environment is included in the initial state:
"If we do not know the initial state with infinite accuracy then we won’t be able to predict the final state any better than that. The probabilistic distribution at t = 0 determines the probabilistic distribution at all later times."
Any comments?
And in particular about 't Hooft's statement that "quantum mechanics is generally considered to be fundamentally incompatible with classical logic". Could it be the real problem behind macroscopic superposition?
 
  • #83
durant35 said:
The only barrier is the fact that Zurek and other authors mention decoherence time so often, as if superpositions occur all the time and get destroyed, which isn't the same as the cat that has been decohered since its birth and is alive. What do they really mean? I'm sure you have insight.

Remember when I mentioned that if a quantum object has a definite position, it will spread. When that happens it will then become entangled with things and decohere again. It's a continuous process that happens very fast.
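The spreading mentioned above can be quantified for a free Gaussian wave packet, whose width grows as ##\sigma(t) = \sigma_0\sqrt{1 + (\hbar t / 2m\sigma_0^2)^2}##. The masses and initial widths in this rough comparison are illustrative choices, but they show why an electron's wave packet spreads dramatically while a dust grain's effectively does not, which is part of why the decohere-and-respread cycle goes unnoticed for macroscopic objects.

```python
import numpy as np

hbar = 1.054571817e-34  # J*s

def packet_width(t, sigma0, m):
    """Width of a free Gaussian wave packet:
    sigma0 * sqrt(1 + (hbar*t / (2*m*sigma0^2))^2)."""
    return sigma0 * np.sqrt(1.0 + (hbar * t / (2.0 * m * sigma0 ** 2)) ** 2)

# An electron localized to 1 nm spreads to centimeters within a microsecond...
electron = packet_width(1e-6, 1e-9, 9.109e-31)

# ...while a 1-microgram grain localized to 1 micron barely spreads in a year
grain = packet_width(3.15e7, 1e-6, 1e-9)

print(electron)      # ~0.06 m
print(grain / 1e-6)  # ~1.0 (still essentially the initial micron width)
```

The key is the ##m\sigma_0^2## in the denominator: for macroscopic masses the spreading timescale is astronomically long, so decoherence re-localizes the object long before any spread could become visible.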

Zurek's approach is different from the standard approach. He takes the QM state as given and develops observations via decoherence. I am not enough of an expert in it to go beyond that.

Thanks
Bill
 
