Mathematically what causes wavefunction collapse?

In summary, quantum mechanics predicts wave function collapse, which is a heuristic rule that arises from the measurement problem. Many people dislike collapse for this reason. There are numerous interpretations of QM that don't need collapse, but all of them are weird in some other way.
  • #71
bhobba said:
zonde said:
If you say that a measurement outcome is described by a probability, you say that the rule applies to an individual event (relative frequencies emerge from a statistical ensemble of independent events). So you contradict what Einstein was saying.

That's simply not true.

It purely depends on your interpretation of probability.
Interpretation does not change predictions, right? But if events are not independent, we can get results that are quite different from predictions made using probabilities.

Do you agree?

As an example, say each event is + or - with equal probability (0.5). In a large enough sample of independent events we would expect runs like ++++++++++ or ---------- to appear, and we can calculate how big the sample must be for such a run to appear with, say, 99.99% probability.
But if the events are not independent, it is possible that runs like ++++++++++ or ---------- never appear (probability 0%), while the relative frequencies of + and - are still 0.5 and 0.5.
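To make the point concrete, here is a minimal sketch (Python, purely illustrative; the function names are mine): a correlated process that forbids runs longer than three, yet by symmetry still gives relative frequencies of 0.5 for each symbol.

```python
import random

def iid_flips(n):
    # Independent fair events: arbitrarily long runs appear with the usual probability.
    return [random.choice("+-") for _ in range(n)]

def run_limited_flips(n, max_run=3):
    # Correlated events: a run longer than max_run can never appear,
    # yet by +/- symmetry the relative frequencies stay 0.5 and 0.5.
    seq = []
    for _ in range(n):
        if len(seq) >= max_run and len(set(seq[-max_run:])) == 1:
            seq.append("-" if seq[-1] == "+" else "+")  # forced switch
        else:
            seq.append(random.choice("+-"))
    return seq

def longest_run(seq):
    best = cur = 1
    for prev, sym in zip(seq, seq[1:]):
        cur = cur + 1 if prev == sym else 1
        best = max(best, cur)
    return best

n = 100_000
for name, seq in (("independent", iid_flips(n)), ("correlated", run_limited_flips(n))):
    print(f"{name}: freq(+) = {seq.count('+') / n:.3f}, longest run = {longest_run(seq)}")
```

Both sequences show freq(+) near 0.5, but the correlated one never produces a run longer than 3, however large the sample.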
 
  • #72
zonde said:
But if events are not independent, we can get results that are quite different from predictions made using probabilities.

Do you agree?

No.

Probability theory deals with correlated events perfectly well.

However, if you naively compute probabilities based upon an incorrect assumption of independence, then your prediction will indeed be incorrect.

In fact, it's commonplace in physics to account for correlations to get the correct confidence interval for measurements.

See http://en.wikipedia.org/wiki/Covariance

It's also worth noting that correlated probabilities in quantum mechanics are not just relevant to random errors in experiments; they're actually fundamental to the theory. If there were a problem with the predictions of quantum mechanics with respect to correlated events, someone would've definitely noticed by now!
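As a concrete sketch (Python with NumPy; the numbers are made up purely for illustration), here is how ignoring the off-diagonal covariance terms gives the wrong uncertainty on an average of correlated measurements:

```python
import numpy as np

# Two measurements of the same quantity with positively correlated errors.
values = np.array([10.2, 9.8])
cov = np.array([[0.04, 0.03],
                [0.03, 0.04]])   # off-diagonal terms encode the correlation

w = np.full(len(values), 1 / len(values))   # simple unweighted average
mean = w @ values
var_naive = w @ np.diag(np.diag(cov)) @ w   # wrongly assumes independence
var_full = w @ cov @ w                      # correct propagation: w^T C w

print(f"mean = {mean:.2f}")
print(f"sigma assuming independence = {np.sqrt(var_naive):.3f}")
print(f"sigma with correlations     = {np.sqrt(var_full):.3f}")
```

The naive sigma comes out around 0.14, the correct one around 0.19: probability theory handles the correlations fine, provided you don't assume them away.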
 
  • #73
zonde said:
Interpretation does not change predictions, right?

Of course it doesn't.

But what it does do is change how you view it.

And indeed there is an assumption made in the Ensemble interpretation, and even in the frequentist interpretation of probability: that each trial is independent.

It's from the law of large numbers:
http://en.wikipedia.org/wiki/Law_of_large_numbers
'the expected value is the theoretical probability of success, and the average of n such variables (assuming they are independent and identically distributed (i.i.d.)) is precisely the relative frequency'

In modern times, as I have mentioned previously, the frequentist interpretation of probability is justified by the Kolmogorov axioms to remove any kind of circularity. As a byproduct it also justifies the Bayesian view, showing they are really different realizations of basically the same thing.
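A minimal illustration of the quoted statement (Python; i.i.d. fair trials, so the independence assumption holds by construction):

```python
import random

# Law of large numbers for i.i.d. Bernoulli(0.5) trials: the relative
# frequency of successes converges to the theoretical probability 0.5.
p = 0.5
for n in (10, 1_000, 100_000, 1_000_000):
    successes = sum(random.random() < p for _ in range(n))
    print(f"n = {n:>9,}: relative frequency = {successes / n:.5f}")
```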

Thanks
Bill
 
  • #74
craigi said:
This would actually be pretty easy to construct a viable deterministic hidden variable theory for.
... Random particle decay. OK then, what sort of hidden variable (short of an inbuilt random number generator!) do you think could achieve that? :devil:
 
  • #75
Random radioactive decay has always troubled me. I can handle the probabilistic nature of the wavefunction, but decay has always bothered me.
 
  • #76
Superposed_Cat said:
Random radioactive decay has always troubled me. I can handle the probabilistic nature of the wavefunction, but decay has always bothered me.
There are much more troublesome issues to be resolved, especially with respect to the foundations, and spontaneous decay isn't one of them.
 
  • #77
Jilang said:
... Random particle decay. OK then, what sort of hidden variable (short of an inbuilt random number generator!) do you think could achieve that? :devil:

I'm not sure what a plausible mechanism for particle decay would be, but there is no conceptual difficulty with assuming that it's deterministic. A sophisticated enough pseudo-random number generator, for example, is indistinguishable from a nondeterministic process.
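For instance, a minimal sketch (Python; names and numbers are illustrative): a simple linear congruential generator is fully deterministic, its seed playing the role of a hidden variable, yet the "decay times" it produces look statistically just like genuinely random exponential decay. The constants are the standard Numerical Recipes choices.

```python
import math

def lcg(seed, a=1664525, c=1013904223, m=2**32):
    # Linear congruential generator: completely deterministic given the seed.
    state = seed
    while True:
        state = (a * state + c) % m
        yield state / m  # uniform in [0, 1)

# Inverse-CDF sampling turns the uniform stream into exponentially
# distributed "decay times", mimicking radioactive-decay statistics.
gen = lcg(seed=42)          # the seed is the "hidden variable"
tau = 1.0                   # mean lifetime (illustrative)
decay_times = [-tau * math.log(1 - next(gen)) for _ in range(5)]
print(decay_times)
```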

What's difficult to accomplish with hidden variables is, as someone already pointed out, entanglement between distant subsystems.
 
  • #78
stevendaryl said:
What's difficult to accomplish with hidden variables is, as someone already pointed out, entanglement between distant subsystems.

Well, it's certainly troubling me and the Cat!
 
  • #79
Jilang said:
... Random particle decay. OK then, what sort of hidden variable (short of an inbuilt random number generator!) do you think could achieve that? :devil:

A pseudo random number generator.
http://en.wikipedia.org/wiki/Pseudorandom_number_generator

To be clear, I'm not arguing for a hidden variable theory, only that the decay of a particle is far from the greatest challenge for such a theory.
 
  • #80
Superposed_Cat said:
Random radioactive decay has always troubled me. I can handle the probabilistic nature of the wavefunction, but decay has always bothered me.
They are not really 'particles' as you seem to imagine; the particle concept is a handy approximation. That's why spontaneous decay should be the last thing that bothers you. If this world were made of classical particles, atoms would have collapsed less than a second after they were formed, some thousands of years after the BB.
 
  • #81
Just thought I'd add here the clearest argument I've seen for "there is no problem in quantum mechanics". It will, of course, satisfy no one, but it is the clearest I've seen:

http://arxiv.org/abs/1308.5290
 
  • #82
I get that there is not really a problem per se with anything; I just have a minor problem with everything being based on probability. It used to be soothing to me last year, but now it bothers me - as does the fact that decay is literally based on randomness (well, exponential decay).
 
  • #83
Superposed_Cat said:
I get that there is not really a problem per se with anything; I just have a minor problem with everything being based on probability. It used to be soothing to me last year, but now it bothers me - as does the fact that decay is literally based on randomness (well, exponential decay).

I think once you get your head around the fact that determinism can emerge from indeterminism and vice versa, it doesn't seem that weird anymore. It happens in gases, weather systems and even economics, to name but a few.

At the moment, I'm not even sure that I see the concepts of determinism and indeterminism as all that distinct anymore. Perhaps all we really have is a continuous scale, with things that seem indeterministic at one end and things that seem deterministic at the other.
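As a sketch of that emergence (Python, purely illustrative): individual random walkers are completely unpredictable, yet the ensemble's mean-squared displacement follows the deterministic diffusion law <x^2> = n for unit steps, much as molecular chaos yields deterministic gas laws.

```python
import random

def mean_squared_displacement(n_steps, n_walkers=20_000):
    # Each walker takes n_steps random +/-1 steps; individually unpredictable,
    # collectively the average of x^2 tracks the deterministic value n_steps.
    total = 0
    for _ in range(n_walkers):
        x = sum(random.choice((-1, 1)) for _ in range(n_steps))
        total += x * x
    return total / n_walkers

for n in (10, 100, 400):
    print(f"n = {n:>3}: <x^2> = {mean_squared_displacement(n):7.1f} (prediction: {n})")
```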
 
  • #84
I understand that, hence me previously being okay with it.
It's just that my friend and I were talking about the weirdness of things like the wavefunction, Euler's formula (we don't like complex numbers), t=0 of the big bang, etc. It just bothers me that there are certain things we can't know as a result of physics.

Before discovering physics I accepted that you couldn't know everything in practice, but I don't like that we can never know certain things regardless.
 
  • #85
Superposed_Cat said:
I understand that, hence me previously being okay with it.
It's just that my friend and I were talking about the weirdness of things like the wavefunction, Euler's formula (we don't like complex numbers), t=0 of the big bang, etc. It just bothers me that there are certain things we can't know as a result of physics.

Before discovering physics I accepted that you couldn't know everything in practice, but I don't like that we can never know certain things regardless.

Sometimes a question seems rational but may, in fact, be meaningless. That is not to say that it's wrong to ask it, only that the question happens to have a logical inconsistency already within it, which may not be immediately apparent.

The simplest example that I can think of to illustrate this is the question:

"what's north of the North Pole?"

Initially you may think that "nothing" is the correct answer, but when you think about it, the question presumes there can exist more north than the maximum amount of north.

Another example might be:

"A man is standing somewhere in a room. What's in his lap?"
[If you're not a native English speaker, then "lap" may not translate too well.]

Again, if you answer "nothing", you're complicit in validating the question. The correct response is "a standing man doesn't have a lap".

In neither of these cases is nature conspiring to prevent us from knowing something. There is nothing to know. It is simply that we're asking a meaningless question. The same is true in physics. Often we are so bound by our experiences of the everyday world that we struggle to accept that the concepts that we use in it are not universally applicable.
 
  • #86
PAllen said:
Just thought I'd add here the clearest argument I've seen for "there is no problem in quantum mechanics". It will, of course, satisfy no one, but it is the clearest I've seen:

http://arxiv.org/abs/1308.5290

Hmmm.

Interesting paper.

Have to say I agree with the following:
'Fifth, since neither decoherence nor any other mechanism select one particular outcome the whole “measurement problem” reduces to the question Why is there one specific outcome? which is asking Why are there randomly realized events? in the particular context considered. This harkens back to Sec. 1, where we noted that quantum theory cannot give an answer. In summary, then, the alleged “measurement problem” does not exist as a problem of quantum theory. Those who want to pursue the question Why are there events? must seek the answer elsewhere.'

Schlosshauer correctly identifies that as the key issue. Decoherence seems likely to answer all the other issues with the measurement problem - but that one it leaves untouched.

Is that a problem? Personally I don't know - I don't find it a worry - but I know others do.

What I do know is that we have interpretations like DBB, where it is not an issue at all, and MWI, where it has been replaced by something else. For me this suggests we have future surprises in store.

The following might be the beginnings of those surprises:
https://www.simonsfoundation.org/quanta/20130917-a-jewel-at-the-heart-of-quantum-physics/

Only time will tell.

Thanks
Bill
 
  • #88
PAllen said:
I have been interested in that from popular presentations like the one you linked. Unfortunately (for me) there is a bunch I need to learn to try to understand this work in a meaningful way.

Indeed.

But if what it reports is true - that they are replacing unitary evolution with something else - it could have big consequences for the measurement problem. Of course, only time will tell.

Thanks
Bill
 
  • #89
bhobba said:
Of course it doesn't.

But what it does do is change how you view it.
But it does not change the assumption that each trial is independent, right?

bhobba said:
And indeed there is an assumption made in the Ensemble interpretation, and even in the frequentist interpretation of probability: that each trial is independent.
That contradicts the Einstein quote about the ensemble interpretation and QM not being applicable to individual systems (trials).
 
  • #90
zonde said:
But it does not change the assumption that each trial is independent, right?

It's the assumption of the law of large numbers.

zonde said:
That contradicts the Einstein quote about the ensemble interpretation and QM not being applicable to individual systems (trials).

I have zero idea why you say that. It's simply not true.

The logic is dead simple. By the law of large numbers we can find an ensemble associated with an observation where the proportion of each outcome is its probability. This follows simply from assuming the outcome can be described probabilistically. The state is not even introduced at this point. The Ensemble Interpretation associates the state not with individual systems but with the ensemble. It's that easy. If you still don't get it, I will have to leave it to someone else, because I simply can't explain it any better.

Thanks
Bill
 
  • #91
bhobba said:
I have zero idea why you say that. It's simply not true.

The logic is dead simple. By the law of large numbers we can find an ensemble associated with an observation where the proportion of each outcome is its probability. This follows simply from assuming the outcome can be described probabilistically. The state is not even introduced at this point. The Ensemble Interpretation associates the state not with individual systems but with the ensemble. It's that easy. If you still don't get it, I will have to leave it to someone else, because I simply can't explain it any better.
I understand that part perfectly well. The part I don't understand is what in that (Ballentine's) interpretation changes if you associate the state with an individual system. As I see it, nothing changes if you say it's applicable to individual systems.
 
  • #92
zonde said:
I understand that part perfectly well. The part I don't understand is what in that (Ballentine's) interpretation changes if you associate the state with an individual system. As I see it, nothing changes if you say it's applicable to individual systems.

Got it now.

You face the discontinuous collapse issue if you think the state applies to an individual system and is in some sense real - that's the key point both Einstein and Ballentine didn't make clear in their objections. If it's simply a level of confidence, like the Bayesian view of probability, it doesn't matter one whit.

Thanks
Bill
 
  • #93
bhobba said:
Got it now.

You face the discontinuous collapse issue if you think the state applies to an individual system and is in some sense real - that's the key point both Einstein and Ballentine didn't make clear in their objections. If it's simply a level of confidence, like the Bayesian view of probability, it doesn't matter one whit.

Thanks
Bill

I don't understand how the ensemble approach avoids the discontinuous collapse issue. I'm not trying to be argumentative, but I just don't see it.
 
  • #94
stevendaryl said:
I don't understand how the ensemble approach avoids the discontinuous collapse issue. I'm not trying to be argumentative, but I just don't see it.

It's dead simple.

The interpretation assumes an observation selects an element from the conceptual ensemble. This is the sole purpose of the state in that interpretation. Nothing physical changed - the state simply refers to a conceptualization that, together with the observable, determines the proportion of each outcome in the conceptual ensemble.

To spell it out in excruciating detail: given an observable and a state, you can calculate the probabilities of the possible outcomes of the observation. This determines an ensemble of outcomes where the proportion of each outcome is the probability of that outcome. The interpretation assumes the observation simply picks a random element of the ensemble, and that's the result. Since it all refers to just a conceptualization, nothing physical changed.

To be even clearer, apply it to tossing a coin. Its state is the vector (1/2, 1/2). Toss the coin and it picks a random entry from the ensemble that is half heads and half tails. The new state is now (0, 1) or (1, 0), depending on whether a head or a tail came up. The state discontinuously changed - but so what? It's just a conceptualization, an aid to figuring out the likelihood of an observation outcome.
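A sketch of that bookkeeping in code (Python; the dictionary "state" is just the probability vector, and the names are mine):

```python
import random

# Ensemble-style "collapse" for a coin toss: the state is a probability
# vector over outcomes. Observation picks a random element of the
# conceptual ensemble; updating the state to (1, 0) or (0, 1) afterwards
# is pure bookkeeping, not a physical change.
state = {"heads": 0.5, "tails": 0.5}

def observe(state):
    outcomes = list(state)
    result = random.choices(outcomes, weights=list(state.values()))[0]
    return result, {o: float(o == result) for o in outcomes}

result, state = observe(state)
print(result, state)   # e.g. heads {'heads': 1.0, 'tails': 0.0}
```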

Thanks
Bill
 
  • #95
bhobba said:
It's dead simple.

The interpretation assumes an observation selects an element from the conceptual ensemble.

That makes perfect sense for classical ensembles. You have a collection of systems that agree on the macroscopic variables (say, number of particles, or total energy, or something), but the details of how the particles are moving differ from system to system. When you measure some quantity that varies from one system to another, nothing changes; you're just discovering which system (or sub-ensemble) is the "real" world.

You could try the same tactic with quantum nondeterminism: The quantity that you are measuring--angular momentum, for example--doesn't have a definite value before the measurement, simply because all you know is that the real world is one system out of an ensemble, and different members of the ensemble have different values for that observable. After the measurement, you haven't done anything other than identify which system (or sub-ensemble) is the real world.

But to assume that the system had a definite value for angular momentum before you measured it is a hidden-variables assumption, isn't it? Why don't Bell-type inequalities rule that out?
 
  • #96
stevendaryl said:
But to assume that the system had a definite value for angular momentum before you measured it is a hidden-variables assumption, isn't it? Why don't Bell-type inequalities rule that out?

One could assume that a quantum system has definite values for all variables at all times, and the only reason for nondeterminism is classical ignorance. One way to frame the results of the various mathematical no-go theorems (Bell's theorem, the Kochen-Specker theorem, etc.) is that if observables have definite values, then our ignorance about those values cannot be described using measurable sets.
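To make the no-go concrete, a short sketch (Python; angle choices are the standard ones): the singlet-state correlation E(a, b) = -cos(a - b) pushes the CHSH combination to 2*sqrt(2), above the bound of 2 that any assignment of pre-existing definite values (local hidden variables) must satisfy.

```python
import math

def E(a, b):
    # Quantum correlation of spin measurements along angles a and b
    # for the singlet state.
    return -math.cos(a - b)

# Standard CHSH angle choices (in radians).
a, a2 = 0.0, math.pi / 2
b, b2 = math.pi / 4, 3 * math.pi / 4

S = E(a, b) - E(a, b2) + E(a2, b) + E(a2, b2)
print(f"|S| = {abs(S):.3f}  (local hidden-variable bound: 2)")
```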
 
  • #97
vanhees71 said:
The question, why Born's rule holds true and why the description of nature on a fundamental level is indeterministic is not asked in the realm of physics. You may wonder about it and try to find a simpler or more intuitive set of postulates defining quantum theory (e.g., Weinberg discusses at length, whether Born's postulate can be derived from the other postulates, i.e., the usual kinematical and dynamical postulates in terms of the Hilbert-space formulation with observable operators and state operators, coming to the conclusion that it cannot be derived), but as long as there is no empirical evidence against quantum theory, you better keep this theory.
This got me thinking... If such a question is not asked in the realm of physics, in what realm should it be asked? I would not have thought that the philosophers would have the maths, and the mathematicians probably don't have the inclination...

I didn't realize there was any mystery about Born's postulate. Isn't it just the joint probability of something coming one way meeting something coming the other way in imaginary time?
 
  • #98
bhobba said:
Got it now.

You face the discontinuous collapse issue if you think the state applies to an individual system and is in some sense real - that's the key point both Einstein and Ballentine didn't make clear in their objections.

I would say that this quote clarifies Einstein's point:
"For if the statistical quantum theory does not pretend to describe the individual system (and its development in time) completely, it appears unavoidable to look elsewhere for a complete description of the individual system; in doing so it would be clear from the very beginning that the elements of such a description are not contained within the conceptual scheme of the statistical quantum theory." - http://www.marxists.org/reference/archive/einstein/works/1940s/reply.htm

I would say that basically the point is that the details (or interpretation) of collapse are outside the scope of QM, and in the statistical interpretation we speak only about relative frequencies without going into details.

Well, apart from that, it looks very much like a non-contextual (or intrinsic-to-the-particle) LHV approach, as he speaks about a complete description of the individual system as a "complete" version of quantum theory.
 
  • #99
stevendaryl said:
But to assume that the system had a definite value for angular momentum before you measured it is a hidden-variables assumption, isn't it? Why don't Bell-type inequalities rule that out?

This is the Achilles' heel of the ensemble interpretation - it's an ensemble of system and observational apparatus combined. Nothing is assumed about the value of any observable prior to observation.

Ballentine, in his 1970 paper on it, more or less stated he was assuming some kind of hidden variables, so it was an ensemble of outcomes - but his book moved away from that.

This is the reason I hold to the ignorance ensemble interpretation with decoherence - you don't need this unnatural assumption.

Thanks
Bill
 
  • #100
stevendaryl said:
One could assume that a quantum system has definite values for all variables at all times,

You run into problems with Kochen-Specker. The only way to do it is with (contextual) hidden variables.

You can also assume it after decoherence - which is the essence of the ignorance ensemble interpretation with decoherence.

Thanks
Bill
 
  • #101
Jilang said:
This got me thinking... If such a question is not asked in the realm of physics, in what realm should it be asked? I would not have thought that the philosophers would have the maths, and the mathematicians probably don't have the inclination...

It can be asked in physics - the problem is exactly how meaningful it is without some experiment to decide it. Vanhees obviously thinks it's not particularly meaningful because of that - but opinions vary. Personally I agree with him - but opinions are like bums: everyone has one, and that doesn't make it correct.

There are philosophers around, like David Wallace, with the necessary background to address such issues - he has PhDs in both physics and philosophy - and they do. For example, see his book The Emergent Multiverse, of which I have a copy:
http://www.amazon.com/dp/0199546967/?tag=pfamazon01-20

Of course that is the exception rather than the rule - to be blunt, many philosophers' comments about QM leave a lot to be desired.

Thanks
Bill
 
  • #102
Jilang said:
I didn't realize there was any mystery about Born's postulate. Isn't it just the joint probability of something coming one way meeting something coming the other way in imaginary time?

I don't know what you mean by this.

There is no controversy about it per se - it's part of the formalism and just about all physicists/mathematicians accept it.

The issue is just how much it depends on the other assumptions. We have Gleason's theorem and its variants, which actually derive it. If there were no other assumptions involved, hidden variable theories would be kaput. But careful analysis shows there is an assumption - non-contextuality, i.e. the probability doesn't depend on the basis. That's an almost trivial requirement mathematically in a theory built on vector spaces - but physically it's not quite so clear.
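For reference, a paraphrase of the statement in LaTeX (standard notation; the theorem requires dimension at least 3):

```latex
% Gleason's theorem (paraphrase): on a Hilbert space H with dim H >= 3,
% every non-contextual probability measure mu on the projectors is
% generated by some density operator rho:
\[
  \mu(P) = \operatorname{Tr}(\rho P) \quad \text{for every projector } P .
\]
% The Born rule is the pure-state special case, rho = |psi><psi|:
\[
  \Pr(i) = \operatorname{Tr}\bigl(|\psi\rangle\langle\psi|\,P_i\bigr)
         = \langle\psi|P_i|\psi\rangle .
\]
```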

Thanks
Bill
 
  • #103
Maui said:
I guess it's meant to be that way with all interpretations - you must decide which confusion is less confusing for the worldview you hold.

Even that might be too strong of a commitment to an interpretation. I find myself sometimes choosing an interpretation that "works" for the problem at hand, and dropping it just as quickly when another problem comes along.
 
  • #104
Nugatory said:
Even that might be too strong of a commitment to an interpretation. I find myself sometimes choosing an interpretation that "works" for the problem at hand, and dropping it just as quickly when another problem comes along.


I deleted the original comment as I intended to write a more detailed post (so as not to be misunderstood), but I have to attend to other things in the meantime and will get back to it.
 
  • #105
Mathematically, what causes the collapse is the application of a boundary condition in time. Prior to that, you have an equation with lots of solutions.
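One way to read that, as a sketch: the Schrödinger equation admits a whole family of solutions, and only a condition imposed at some time t_0 (for instance, the observed outcome) singles one out, just as an initial condition does for any differential equation.

```latex
% General solution of the time-dependent Schroedinger equation in an
% energy eigenbasis: a family parametrized by the coefficients c_n.
\[
  i\hbar\,\partial_t\,\psi = \hat H\,\psi
  \quad\Longrightarrow\quad
  \psi(t) = \sum_n c_n\, e^{-iE_n t/\hbar}\,\phi_n .
\]
% The c_n are fixed only once a boundary condition in time is imposed,
% e.g. psi(t_0) = phi_k for the outcome observed at t_0.
```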
 
