Is the wave function real or abstract statistics?

In summary: They use logic and mathematics to show that the wave function is in one-to-one correspondence with its "elements of reality."
  • #106
kye said:
Does this violate special relativity, or does it not, in the same way quantum entanglement can't be used to send information faster than light? So the wave function, if it is really there, can't be used to send information FTL?

It doesn't violate special relativity. No classical information can be sent faster than the speed of light. The link I gave in #104 helps to show that. You can also read http://arxiv.org/abs/quant-ph/0212023

kye said:
One problem with an actual wave function in Heisenberg's formalism rather than Bohm's is the double slit: how does the particle transform into a wave at emission, before reaching the slits, and how does it know whether to change back (as if predicting the slits ahead of it)? But if we drop the concepts of particles and waves, perhaps such "wavicles" really do have this ability. So what line of argument do you have that shows this possibility to be untenable?

There's only the wave function, and it makes predictions consistent with all observations so far.
 
  • #107
kye said:
What experiments have been done to determine the speed of collapse in interpretations with collapse? For example, in the double slit, when a detector at one slit detects the particle, the entire wave function collapses. Does the collapse travel at the speed of light, or is it instantaneous between the detectors at both slits? What experiments akin to this have been done to determine whether it's instantaneous or travels at the speed of light? Or can no experiment determine it, and why?

Valid question. The closest experiments, in my opinion, are the ones putting bounds on 'spooky action at a distance'. Basically, these groups assumed spooky action at a distance exists and then checked what its minimal velocity must be by doing long-distance Bell tests. Of course, the people doing those tests are not proponents of spooky action at a distance.

See Nature 454, 861-864 (2008)
http://www.nature.com/nature/journal/v454/n7206/full/nature07121.html

Phys. Rev. Lett. 110, 260407 (2013)
http://prl.aps.org/abstract/PRL/v110/i26/e260407
 
  • #108
matrixrising said:
Cthugha quotes:

"In such a case, a measurement performed on a single system does not yield the value of the shift (the element of reality), but such measurements performed on large enough ensemble of identical systems yield the shift with any desirable precision." (Foundations of Physics 26, 895 (1996)).

If we send a single photon through the double slit, its final location on the screen can tell us whether which-way detection was done or not.

If the photon ends up on one of the (theoretically calculated, via Schrödinger's equation?) fringes, then no which-way detection was done.

If the photon ends up right in front of one of the slits, then which-way detection was performed.

Thus a single "run" of a single photon can provide information.

I'm trying to understand the above quote.
 
  • #109
Atyy, you said:

"Indeed, it is impossible to determine a completely unknown wavefunction of single system [20]."

What experiment is this? Nobody tried to determine a completely unknown wave function of a single system. Here's the first step of the Lundeen experiment.

How the experiment works: Apparatus for measuring the wavefunction

1. Produce a collection of photons possessing identical spatial wavefunctions by passing photons through an optical fiber.

So the wave function of a single system wasn't unknown. It was known before the weak and strong measurements.

http://www.photonicquantum.info/Research.html

It's like Lebron and an 82-game basketball season. You look at the average PPG for the season, and then you can go back and see whether the average is proportional to single games. Lundeen directly measured the wave function of a single particle. Like I said, Bohm2 gave a good example.

To understand what a weak measurement is, the following analogy from everyday life is useful. Assume you want to measure the weight of a sheet of paper, but your measurement apparatus (a weighing scale) is not precise enough to measure the weight of such a light object. In this sense, the measurement of a single sheet of paper is weak.

Now you do a trick. Instead of weighing one sheet of paper, you weigh a thousand of them, which is heavy enough to register on the scale. Then you divide the result by 1000 and get a number you call the weak value. Clearly, this "weak value" is nothing but the average weight of your set of a thousand sheets of paper.

But you still want to know the weight of a SINGLE sheet of paper. So does that average value help? Well, it depends:

1) If all the sheets have the same weight, then the average weight equals the weight of a single sheet, in which case you have also measured the true weight of a sheet.

2) If the sheets have only approximately equal weights, then you have at least approximately measured the weight of a single sheet.

3) But if the weights of the different sheets are not even approximately equal, then you have learned nothing: you still don't have a clue what the weight of a single sheet is.
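
A minimal numerical sketch of this weighing analogy, with all weights and noise levels invented purely for illustration:

[code]
import numpy as np

# Toy model of the weighing analogy above (all numbers are made up):
# each sheet weighs about 5 g, the scale has a readout noise of +/- 50 g,
# so a single sheet cannot be weighed directly ("weak" measurement).
rng = np.random.default_rng(seed=1)
n_sheets = 1000
sheet_weights = rng.normal(5.0, 0.01, n_sheets)  # cases 1/2: nearly identical sheets
scale_noise = rng.normal(0.0, 50.0)              # one noisy reading of the whole stack

stack_reading = sheet_weights.sum() + scale_noise
weak_value = stack_reading / n_sheets            # "weak value" = average weight
print(f"weak value: {weak_value:.3f} g (true mean {sheet_weights.mean():.3f} g)")

# Case 3: wildly different sheets -- the average no longer tells you
# anything about any single sheet.
mixed = rng.uniform(0.1, 50.0, n_sheets)
print(f"mixed stack average: {mixed.mean():.2f} g, "
      f"but single sheets range from {mixed.min():.2f} to {mixed.max():.2f} g")
[/code]

With nearly identical sheets the stack average pins down the single-sheet weight far better than the noisy scale ever could; with wildly different sheets the same average says nothing about any individual sheet.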

How can you say the individual results are meaningless to the average when the average is proportional to the wave function of a single particle? It's like saying Lebron's individual games are meaningless to his season average PPG. This is why ensemble interpretations got 3% from prominent physicists. It's a joke.

Right interpretation of state vectors:

27%: epistemic/informational
24%: ontic
33%: a mix of epistemic and ontic
3%: purely statistical as in ensemble interpretation
12%: other

I chose not to label the "ensemble interpretation" as correct because the ensemble interpretation makes the claim that only the statistics of huge repetitions of the very same experiment may be predicted by quantum mechanics. This is a very "restricted" or "modest" claim about the powers of quantum mechanics, and this modesty is actually wrong. Even if I make 1 million completely different experiments, quantum physics may predict things with great accuracy.

Imagine that you have 1 million different unstable nuclei (OK, I know there are not that many isotopes: think about molecules if that's a problem for you), each with a lifetime of 10 seconds. You observe them for 1 second. Quantum mechanics predicts that 905,000 ± 1,000 or so nuclei will remain undecayed (it's not exactly 900,000 because the decrease is exponential, not linear). The relatively small error margin is possible despite the fact that no two of the nuclei were of the same species!

So it's just wrong to say that you need to repeat exactly the same experiment many times. If you want to construct a "nearly certain" proposition – e.g., that the number of undecayed nuclei in the experiment above is between 900,000 and 910,000 – you may combine the probabilistically known propositions in many creative ways. That's why one shouldn't reduce probabilistic knowledge to some particular non-probabilistic kind. You might think that's a "safe thing to do", but you implicitly claim that quantum mechanics can't achieve certain things, even though it can.
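
A quick Monte Carlo check of that decay estimate (a sketch; I take the 10 s "lifetime" to be the exponential time constant, and the species differences are irrelevant once every nucleus has the same survival probability):

[code]
import numpy as np

# Monte Carlo check of the decay estimate above.
rng = np.random.default_rng(seed=2)
n_nuclei = 1_000_000
tau = 10.0          # seconds, exponential time constant
t_obs = 1.0         # seconds

p_survive = np.exp(-t_obs / tau)          # ~0.9048, not 0.9: decay is exponential
survivors = rng.binomial(n_nuclei, p_survive, size=1000)  # 1000 repeated runs

print(f"expected survivors: {n_nuclei * p_survive:.0f}")
print(f"simulated: {survivors.mean():.0f} +/- {survivors.std():.0f} (1 sigma)")
# Prints roughly 904837 +/- 293, comfortably inside the quoted +/- 1000.
[/code]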

http://www.technologyreview.com/view/509691/poll-reveals-quantum-physicists-disagreement-about-the-nature-of-reality/

These interpretations are what I call stick-your-head-in-the-sand-and-deny-everything interpretations. They make zero sense in light of experiment after experiment.
 
  • #110
matrixrising said:
So the wave function of a single system wasn't unknown. It was known before the weak and strong measurements.

If it was known before the measurement, why did they have to measure it?

Also, the authors themselves say "In contrast, we introduce a method to measure ψ of an ensemble directly." http://arxiv.org/abs/1112.3575
 
  • #111
Great point, San K.

Also, the quote predates Lundeen (2011). Here's the quote:

"In such a case, a measurement performed on a single system does not yield the value of the shift (the element of reality), but such measurements performed on large enough ensemble of identical systems yield the shift with any desirable precision." (Foundations of Physics 26, 895 (1996)).

Again, it's like Lebron's PPG average per season. The average is proportional to single games. In this case, Lundeen used a stream of single photons WITH THE SAME WAVE FUNCTION. So the average was proportional to the value of the shift in a single system.

The quote doesn't say a single system doesn't yield any shift. This goes back to what Lundeen said:

"Indeed, it is impossible to determine a completely unknown wavefunction of single system [20]."

So the average will not be proportional to single systems with unknown wave functions, because of uncertainty. In Lundeen's case, the wave function of a single system is known before the weak measurement.

So it's like Lebron and his PPG average. If the individual games were unknown, you couldn't go back and see whether they are proportional to his season average. The reason we're having this debate is that most experimental results have to be labeled meaningless, because the small percentage who espouse an ensemble interpretation simply deny everything.
 
  • #112
matrixrising said:
The reason we're having this debate is that most experimental results have to be labeled meaningless, because the small percentage who espouse an ensemble interpretation simply deny everything.

I believe the point most people are making in response to your remarks is that the experiments do not rule out an ensemble interpretation. No one is saying that you cannot use an interpretation in which the wave function describes single systems. People are saying the data are consistent with either.
 
  • #113
San K said:
I'm trying to understand the above quote.

We were explicitly discussing weak measurements here.

matrixrising said:
Again, it's like Lebron's PPG average per season. The average is proportional to single games. In this case, Lundeen used a stream of single photons WITH THE SAME WAVE FUNCTION. So the average was proportional to the value of the shift in a single system.

What is so difficult to understand about variance?

A single measurement will give values drawn from some distribution. Consider the simplifying case of a Gaussian. If the distribution is centered on a mean of 10 with a standard deviation of 0.1, a single measurement gives you good, valid information about the mean. If it is centered on 10 with a standard deviation of 150,000, that is not the case, and you may even get values that are not sensible or allowed.
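
A one-line illustration of that point, using the numbers above:

[code]
import numpy as np

# One draw from each distribution: a single measurement is informative
# only when the spread is small (means and sigmas taken from the text).
rng = np.random.default_rng(seed=4)

narrow = rng.normal(10, 0.1)      # single measurement, sigma = 0.1
wide = rng.normal(10, 150_000)    # single measurement, sigma = 150000

print(f"narrow distribution, one draw: {narrow:.2f}  (close to the mean 10)")
print(f"wide distribution, one draw:   {wide:.2f}  (tells you nothing about 10)")
[/code]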

The question is whether you necessarily get large variances in weak measurements, and the answer is trivially yes. Weak values are defined as:
[tex]a_w=\frac{\langle f|A|i\rangle}{\langle f|i\rangle}[/tex]
where f and i are the initial and final (or preselected and postselected) states.
The strength of the measurement is given by the overlap of f and i, so to get a weak, non-perturbative measurement you need them to be almost orthogonal. So you are dividing by almost zero, which leads to the large variances and noisy distributions.

It also intrinsically(!) allows non-physical single results. Well-known examples are a weak spin value of 100 for a spin-1/2 particle, and a weakly measured phase shift, caused by a single photon, that lies outside the range of phase shifts possible for single photons (Phys. Rev. Lett. 107, 133603 (2011)). This is not a question of which experiment to perform; it is intrinsic to any weak measurement.
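
A minimal numerical sketch of the definition above, with pre- and postselected spin-1/2 states invented purely to make the overlap small:

[code]
import numpy as np

# Sketch: the weak value a_w = <f|A|i>/<f|i> for a spin-1/2 particle and
# A = sigma_z, with nearly orthogonal pre- and postselected states.
# The states below are invented solely to make the overlap tiny.
sigma_z = np.array([[1, 0], [0, -1]], dtype=complex)

def normalize(v):
    return v / np.linalg.norm(v)

eps = 0.01
i_state = normalize(np.array([1.0, 1.0], dtype=complex))         # preselected
f_state = normalize(np.array([1.0, -1.0 + eps], dtype=complex))  # postselected

overlap = np.vdot(f_state, i_state)                  # <f|i>, almost zero
a_w = np.vdot(f_state, sigma_z @ i_state) / overlap  # weak value
print(f"<f|i> = {overlap.real:.4f}, weak value of sigma_z = {a_w.real:.1f}")
# Prints a weak value of roughly 200 -- far outside the eigenvalue
# range [-1, +1], the same kind of anomaly as the 'spin 100' example.
[/code]

The division by the tiny overlap is exactly what pushes the weak value far outside the eigenvalue range.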

matrixrising said:
So the average will not be proportional to single systems with unknown wave functions, because of uncertainty. In Lundeen's case, the wave function of a single system is known before the weak measurement.

The wave function was known beforehand? The wave function is the result of his experiment. He measured it. It was known what to expect, though, if that is what you mean: a standard mode prepared by a fiber. I also do not get why you think Lundeen's statement refers to uncertainty.

matrixrising said:
So it's like Lebron and his PPG average. If the individual games were unknown, you couldn't go back and see whether they are proportional to his season average.

I already told you that this is not the case and the comparison is invalid. Lebron's case is the one with small variance. Say he scores 25 points per game and the results vary between 0 and maybe 45. In that case each single result can be associated with an element of reality.

A weak measurement of Lebron's PPG works rather like this: he has a PPG of 25, and we have a weak measurement sensitive to the deviation from the mean (the shift of the pointer in the experiment). Imagine he scored only 20 points in a single game, so the shift is -5 for that game. We now measure an amplified version of this value. The amplification is given by dividing by the overlap between the initial and final states, which is usually very small, and those states are only prepared to the optimal precision allowed by uncertainty, so the actual amplification also varies randomly from measurement to measurement. In this case it may be 100, so the value actually measured will be 25 - (5 × 100) = -475.

So the weakly measured value of the points he scored in that game is -475. I have a hard time considering that a reasonable value for the actual points scored in a game, so it cannot be considered an element of reality. The value is not even allowed.

Why does the procedure still work out in the end? You have two combined stochastic processes: the fluctuation of the value around the mean, and the fluctuation of the amplification. You need to average over both distributions to get the mean value. If we had just the fluctuations around the mean, I would agree with you: the points scored in a single game would be meaningful. However, that is the result of a strong measurement, and it would necessarily be invasive. This is where weak measurements come in. By adding the second stochastic distribution, the amplification, we keep the property that the mean still converges to the value we want, but we lose the ability to give meaning to a single result, as we would need to know the actual amplification ratio in each run to make that identification.

And yes, this is very simplified. Please do not take this version too literally.
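
A numerical version of this simplified model makes the point concrete (all numbers are invented, and the Gaussian shapes are my own choice): single amplified readings fall wildly outside any allowed range, yet the ensemble average still converges.

[code]
import numpy as np

# Toy model of the two combined stochastic processes described above:
# measured value = mean + (per-game deviation) * (random amplification).
rng = np.random.default_rng(seed=3)
n_runs = 1_000_000
mean_ppg = 25.0

deviation = rng.normal(0.0, 5.0, n_runs)          # fluctuation around the mean
amplification = rng.normal(100.0, 30.0, n_runs)   # random gain, analogous to 1/<f|i>

measured = mean_ppg + deviation * amplification

print(f"single results, e.g.: {measured[:3].round(0)}")  # wildly out of range
print(f"ensemble average:     {measured.mean():.2f}")    # converges near 25
# Single readings like -475 are not allowed scores, yet the average still
# recovers the mean, because E[deviation] = 0 whatever the amplification does.
[/code]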
 
  • #114
bhobba said:
Notice I said APPARENT collapse.

As von Neumann first proved, actual collapse is totally nebulous, since the cut can be placed anywhere. This means that, without the constraint of apparent collapse, it's an unanswerable question.

Thanks
Bill

I'm familiar with the von Neumann cut that can be put anywhere, but what is your definition of "apparent collapse"? Why "apparent", when decoherence just produces a mixed state? Isn't collapse when the Born rule is invoked, meaning that every time the Born rule is invoked, classical instruments get a measurement of the classical value (or preferred basis)? So what is the meaning of "apparent collapse"? In your view, are decoherence and the Born rule automatically bound to each other?

You can't do it because you don't know where or when it occurred - this is the von Neumann regress that led to the utterly weird idea of consciousness causing collapse.

So in the current Copenhagen interpretation, von Neumann's idea has been phased out already. How about Heisenberg's potentia: has it also been phased out? In what years did that happen for each of them?
 
  • #115
cthugha,

Sorry, but this misrepresents Lundeen, because of your desire to render the direct measurement of a single particle meaningless. This is why ensemble interpretations barely register among physicists. You said:

You need to average over both distributions to get the mean value. If we had just the fluctuations around the mean, I would agree with you: the points scored in a single game would be meaningful.

First, the Lebron analogy still applies to classical physics. So you can't say this doesn't make sense because it would be like Lebron scoring -475 in a game. Apples and oranges again. This is like asking why we don't see the superposition of a rock: it's because the rock is classical. So you can't put a weak value on a quantum system in one-to-one correspondence with a classical score in a national basketball game. The analogy was that the single photon's wave function is proportional to the average the same way PPG is proportional to the season average.

You then said:

If we had just the fluctuations around the mean, I would agree with you: the points scored in a single game would be meaningful. However, that is the result of a strong measurement, and it would necessarily be invasive. This is where weak measurements come in.

Wrong.

First, of course these are fluctuations around the mean of the real and imaginary parts of the wave function. Here's Lundeen:

The average result of the weak measurement of π_x is proportional to the wavefunction of the particle at x. Scanning the weak measurement through x gives the complete wavefunction. At each x, the observed position and momentum shifts of the measurement pointer are proportional to ReΨ(x) and ImΨ(x), respectively. In short, by reducing the disturbance induced by measuring X and then measuring P normally, we measure the wavefunction of the single particle.

This is the ball game. SCANNING THE WEAK MEASUREMENT THROUGH x GIVES YOU THE COMPLETE WAVE FUNCTION. AT EACH x (SINGLE PHOTON), THE OBSERVED SHIFTS OF THE MEASUREMENT POINTER ARE PROPORTIONAL TO THE REAL AND IMAGINARY PARTS OF THE WAVE FUNCTION!
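
For reference, here is the one-line identity behind that sentence; this is a sketch of the standard reading of Lundeen's scheme (the weakly measured observable is the position projector, with postselection on the zero-momentum state), not a quote from the paper:

[tex]\langle\pi_x\rangle_w = \frac{\langle p{=}0|x\rangle\langle x|\psi\rangle}{\langle p{=}0|\psi\rangle} = k\,\Psi(x), \qquad \pi_x = |x\rangle\langle x|[/tex]

where k is a constant independent of x. The pointer's position and momentum shifts then read out the real and imaginary parts of kΨ(x) as x is scanned.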

The weak value could be 100, but this is easily explained by the Aharonov-Albert-Vaidman (AAV) effect. In a classical analogy, let's go back to Lebron's season PPG average. Say Lebron averages 25 PPG for the season, and you go back and see he had a game against the Spurs where he scored 5 points. Those 5 points would not take away his 25 PPG average. We call this an outlier. Here's more about the AAV effect:

The real part of the weak value is the outcome of the standard measurement procedure at the limit of weak coupling. Unusually large outcomes, such as spin 100 for a spin-1/2 particle [2], appear from a peculiar interference effect (sometimes called the Aharonov-Albert-Vaidman (AAV) effect) according to which the superposition of the pointer wave functions shifted by small amounts yields a similar wave function shifted by a large amount. The coefficients of the superposition are universal for a large class of functions whose Fourier transforms are well localized around zero.

In the usual cases, the shift is much smaller than the spread Δ of the initial state of the measurement pointer. But for some variables, e.g., averages of variables of a large ensemble, in the very rare events in which all members of the ensemble happen to be in the appropriate post-selected states, the shift is of the order of, and might be even larger than, the spread of the quantum state of the pointer [5]. In such cases the weak value is obtained in a single measurement which is not really "weak".

One can get an intuitive understanding of the AAV effect by noting that the coupling of the weak measurement procedure does not change significantly the forward and backward evolving quantum states. Thus, during the interaction, the measuring device "feels" both forward and backward evolving quantum states. The tolerance of the weak measurement procedure to the distortion due to the measurement depends on the value of the scalar product.

Since the quantum states remain effectively unchanged during the measurement, several weak measurements can be performed one after another and even simultaneously.
("Weak-measurement elements of reality")

Emphasis Mine

So this effect is an outlier and not really "weak", because all members of the selected ensemble happen to be in the appropriate post-selected state.
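
A small numerical sketch of the AAV interference effect described in that quote (Gaussian pointer; all parameters are invented for illustration):

[code]
import numpy as np

# AAV effect: a superposition of pointer Gaussians, each shifted by only
# +/- delta, reproduces a single Gaussian shifted by a_w * delta.
x = np.linspace(-1.0, 1.0, 200_001)
delta, width = 0.001, 1.0                 # elementary shift << pointer spread

def pointer(shift):
    return np.exp(-(x - shift) ** 2 / (2 * width ** 2))

# c+ = <f|+><+|i>, c- = <f|-><-|i>; near-cancellation makes a_w large.
c_plus, c_minus = 1.0, -0.9
a_w = (c_plus - c_minus) / (c_plus + c_minus)     # weak value, = 19 here

superposed = c_plus * pointer(delta) + c_minus * pointer(-delta)
peak = x[np.argmax(np.abs(superposed))]
print(f"a_w = {a_w:.0f}, predicted pointer shift a_w*delta = {a_w * delta:.4f}")
print(f"actual peak of the superposed pointer:              {peak:.4f}")
# Both print ~0.0190: the pointer moves 19x further than either component.
[/code]

Each component is shifted by only ±0.001, yet the superposed pointer peaks 19 times further out; this is the amplification mechanism the quote describes.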
 
  • #116
matrixrising said:
Sorry, but this misrepresents Lundeen, because of your desire to render the direct measurement of a single particle meaningless.

No, I understand well what he did. You are the one claiming the non-standard position here. You still have not shown that Lundeen supports your point in any manner. Also, measurements on single particles are not meaningless in the ensemble interpretation: an ensemble of measurements on a single particle can be pretty meaningful. For example, one could do a lot of cool things with single particles in Haroche's QND experiments.

matrixrising said:
This is why ensemble interpretations barely register among physicists.

Well, I do not follow the ensemble interpretation either and am not among those 3%. However, those 3% are by far not the only people you are arguing against. You are arguing at least against the information-theory-based approaches as well. Against Copenhagen, too. But disliked interpretations are not the same as impossible ones, anyway.

matrixrising said:
First, the Lebron analogy still applies to classical physics.

Sure, but there are no weak measurements in classical physics (or only weak ones depending on how you define it). That is trivial.

matrixrising said:
So you can't say this doesn't make sense because it would be like Lebron scoring -475 in a game. Apples and oranges again. This is like asking why we don't see the superposition of a rock: it's because the rock is classical.

We do not "see" superpositions in quantum things like electrons either. If you directly measured an electron to be in a superposition, you would become pretty famous.

matrixrising said:
So you can't put a weak value on a quantum system in one-to-one correspondence with a classical score in a national basketball game. The analogy was that the single photon's wave function is proportional to the average the same way PPG is proportional to the season average.

The latter IS asserting a one-to-one correspondence between weak values and classical scores. And I agree with the first part: a weak value is not the same as a classical score. That is why your comparison of the two is meaningless.

matrixrising said:
Wrong.
[...]
This is the ball game. SCANNING THE WEAK MEASUREMENT THROUGH x GIVES YOU THE COMPLETE WAVE FUNCTION. AT EACH x (SINGLE PHOTON), THE OBSERVED SHIFTS OF THE MEASUREMENT POINTER ARE PROPORTIONAL TO THE REAL AND IMAGINARY PARTS OF THE WAVE FUNCTION!

What? What makes you think X is a single photon? You measure over an ensemble of single photons at each x.

matrixrising said:
The weak value could be 100, but this is easily explained by the Aharonov-Albert-Vaidman (AAV) effect. In a classical analogy, let's go back to Lebron's season PPG average. Say Lebron averages 25 PPG for the season, and you go back and see he had a game against the Spurs where he scored 5 points. Those 5 points would not take away his 25 PPG average. We call this an outlier.

No, as I told you earlier: you can only call this an outlier, in your sense, if it is within the realm of allowed and sensible values corresponding to elements of reality. 5 is an outlier. -20 is not, as it is not even an allowed score.

matrixrising said:
So this effect is an outlier and not really "weak", because all members of the selected ensemble happen to be in the appropriate post-selected state.

The quote above is correct. Your summary is not. In fact, the effect shows that this remarkable result should not be considered an outlier, and that one should not consider weak values as elements of reality (see my earlier quote from Vaidman). Outliers are rare events, but in line with the possible and allowed experimental results. Weak values are not bound to that range. When asking whether one can consider weak values as real results, one counterexample is enough.

I do not see your problem here. I do not deny that weak measurements are useful. I do not deny that the experiment is useful. I just deny that this experiment shows that one unambiguously has to interpret the wave function as real, and that single weak values have to be interpreted as real entities as a consequence of this experiment. I have given more than enough publications telling you otherwise. So, still: please tell me where they are wrong, and why one MUST interpret the wave function as real as a consequence of Lundeen's paper, instead of presenting an army of strawmen.
 
  • #117
cthugha,

What?

This one quote shows that you can't handle the truth. You said:

What? What makes you think X is a single photon? You measure over an ensemble of single photons at each x.

Of course you didn't quote Lundeen, and here's why:

The average result of the weak measurement of π_x is proportional to the wavefunction of the particle at x.

There are single photons at each x, because the wave functions of these single photons are measured at each x. These measurements of a particle's wave function are proportional to the average, as I have been saying. The paper goes on to say:

At each x, the observed position and momentum shifts of the measurement pointer are proportional to ReΨ(x) and ImΨ(x), respectively. In short, by reducing the disturbance induced by measuring X and then measuring P normally, we measure the wavefunction of the single particle.


It's what I've been saying.

I also think we have evidence that subatomic particles must be in a state of physical superposition: quantum computers. We couldn't carry out calculations on qubits if the subatomic particle weren't physically in different states at the same time. Physical systems store bits, whether it's a human brain, a computer, the event horizon of a black hole, or a subatomic particle.

I look at it like a building. When you're in front of the building, you can't see the sides or the back. So prior to measurement these real states are in superposition. This quantum building is in constant motion; unlike with a real building, a classical observer doesn't know whether they're getting out of the car at the front, side, or back of the building.

So the quantum building (wave function) must physically store these pure states (qubits).

Lundeen has just given us more confirmation that this quantum building (wave function) exists, by directly measuring the wave function of a single particle.
 
  • #118
The 3% you keep quoting comes from a selection of physicists. Don't take it as an accurate representation of the views of all quantum physicists until that survey has actually been done.
 
  • #119
The moderators have decided this thread is not progressing, and it will remain closed.
 