Is the Heisenberg uncertainty principle a problem of our measuring techniques?

In summary, the experiment shows that the uncertainty principle is not just an artifact of imperfect measuring techniques.
  • #36
Demystifier said:
I haven't seen the debate on this forum, but I can present a simple reason why Ballentine's experiment is NOT a measurement of momentum p_y. The point is that in this experiment p_y is NOT MEASURED but CALCULATED. Measurement and calculation are not the same. A calculation always contains an additional theoretical assumption which a true measurement does not need to use.

More specifically, p_y is calculated as
p_y = p sin theta
and this equation is correct only if one ASSUMES that the particle has been moving along a definite straight trajectory, with constant momentum, before it hit the detector (at position y = L tan theta). However, since such a trajectory has not been measured, there is no theory-independent justification for such an assumption.

If, on the other hand, one accepts Ballentine's experiment as a valid measurement of momentum, then it is equivalent to accepting the idea that particles may have trajectories even when they are not measured. That's fine, as long as one is aware that we do not have direct experimental confirmation that this idea is correct. Indeed, the trajectory tacitly assumed in Ballentine's experiment exactly coincides with the Bohmian trajectory. It is well known that Bohmian trajectories are consistent with QM, and yet a direct experimental verification of their reality does not exist.

So, loosely speaking, one could say that Ballentine's measurement of p_y assumes that the Bohmian interpretation is right, even if one is not aware of it.
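For concreteness, the inference under discussion is pure classical geometry. The sketch below (function name hypothetical, units arbitrary) maps a hit position y on a screen at distance L to the inferred transverse momentum, under exactly the straight-line-trajectory assumption Demystifier identifies:

```python
import math

def inferred_p_y(y, L, p):
    """Ballentine-style inferred transverse momentum.

    Assumes the particle travelled in a straight line from the slit to the
    hit position y on a screen at distance L, with constant total momentum
    p. Then y = L * tan(theta) and p_y = p * sin(theta).
    """
    theta = math.atan2(y, L)     # angle read off the hit position
    return p * math.sin(theta)   # equivalently: p * y / math.hypot(y, L)

# A hit on the axis implies zero transverse momentum under this assumption:
print(inferred_p_y(0.0, 1.0, 5.0))   # 0.0
# A hit at y = L implies theta = 45 degrees, so p_y = p / sqrt(2):
print(inferred_p_y(1.0, 1.0, 1.0))
```

The whole dispute is about whether the straight-path assumption encoded in this function is justified, not about the arithmetic itself.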

Is your objection in the same spirit as Bell's comment https://www.physicsforums.com/showpost.php?p=3428572&postcount=89, i.e. that a measurement should be able to produce a result for arbitrary states, not only for a special class of states (i.e. if we already know the state, we can calculate everything without measuring, so for a procedure to be called a measurement it should be prohibited from using any prior knowledge of the state)?
 
  • #37
atyy said:
Is your objection in the same spirit as Bell's comment https://www.physicsforums.com/showpost.php?p=3428572&postcount=89, i.e. that a measurement should be able to produce a result for arbitrary states, not only for a special class of states (i.e. if we already know the state, we can calculate everything without measuring, so for a procedure to be called a measurement it should be prohibited from using any prior knowledge of the state)?
No, it's unrelated.
 
  • #38
Fredrik said:
So let's call QM without this measuring technique QM1, and QM with it QM2. The predictions made by QM1 are a proper subset of the predictions made by QM2. QM2 simply makes more (testable) predictions than QM1.
Can QM2 be falsified (in the Popper sense) without QM1 being falsified? In other words, what result of an experiment would make you conclude that QM2 is wrong while QM1 is still potentially right?
 
  • #39
Demystifier said:
Can QM2 be falsified (in the Popper sense) without QM1 being falsified? In other words, what result of an experiment would make you conclude that QM2 is wrong while QM1 is still potentially right?
If we perform the experiment that Ballentine describes, and find that the distribution of results isn't given by [itex]p_y\mapsto|\langle p_y|\psi\rangle|^2[/itex], then I would say that QM2 has been falsified.

If there's a solid argument that shows that this would somehow falsify QM1 as well, then I would say that QM=QM2 is definitely the appropriate definition, regardless of what we find in the actual experiment. If there's no such argument, then I would define QM=QM1 if the experiment falsifies QM2, and QM=QM2 if it doesn't.
 
  • #40
Fredrik said:
If we perform the experiment that Ballentine describes, and find that the distribution of results isn't given by [itex]p_y\mapsto|\langle p_y|\psi\rangle|^2[/itex], then I would say that QM2 has been falsified.
Fine!

I haven't worked out the calculation in detail, but it is rather straightforward to see that in QM2 the probability of p_y is NOT [itex]|\langle p_y|\psi\rangle|^2[/itex]. Namely, in QM2 p_y is determined by the position y, so the probability of p_y is related to the probability of y, which is [itex]|\langle y|\psi\rangle|^2[/itex]. In other words, the probability amplitude of p_y in QM2 is related to [itex]\langle y|\psi\rangle[/itex] and not to its Fourier transform [itex]\langle p_y|\psi\rangle[/itex]. Thus, QM2 is already falsified, as it is already experimentally tested that the probability of y is given by [itex]|\langle y|\psi\rangle|^2[/itex].
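This near-field reasoning can be illustrated with the exactly solvable free Gaussian wavepacket (all symbols below are illustrative, with hbar = m = 1; a minimal sketch, not part of the thread). When the detector sits close to the slit, the spread of the inferred momentum p = y/t is essentially the rescaled position spread, which is far wider than the width of [itex]|\langle p_y|\psi\rangle|^2[/itex]:

```python
import math

sigma = 1.0                              # width parameter of psi_0 ~ exp(-y^2 / (2 sigma^2))
dy0 = sigma / math.sqrt(2)               # initial position spread  Delta y
dp_true = 1.0 / (sigma * math.sqrt(2))   # spread of |<p_y|psi>|^2  (Delta y * Delta p = 1/2)

def dp_inferred(t):
    """Spread of p_inferred = y / t, using the exact free-particle
    spreading  Delta_y(t) = Delta_y(0) * sqrt(1 + t**2 / sigma**4)."""
    return dy0 * math.sqrt(1.0 + t**2 / sigma**4) / t

t = 0.05                                 # detector very close to the slit
print(dp_inferred(t))                    # ~ dy0 / t : the position spread, rescaled
print(dp_true)                           # much narrower
```

So for a nearby detector the inferred-momentum distribution indeed tracks [itex]|\langle y|\psi\rangle|^2[/itex], not its Fourier transform, exactly as argued above.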
 
  • #41
Demystifier said:
I haven't worked out the calculation in detail, but it is rather straightforward to see that in QM2 the probability of p_y is NOT [itex]|\langle p_y|\psi\rangle|^2[/itex]. Namely, in QM2 p_y is determined by the position y, so the probability of p_y is related to the probability of y, which is [itex]|\langle y|\psi\rangle|^2[/itex]. In other words, the probability amplitude of p_y in QM2 is related to [itex]\langle y|\psi\rangle[/itex] and not to its Fourier transform [itex]\langle p_y|\psi\rangle[/itex]. Thus, QM2 is already falsified, as it is already experimentally tested that the probability of y is given by [itex]|\langle y|\psi\rangle|^2[/itex].

Actually, this was what I had in mind as a simple proof. The specific counterexample, that Ballentine is actually making an accurate measurement of the momentum at a different time, just makes sure there's no loophole in this argument.

I don't understand why you think the calculation objection isn't the same as the requirement that a measurement give a result for an arbitrary state. In an extreme example of the latter, knowing the state allows you to calculate everything, but those calculations are clearly not measurements.
 
  • #42
Demystifier said:
Fine!

I haven't worked out the calculation in detail, but it is rather straightforward to see that in QM2 the probability of p_y is NOT [itex]|\langle p_y|\psi\rangle|^2[/itex]. Namely, in QM2 p_y is determined by the position y, so the probability of p_y is related to the probability of y, which is [itex]|\langle y|\psi\rangle|^2[/itex]. In other words, the probability amplitude of p_y in QM2 is related to [itex]\langle y|\psi\rangle[/itex] and not to its Fourier transform [itex]\langle p_y|\psi\rangle[/itex]. Thus, QM2 is already falsified, as it is already experimentally tested that the probability of y is given by [itex]|\langle y|\psi\rangle|^2[/itex].
That makes a lot of sense. I might just have to do those calculations. Maybe it still works in an appropriate limit, either L→∞ or L→0.

(L = distance from the slit to the photographic plate/wall of detectors/whatever).
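The large-L (equivalently, long flight time) limit can be checked numerically: under free evolution, the arrival-position distribution re-expressed through p = y/t converges to [itex]|\langle p|\psi\rangle|^2[/itex], the standard time-of-flight result. A sketch with hbar = m = 1 and hypothetical grid parameters, not a definitive implementation:

```python
import numpy as np

# Grid (sized so the spreading packet stays well inside the box)
N, box = 8192, 1200.0
dy = box / N
y = (np.arange(N) - N // 2) * dy
p = 2 * np.pi * np.fft.fftfreq(N, d=dy)          # conjugate momentum grid

sigma = 1.0
psi0 = (np.pi * sigma**2) ** -0.25 * np.exp(-y**2 / (2 * sigma**2))

# True momentum distribution |<p|psi>|^2, normalized on the p grid
phi = np.fft.fft(psi0) * dy / np.sqrt(2 * np.pi)
order = np.argsort(p)
p_s, P_true = p[order], np.abs(phi[order]) ** 2
dp = p_s[1] - p_s[0]

def inferred_dist(t):
    """Distribution of p_inferred = y / t implied by arrival positions,
    after exact free evolution (kinetic phase applied in momentum space)."""
    psi_t = np.fft.ifft(np.fft.fft(psi0) * np.exp(-0.5j * p**2 * t))
    dens = np.abs(psi_t) ** 2                    # |psi(y, t)|^2
    return t * np.interp(p_s * t, y, dens)       # Jacobian: dy = t * dp

for t in (1.0, 10.0, 100.0):
    tv = 0.5 * np.sum(np.abs(inferred_dist(t) - P_true)) * dp
    print(f"t = {t:6.1f}   total-variation distance = {tv:.4f}")
```

The distance shrinks as t grows, so the inferred distribution disagrees with [itex]|\langle p_y|\psi\rangle|^2[/itex] at finite L but converges to it as L→∞, consistent with the hope expressed above.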
 
  • #44
I definitely get your point, but I would still like to add this subtle note:

Inference makes use of inference rules, priors, and new information (input) to produce an output (a posterior), quite like a "calculation".
So even merging RAW data from a detector requires equivalents of "calculation", as it relies on priors and inference rules.
Demystifier said:
The point is that in this experiment p_y is NOT MEASURED but CALCULATED. Measurement and calculation are not the same. A calculation always contains an additional theoretical assumption which a true measurement does not need to use.
This leads me to question the notion of "true measurements". Are there any true measurements that do not rely on prior "assumptions" (read: prior state + current state of inference rules)? I don't think so, because even the simple act of ENCODING, PROCESSING and STORING the raw sequential information about detector hits requires processing that is on par with "calculations".

I'm arguing for a much more extreme version of what is hinted at by others... for example, in Smolin's principle of relativity paper (http://arxiv.org/abs/1101.0931) the authors express the idea that there is no direct observation of spacetime events; they say "The idea that we live in a spacetime is constructed by inference from our measurements of momenta and energy"

I think what they hint at should be taken further. The observation then is that some "calculations" (inference rules) seem to be favoured by nature, while others aren't. The idea would be that there is a difference between random ad hoc calculations containing wild and irrational assumptions, and the kind of "rational inference" you would expect, for example, from a "rational player" who recalculates his odds according to probability theory. Here his "calculation" is uniquely determined by the same conditions that single out probability as a unique system for manipulating degrees of belief in a rational way - in this generalized sense he then "measures" the odds!
http://arxiv.org/abs/0908.3212
http://bayes.wustl.edu/etj/reviews/review.cox.pdf
Ariel's idea is that the laws of physics - corresponding to the "deductive machinery" we use for "calculations", and that nature ITSELF also uses for interactions - follow from unique rules of rational inference.

Thus my take on this example was: if we take the measurement of y and p_y to be a form of general inference (rather than a specific QM "measurement" on a well-defined regular STATE), what is the correct inference (calculation) to use, and what is the generalization of STATE we need when we are mixing information from a complex apparatus rather than a pointlike detector?

My suggestion was that the state represented by combining information and inference in the manner of Ballentine's example should include L tan theta as the uncertainty of y - if we do, the HUP still holds, provided we also add that the position of the particle hit is never known more accurately than, say, the order of its de Broglie wavelength, no matter how pointlike the actual sensor is.

So if we require that the "calculation" follows some kind of rational inference, then I think it would also qualify as a generalized measurement (not measurement in the sense of observing a detector hit, but measurement as "inferring an information state").

Then I don't think one needs to assume that the particle follows a certain path between slit and screen, because the information just tells us that we have initial and final points, and all we can infer is an expectation of some "average path", but then also an average position y during the process.

These aren't formal arguments, and to make them fully formal one would need to work out this generalized formalism. But intuitively I think the concept of generalized measurements makes sense, and if QM can't handle it, then I think we need to improve it.

/Fredrik
 
  • #45
Fra, I agree with you that all measurements involve some additional assumptions and even calculations. However, a "true" measurement should give useful information that can be used for predictions about later measurements, as I explained in post #32.
 
  • #46
I agree predictivity is important. I'm not prepared to be very explicit, but a long time ago I noted to you that there are some similarities between "hidden variables" and the subjective inferences in my view: the subjective views do rule the local action, but since they are not objective, they are "hidden" in the obvious sense. Although I know we probably represent opposite extremes in our visions.

My idea is that the SPECIFIC calculation (specific inference rules, in my language) does lead to predictions in terms of how the system encoding the inference rules behaves from the perspective of another observer. In plain terms, it means that the classical input, such as Hamiltonians, is EXPLAINED in terms of rational inference. The specific Hamiltonian itself encodes the system's "inference rules" and defines how it responds to the environment.

The entire picture here is quite involved and has several problems, but a key to making progress in this view is to understand these "generalized inferences". Actually, the notion of "prediction" is generalized as well, because the way to falsify this is not in the simple sense of failing a single prediction; the falsification requires that the observer completely FAILS to learn from input - this means his rational inference is unfit. The same mechanism is the idea of why nature itself (such as atoms and all of SM physics) is "tuned" for certain specific rules of inference.

I don't think it's possible to explain this using the simple rule
if inference(i) <> observation(i) => inference is false;
because the interesting thing is that a good inference is self-tuning and will converge to correct observations once it has thrived on a sufficient amount of data - i.e. it requires adaptation.

In my view, only failing to adapt means "falsified"; being wrong in single predictions is all part of the interaction.

I know this is the opposite of BM thinking, but the common factor is "hidden logic". In my view the point is that even the "space of logic" is not known; therefore any attempt or ansatz to assume a probability distribution over the set of inference rules will fail. It's just not how it works. Yet the hidden inference rules are important.

Edit: I guess one could also say that the "local decision problem" is exactly what is hidden; yet this is exactly what explains the interactions.

/Fredrik
 