Nick Herbert's Proof: Quantum Non-Locality Explained

  • Thread starter harrylin
  • Tags
    Proof
In summary: One general issue raised by the debates over locality is to understand the connection between stochastic independence (probabilities multiply) and genuine physical independence (no mutual influence). It is the latter that is at issue in "locality," but it is the former that goes proxy for it in the Bell-like calculations. The argument presented in the linked article seems convincing.
  • #36
harrylin said:
It now appears to me that in reality he proved that idealised quantum theory (but perhaps not even realistic quantum theory) makes surprising predictions...
The QM-predicted correlation isn't surprising if one takes into account the known behavior of light. Herbert's conclusion that the correlation between θ and the rate of coincidence detection should be linear goes against what's known about light. So, it would seem that Herbert's conclusion is the more surprising one.
 
  • #37
gill1109 said:
I think that Bell's (and Herbert's) arguments show that nature is non-classical. It is very definitely non-deterministic.
I think you're reading too much into it. We know that LR models give incorrect predictions. What is it in the models that causes their predictions to diverge from QM and experimental results? When this is ascertained, then the question is whether this informs wrt deep reality -- which, imo, it doesn't.

gill1109 said:
And this randomness is at the heart of quantum mechanics, therefore at the heart of chemistry, therefore at the heart of life; also because it's at the heart of quantum mechanics, it's at the heart of cosmology, at the heart of the existence of the universe as we know it.

I find that a rather exciting thought.
I find it rather an unwarranted stretch. Randomness refers to unpredictable (experimental) phenomena.
 
  • #38
harrylin said:
Realism is definitely assumed in simulations that make use of detection time windows; and surely the detection times at A are not affected by the detection times at B. Such simulations demonstrate that there has to be a glitch in this nice looking proof by Herbert... and as you and gill say, it's similar with Mermin's example as well as with Bell's calculation. There is thus a glitch in all these "proofs" that I just don't get... :rolleyes:

The glitch in all these "proofs" is that they accept non-locality, which has no known mechanism, and is bizarre. For example, from Herbert:
No local reality can explain these facts. (yet)
Therefore reality is non-local
 
  • #39
One thing I've failed to get clear is why locality meant additive (25+25 and 30+30) while non-locality was "proven" by the sinusoidal wave results. Given the results, anything local must explain why the results are sinusoidal. Yet in the proof, the calibration, which is purely local, does indeed produce a sinusoidal wave result. Which brings me back round to the question of why the local expectation is additive and not sinusoidal, for the non-polarised light during the experiment.

I may have spelt "sinusoidal" correctly once out of the five times I used it...
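For what it's worth, the additive-versus-sinusoidal tension can be put in numbers. Here is a minimal sketch, assuming the standard sin²θ coincidence-mismatch rate for this kind of polarizer setup (my own illustration, not code from the article):

```python
import math

# QM predicts a coincidence mismatch rate of sin^2(theta) when the
# two polarizers differ by an angle theta (for this kind of setup).
def qm_mismatch(theta_deg):
    return math.sin(math.radians(theta_deg)) ** 2

m30 = qm_mismatch(30)  # one detector turned by 30 degrees
m60 = qm_mismatch(60)  # both turned, 60 degrees apart

print(f"mismatch at 30 deg: {m30:.2f}")        # 0.25
print(f"mismatch at 60 deg: {m60:.2f}")        # 0.75
print(f"additive local bound: {2 * m30:.2f}")  # 0.50
```

So the "additive" expectation (25% + 25% = 50%) is the local ceiling, while the sinusoidal curve gives 75% at 60 degrees, which is exactly the excess at issue.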
 
  • #40
When I said randomness I did not refer to unpredictable (experimental) phenomena. When you toss a coin, the result depends deterministically on the initial conditions. That is familiar everyday randomness which is merely practical unpredictability.

QM on the other hand says that nature is intrinsically random. There is no hidden layer "explaining" what actually will happen. The randomness is spontaneous. Inexplicable. Without antecedent. Effects without a cause.
 
  • #41
morrobay said:
The glitch in all these "proofs" is that they accept non-locality, which has no known mechanism, and is bizarre. For example, from Herbert:
No local reality can explain these facts. (yet)
Therefore reality is non-local
Actually, zonde pointed out in post #26 that Herbert had the facts wrong. At best there are only a few published experiments that "closed the detection loophole", and I suspect that they were done with different set-ups than the one on which he based his proof.

However, you do put your finger on another weak point in the proof. As you say, non-locality has no known mechanism and as far as I know, no non-local model exists that could explain these non-facts. Which shows that the reasoning [no local reality can explain these facts, THEREFORE reality is non- local], is flawed. And this was to be expected: flawed reasoning is typical for paradoxes.
 
  • #42
salvestrom said:
One thing I've failed to get clear is why locality meant additive (25+25 and 30+30) while non-locality was "proven" by the sinusoidal wave results. Given the results, anything local must explain why the results are sinusoidal. Yet in the proof, the calibration, which is purely local, does indeed produce a sinusoidal wave result. Which brings me back round to the question of why the local expectation is additive and not sinusoidal, for the non-polarised light during the experiment.

I may have spelt "sinusoidal" correctly once out of the five times I used it...
It sounds to me as if you misunderstand Herbert's argument, and perhaps also what is meant by "localist". Please have a look at my post #20, in which I elaborated on Herbert's proof. Do you see an error, assuming that step 1 is correct?
 
  • #43
Harrylin, that's the whole point, that quantum non-locality (violation of Bell etc) has no known (local) mechanism.

There are non-local models a-plenty which reproduce the predictions of quantum mechanics, for instance Bohmian models.

On the other hand, there are *no* published experiments which closed the detection loophole while at the same time having the measurement completed in each wing of the experiment, before the chosen setting in the other could have become known. Every published experiment to date does allow a local realistic explanation. But none of them are plausible. If messages can be sent faster than the speed of light in order to engineer the singlet correlations, why does nature not also use this to create action at a distance? The hidden layer where instantaneous communication takes place is still mysteriously insulated from the "real world". (QM does not allow instant messaging, for instance.) And is it really plausible that the physical mechanism of tossing a coin to choose a setting on one apparatus is linked to the physical mechanism of the measured polarization of a photon far away?
 
  • #44
gill1109 said:
Harrylin, that's the whole point, that quantum non-locality (violation of Bell etc) has no known (local) mechanism.

There are non-local models a-plenty which reproduce the predictions of quantum mechanics, for instance Bohmian models.

On the other hand, there are *no* published experiments which closed the detection loophole while at the same time having the measurement completed in each wing of the experiment, before the chosen setting in the other could have become known. Every published experiment to date does allow a local realistic explanation. But none of them are plausible. If messages can be sent faster than the speed of light in order to engineer the singlet correlations, why does nature not also use this to create action at a distance? The hidden layer where instantaneous communication takes place is still mysteriously insulated from the "real world". (QM does not allow instant messaging, for instance.) And is it really plausible that the physical mechanism of tossing a coin to choose a setting on one apparatus is linked to the physical mechanism of the measured polarization of a photon far away?
Thanks for reminding me of Bohmian models!
To me all those explanations ("localist" as well as "non-localist") seem implausible; only the hypothesis of a "non-localist" cause seems much more implausible to me than that of a "localist" cause. Of course, such estimations are very personal. For example, if no plausible explanation could be found for this trick, then some people may find magic (that is, the usage of unknown laws of physics) the most plausible, while I will keep on looking for a more down-to-earth explanation. Maybe I'm too stubborn? :rolleyes:

Note that I don't think that nature is playing tricks on us; it's more us playing tricks on ourselves, due to misinterpretation of what we see.
 
  • #45
harrylin, this point was already made by gill1109 and DrChinese, but just to reiterate the place where Herbert invokes counterfactual definiteness AKA realism, is when he says that the probability of mismatch at -30 and 30 is less than or equal to the probability of mismatch at -30 and 0 plus the probability at 0 and 30. So Herbert is assuming it makes sense to ask what *would have* occurred if you oriented one of the SPOT detectors at a 0 degree angle, even when the detectors are actually oriented at -30 degrees and 30 degrees. So the assumption is that regardless of what measurements you actually do, there are still well-defined answers (although unknown to the experimenters) for the results of measurements you did not do.

That's why quantum mechanics itself does not fall victim to Bell's theorem: because it's not realistic. If you measure the position of a particle, in QM it doesn't make sense to ask what result you would have gotten if you had instead measured momentum.
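The inequality step can be checked mechanically. A toy sketch (my own, under the counterfactual-definiteness assumption): give every pair predetermined answers for all three settings and enumerate the possibilities:

```python
from itertools import product

# Under realism, each pair carries predetermined results
# (r_m30, r_0, r_30) for the settings -30, 0, +30; perfect
# correlation at equal angles means both photons share one triple.
for r_m30, r_0, r_30 in product([0, 1], repeat=3):
    outer = int(r_m30 != r_30)    # mismatch at (-30, +30)
    left = int(r_m30 != r_0)      # mismatch at (-30, 0)
    right = int(r_0 != r_30)      # mismatch at (0, +30)
    assert outer <= left + right  # holds for all 8 triples

print("P(-30,30) <= P(-30,0) + P(0,30) for any mixture of triples")
```

Since the bound holds for every single triple, it survives averaging over any probability distribution of triples, which is why realism plus locality forces the inequality.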
 
  • #46
lugita15 said:
That's why quantum mechanics itself does not fall victim to Bell's theorem: because it's not realistic. If you measure the position of a particle, in QM it doesn't make sense to ask what result you would have gotten if you had instead measured momentum.
I agree with you on this (finally, eh? :smile:).
 
  • #47
gill1109 said:
When I said randomness I did not refer to unpredictable (experimental) phenomena. When you toss a coin, the result depends deterministically on the initial conditions. That is familiar everyday randomness which is merely practical unpredictability.

QM on the other hand says that nature is intrinsically random. There is no hidden layer "explaining" what actually will happen. The randomness is spontaneous. Inexplicable. Without antecedent. Effects without a cause.
The words spontaneity and randomness refer to our ignorance of, and inability to specify the mechanics of, an assumed (if only tacitly) local deterministic evolution of a system from a prior state.

If QM isn't a realistic theory, then it can't be saying much, if anything, about deep reality. To paraphrase a statement by Bohm from his 1950 textbook, maybe a more appropriate name for the theory would be quantum nonmechanics.

It's interesting to me that LR models of individual detection are compatible with QM. That is, in the case of individual detection, in the words of J. S. Bell:
So in this simple case there is no difficulty in the view that the result of every measurement is determined by the value of an extra variable, and that the statistical features of quantum mechanics arise because the value of this variable is unknown in individual instances.
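Bell's single-detector point is easy to demonstrate with a toy model. A hypothetical sketch (the λ = (θ0, u) construction and all names are mine, not Bell's notation): let the hidden variable carry the polarization angle plus one extra uniform value, and make each outcome a deterministic function of it:

```python
import math
import random

random.seed(1)

def detect(a_deg, lam):
    # lam = (theta0, u): the photon's polarization angle plus a
    # hidden uniform value; the click is fully determined by lam.
    theta0, u = lam
    return u < math.cos(math.radians(a_deg - theta0)) ** 2

# Photons polarized at 20 degrees, analyzer at 50 degrees:
n = 200_000
hits = sum(detect(50.0, (20.0, random.random())) for _ in range(n))
rate = hits / n
print(f"detection rate: {rate:.3f}")  # Malus's law: cos^2(30 deg) = 0.75
```

Single-wing statistics come out exactly as QM (Malus's law) predicts, with every individual outcome determined by the extra variable, which is the "no difficulty" Bell refers to.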

So, why is it that the joint (entanglement) observational context is impossible to viably describe in the same LR terms that, wrt individual measurements, are compatible with QM?

The most parsimonious working hypothesis would seem to me to be that there's something about the encoding of the standard LR modelling requirements that is at odds with the experimental design of Bell tests. If so, then BI violations wouldn't be informing wrt deep reality -- while still definitively ruling out a certain class of LR models of quantum entanglement.

Ascertaining the precise source of the assumed discrepancy has been the subject of much debate. I have my own ideas on it, but they're not rigorously developed, certainly not definitive, and the possibility remains that nature might be nonlocal. But, while that's a possibility, I don't think it's the best working hypothesis. So, with many others, I continue to assume that our universe is evolving deterministically in accordance with the principle of locality.

Herbert's line of reasoning, which fails to take into account the known behavior of light (wrt crossed polarizers), certainly doesn't rule out those assumptions.
 
  • #48
lugita15 said:
[..] just to reiterate the place where Herbert invokes counterfactual definiteness AKA realism, is when he says that the probability of mismatch at -30 and 30 is less than or equal to the probability of mismatch at -30 and 0 plus the probability at 0 and 30.
What I found so great about Herbert's proof is that it doesn't invoke lambdas, and not even probabilities, but just direct comparisons of statistical measurement data. That is of course strongly related to probabilities, but it's great not to have to make that step.
So Herbert is assuming it makes sense to ask what *would have* occurred if you oriented one of the SPOT detectors at a 0 degree angle, even when the detectors are actually oriented at -30 degrees and 30 degrees. So the assumption is that regardless of what measurements you actually do, there are still well-defined answers (although unknown to the experimenters) for the results of measurements you did not do. [...]
More or less so. Herbert simply uses the "fact" that when "aligning both SPOT detectors, No errors are observed." Thus his assumption is that detector settings do not affect whatever is sent towards the detectors, if that is what you mean.

Also, the successful "local realist" models that I referred to in my first post, no doubt do allow for well-defined answers (although unknown to the experimenters) for the results of measurements you did not do.
 
  • #49
ThomasT said:
[..] Herbert's line of reasoning, which fails to take into account the known behavior of light (wrt crossed polarizers), certainly doesn't rule out those assumptions.
Please elaborate - you seem to suggest to have spotted another flaw in Herbert's proof, but it's not clear to me what you mean.
 
  • #50
harrylin said:
More or less so. Herbert simply uses the "fact" that when "aligning both SPOT detectors, No errors are observed." Thus his assumption is that detector settings do not affect whatever is sent towards the detectors, if that is what you mean
No, that's not what I meant, but that's also an important assumption, known as the "no-conspiracy condition". There are people known as superdeterminists who try to get around Bell's theorem by violating this condition, e.g. by saying that the particles know in advance what the detector settings will be because the universe is totally deterministic, so the two particles use this information to coordinate in just the right way so that Bell's inequality appears to be false even though it would really be true if measurement decisions were free and independent. Superdeterminism is a pretty small fringe, but it counts Nobel laureate Gerard 't Hooft as one of its adherents.

Anyway, what I was talking about was when Herbert says this: "Starting with two completely identical binary messages, if A's 30 degree turn introduces a 25% mismatch and B's 30 degree turn introduces a 25% mismatch, then the total mismatch (when both are turned) can be at most 50%." He's assuming whenever you get a mismatch between the -30 degree polarizer and the 30 degree polarizer, this really represents a deviation of one of the polarizer measurements from the "identical binary messages" that would have been gotten if you had put both polarizers at 0 degrees. Without the assumption that there is counterfactual definiteness at 0 degrees, you can't conclude that the percentage (i.e. the probability) of mismatches at -30 and 30 is less than or equal to the percentage of mismatches at -30 and 0 plus the percentage of mismatches at 0 and 30.
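Herbert's arithmetic here can be simulated directly. A hedged sketch (mine, assuming, as one simple possibility among several, that each wing's 25% errors are independent bit-flips of the shared 0-degree message):

```python
import random

random.seed(0)
n = 100_000

# The shared "0-degree binary message": the counterfactually
# definite bits assumed to exist even when nobody measures at 0.
msg = [random.randint(0, 1) for _ in range(n)]

def turned(message, error_rate):
    # A 30-degree turn flips each bit with probability error_rate.
    return [bit ^ (random.random() < error_rate) for bit in message]

a = turned(msg, 0.25)  # A's wing at -30 degrees
b = turned(msg, 0.25)  # B's wing at +30 degrees
mismatch = sum(x != y for x, y in zip(a, b)) / n
print(f"mismatch with both turned: {mismatch:.3f}")
# Independent flips sometimes overlap and cancel, giving about 0.375;
# even perfectly non-overlapping flips could only reach 0.50.
```

However the two error patterns are arranged, a flipped-bits picture caps the combined mismatch at 50%, whereas QM predicts 75% at 60 degrees, which is exactly Herbert's contradiction.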
harrylin said:
Also, the successful "local realist" models that I referred to in my first post, no doubt do allow for well-defined answers (although unknown to the experimenters) for the results of measurements you did not do.
The "successful" local hidden variable models you're talking about, like the ones zonde was referring to, do not actually reproduce all the experimental predictions of quantum mechanics. Rather, they exploit some loophole or the other of Bell test experiments to say that Bell tests experiments to date have not definitively disproven their particular theories, but they claim that an "ideal experiment" would prove them right and QM wrong.

Remember, all Bell's theorem shows is that a local hidden variable theory cannot reproduce all the experimental predictions of QM. It says nothing at all about theories which claim that some of the predictions of QM are wrong and can in principle be disproven experimentally.
 
  • #51
"So Herbert is assuming it makes sense to ask what *would have* occurred if you oriented one of the SPOT detectors at a 0 degree angle, even when the detectors are actually oriented at -30 degrees and 30 degrees. So the assumption is that regardless of what measurements you actually do, there are still well-defined answers (although unknown to the experimenters) for the results of measurements you did not do."

That is it, spot on. That is what people call "realism". After that, the notion of "locality" is applied to those counterfactual outcomes of the non-performed measurements.
 
  • #52
Nobel laureate Gerard 't Hooft (who I have talked to about this a number of times) is a superdeterminist when we are talking about the quantum world and what might be below or behind it at even smaller scales; what he apparently can't realize is that Bell's argument applies to objects in the macroscopic world, or supposed macroscopic world - actual detector clicks and the clicks which the detectors would have made if they had been aligned differently.
 
  • #53
gill1109 said:
Nobel laureate Gerard 't Hooft (who I have talked to about this a number of times) is a superdeterminist when we are talking about the quantum world and what might be below or behind it at even smaller scales; what he apparently can't realize is that Bell's argument applies to objects in the macroscopic world, or supposed macroscopic world - actual detector clicks and the clicks which the detectors would have made if they had been aligned differently.

That is what I really don't "get" about 't Hooft's position: that there are essentially an infinite number of possible macroscopic "decision machines" that could be used to select detector alignment, and all of them must be "in" on the conspiracy.

For example, my aunt Miriam could make the decisions for one of the detectors, while the other is controlled by a computer which gets apparently random seeds from a Geiger counter near a radioactive sample. And yet he is saying these are not only predetermined, but acting in a coordinated manner.

Now if you knew my aunt Miriam, you would know how ridiculous this actually sounds. :smile: At any rate, it certainly implies an internal physical structure far beyond anything previously discovered. I would estimate that every particle must have some kind of local superdeterministic DNA to account for my Aunt Miriam and the radioactive sample. As well as for any other pairs of macroscopic selection devices, of which there would be many.
 
  • #54
lugita15 said:
No, that's not what I meant, but that's also an important assumption, known as the "no-conspiracy condition".
[..]
Anyway, what I was talking about was when Herbert says this: "Starting with two completely identical binary messages, if A's 30 degree turn introduces a 25% mismatch and B's 30 degree turn introduces a 25% mismatch, then the total mismatch (when both are turned) can be at most 50%." He's assuming whenever you get a mismatch between the -30 degree polarizer and the 30 degree polarizer, this really represents a deviation of one of the polarizer measurements from the "identical binary messages" that would have been gotten if you had put both polarizers at 0 degrees.
That's the subtle detail that I disagree with: he speaks not about "would have been gotten" but about what "are observed". I think that that is a stronger argument. :smile:
Without the assumption that there is counterfactual definiteness at 0 degrees, you can't conclude that the percentage (i.e. the probability) of mismatches at -30 and 30 is less than or equal to the percentage of mismatches at -30 and 0 plus the percentage of mismatches at 0 and 30. The "successful" local hidden variable models you're talking about, like the ones zonde was referring to, do not actually reproduce all the experimental predictions of quantum mechanics. [..]
Herbert's proof asserts something slightly different from Bell's theorem, as I emphasised earlier: his claim isn't about theory but about facts of nature. What I called "successful" is reproducing those measurement facts (real ones, as opposed to imagined ones) with a "local realistic" model, the kind of model that Herbert's proof asserts cannot possibly reproduce them.

It reminds me a bit of Ehrenfest's perfectly stiff disk: according to SR it cannot be made to rotate, but it has not been possible to disprove that aspect of SR - simply because SR contains the "loophole" that such a disk cannot be made. :wink:
 
  • #55
I don't see any difference between the theorem Herbert is proving and the one Bell is proving, especially in the light of Arthur Fine's (1982) theorem showing the equivalence of the CHSH inequalities and the existence of a joint probability distribution of the outcomes of all the different measurements on the two particles.
 
  • #56
harrylin said:
That's the subtle detail that I disagree with: he speaks not about "would have been gotten" but about what "are observed". I think that that is a stronger argument. :smile:
But if the detectors are oriented at -30 degrees and 30 degrees, speaking about 0 degrees is clearly counterfactual reasoning. Herbert refers to the "binary message", the sequence of 0's and 1's you would have gotten if you oriented the detectors 0 degrees, and he considers mismatches between the -30 degree detector and the 30 degree detector to arise from deviations from this initial binary sequence. That is how he is able to say that a mismatch between -30 and 30 requires a mismatch between -30 and 0 or a mismatch between 0 and 30, and thus the percentage of mismatches between -30 and 30 is less than or equal to the percentage of mismatches between -30 and 0 plus the percentage of mismatches between 0 and 30.
Herbert's proof asserts something slightly different from Bell's theorem, as I emphasised earlier: his claim isn't about theory but about facts of nature.
But the "facts of nature" that Herbert discusses have not been entirely confirmed by experiments in a way that skeptics cannot dispute. If you ask zonde, he will insist vehemently that current experiments do not allow you to definitively test the claim of quantum mechanics that entangled photons exhibit identical behavior at identical angles, due to various loopholes like fair sampling and detector efficiency that currently practical Bell tests fall victim to. But what Herbert is showing, and I think Bell was showing the same thing, is that if we accept that quantum mechanics is completely right about all its experimental predictions, like identical behavior at identical angles, then no local hidden variable theory will be able to account for all of these facts of nature.
What I called "successful" is reproducing those measurement facts (real ones, as opposed to imagined ones) with a "local realistic" model, the kind of model that Herbert's proof asserts cannot possibly reproduce them.
But in Herbert's proof, we are talking about "imagined" measurement facts, at least for now, because the experiment he discusses is an ideal Bell test free from experimental loopholes, and we haven't done such a perfect experiment yet (although we're getting there...). But you're right, if the empirical facts of nature are as Herbert (and quantum mechanics) say they are, then the thesis that reality is local can be deemed rejected.
 
  • #57
gill1109 said:
Nobel laureate Gerard 't Hooft (who I have talked to about this a number of times) is a superdeterminist when we are talking about the quantum world and what might be below or behind it at even smaller scales; what he apparently can't realize is that Bell's argument applies to objects in the macroscopic world, or supposed macroscopic world - actual detector clicks and the clicks which the detectors would have made if they had been aligned differently.
How did 't Hooft respond when you brought up this point to him?
 
  • #58
't Hooft didn't understand the point. Nor did he when other colleagues tried to explain it to him.

About Herbert's proof: Bell's theorem is about counterfactual outcomes of measurements that were not performed. Herbert is careless in his language (or is not being explicit enough). By definition, no experiment can ever prove the theorem.

Experiments can merely confirm the predictions of QM. Good experiments do that in situations which rule out local realist explanations via e.g. exploitation of the detection loophole, or through the setting in one wing of the experiment being in principle available in the other wing before conclusion of the measurement. Good experiments incorporate as physical constraints, the assumptions which are made in the proof.

E.g. The outcomes are +1 or -1; not +1 or -1 or "no show". The function A *can't* depend on b because the value of b can't be available...
 
  • #59
lugita15 said:
But if the detectors are oriented at -30 degrees and 30 degrees, speaking about 0 degrees is clearly counterfactual reasoning. Herbert refers to the "binary message", the sequence of 0's and 1's you would have gotten if you oriented the detectors 0 degrees, [..]
There you go again! And again I must reply: no, he refers to the sequence that he claims that you obtain each time when you orient the detectors 0 degrees. That is not about a conditional, hypothetical experience of a non-observed photon, but a factual experience of observed events. I found that really nice.
and he considers mismatches between the -30 degree detector and the 30 degree detector to arise from deviations from this initial binary sequence. That is how he is able to say that a mismatch between -30 and 30 requires a mismatch between -30 and 0 or a mismatch between 0 and 30, and thus the percentage of mismatches between -30 and 30 is less than or equal to the percentage of mismatches between -30 and 0 plus the percentage of mismatches between 0 and 30. But the "facts of nature" that Herbert discusses have not been entirely confirmed by experiments in a way that skeptics cannot dispute. If you ask zonde, he will insist vehemently that current experiments do not allow you to definitively test the claim of quantum mechanics that entangled photons exhibit identical behavior at identical angles, due to various loopholes like fair sampling and detector efficiency that currently practical Bell tests fall victim to.
What mattered to me was that Herbert made a seemingly rock solid claim about Nature and possible models of Nature that has been falsified - and I was frustrated because I did not find the error. Zonde was so kind to point the error out to me.
But what Herbert is showing, and I think Bell was showing the same things, is that if we accept that quantum mechanics is completely right about all its experimental predictions, like identical behavior at identical angles, then no local hidden variable theory will be able to account all of these facts of nature.
From reading up on this topic I discovered that there is some fuzziness about what exactly QM predicts for some real measurements; but the models that I heard about accurately reproduce what is measured in a typical Herbert set-up. Following your logic, we should conclude that QM is wrong. However, I think that that is not necessarily the case.
But in Herbert's proof, we are talking about "imagined" measurement facts, at least for now, because the experiment he discusses is an ideal Bell test free from experimental loopholes, and we haven't done such a perfect experiment yet (although we're getting there...). But you're right, if the empirical facts of nature are as Herbert (and quantum mechanics) say they are, then the thesis that reality is local can be deemed rejected.
Well Herbert fooled me there - and apparently he was fooled himself. :rolleyes:
 
  • #60
harrylin said:
From reading up on this topic I discovered that there is some fuzziness about what exactly QM predicts for some real measurements...

I am not aware of any controversy with regards to the predictions of QM in any particular setup. Every experimental paper (at least those I have seen) carefully compares the QM predictions to actual results, usually in the form of a graph and an accompanying table. These are peer-reviewed.
 
  • #61
harrylin said:
There you go again! And again I must reply: no, he refers to the sequence that he claims that you obtain each time when you orient the detectors 0 degrees. That is not about a conditional, hypothetical experience of a non-observed photon, but a factual experience of observed events. I found that really nice.
Let me try again. This is the crucial step where counterfactual definiteness is invoked: "Starting with two completely identical binary messages, if A's 30 degree turn introduces a 25% mismatch and B's 30 degree turn introduces a 25% mismatch." He is very clear, A's 30 degree turn introduces a 25% mismatch from the 0 degree binary message. He is saying this even in the case when B has also turned his detector, so that no one is actually measuring this particular bit of the 0 degree binary message. The only bits of the 0 degree binary message that are actually observed are the ones for which one of the detectors was turned to 0 degrees. And yet he is asserting that even when neither of the detectors are pointed at 0 degrees, the mismatches between the two detectors still represent errors from the 0 degree binary message. Isn't discussion of deviation from unmeasured bits of a binary message a clear case of counterfactual definiteness, AKA realism?
What mattered to me was that Herbert made a seemingly rock solid claim about Nature and possible models of Nature that has been falsified
Perhaps Herbert should have phrased his claim slightly less boldly, because various practical loopholes make it hard to perfectly do the experiment he is talking about. But while it is true that experimental limitations prevent us at the current moment from absolutely definitively ruling out all local hidden variable models, we're getting there quickly, as I think zonde has said.
- and I was frustrated because I did not find the error. Zonde was so kind to point the error out to me.
Herbert is not making any "errors". The main point of the proof, even if Herbert didn't state it quite like this, is to show that unless quantum mechanics is wrong about the experimental predictions it makes concerning entanglement, we can deem local hidden variable models to be ruled out.
From reading up on this topic I discovered that there is some fuzziness about what exactly QM predicts for some real measurements;
No, there isn't.
but the models that I heard about accurately reproduce what is measured in a typical Herbert set-up.
First of all, the term "Herbert set-up" is a bit cringe-inducing; as Herbert himself says, "It has appeared in some textbooks as "Herbert's Proof" where I would have preferred "Herbert's Version of Bell's Proof"". (And as I told you before, although Herbert apparently came up with it independently, the -30, 0, 30 example was the one used by Bell when he tried to explain his proof to popular audiences.)

But anyway, you're right that there are local hidden variable models that are not unequivocally ruled out by currently practical Bell tests. But that probably says more about current experimental limitations than it does about the success of those models.
Following your logic, we should conclude that QM is wrong.
No, we shouldn't. If a perfect, loophole-free Bell test, like the one Herbert envisions, gave results consistent with the possibility of a local hidden variable model, then yes there may be just cause to abandon QM. But until that time, how can you conclude such a thing from the logic?
Well Herbert fooled me there - and apparently he was fooled himself. :rolleyes:
No, I don't think so. The only point I'd concede is that he might want to qualify his remarks in all caps that "NO CONCEIVABLE LOCAL REALITY CAN UNDERLIE THE LOCAL QUANTUM FACTS." If he added "ASSUMING THAT THEY ARE INDEED FACTS, WHICH THEY SEEM TO BE", then it would be fine.
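For concreteness, the numbers in the argument being debated here can be checked against the standard quantum prediction for polarization-entangled photons. The following is a minimal sketch (my own illustration, not code from Herbert), assuming the usual sin² mismatch law:

```python
import math

def qm_mismatch(theta_deg):
    """QM-predicted mismatch rate between the two binary messages
    when the detector settings differ by theta degrees (sin^2 law
    for polarization-entangled photons)."""
    return math.sin(math.radians(theta_deg)) ** 2

single_turn = qm_mismatch(30)   # 25%: one detector turned by 30 degrees
local_bound = 2 * single_turn   # 50%: local realism says errors merely add
both_turned = qm_mismatch(60)   # 75%: QM prediction with both turned

# QM's 75% exceeds the 50% local-realist bound, which is the whole point:
assert both_turned > local_bound
print(f"{single_turn:.2f} {local_bound:.2f} {both_turned:.2f}")
```

The inequality in the last assertion is exactly the step where the quantum prediction and the "errors merely add" reasoning part ways.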
 
  • #62
DrChinese said:
I am not aware of any controversy with regards to the predictions of QM in any particular setup. Every experimental paper (at least those I have seen) carefully compares the QM predictions to actual results, usually in the form of a graph and an accompanying table. These are peer-reviewed.
I was thinking of, for example, Bell's idea about experiments that he seems to have thought are possible according to QM, but that until now have not been possible in reality; and of Weihs' experiment, which yields results for which the exact QM predictions are unclear at large time windows. Maybe we should start a discussion topic about that?
 
  • #63
lugita15 said:
Let me try again. This is the crucial step where counterfactual definiteness is invoked: "Starting with two completely identical binary messages, if A's 30 degree turn introduces a 25% mismatch and B's 30 degree turn introduces a 25% mismatch." He is very clear: A's 30 degree turn introduces a 25% mismatch from the 0 degree binary message. He is saying this even in the case when B has also turned his detector, so that no one is actually measuring this particular bit of the 0 degree binary message. The only bits of the 0 degree binary message that are actually observed are the ones for which one of the detectors was turned to 0 degrees. And yet he is asserting that even when neither of the detectors is pointed at 0 degrees, the mismatches between the two detectors still represent errors from the 0 degree binary message. Isn't discussion of deviation from unmeasured bits of a binary message a clear case of counterfactual definiteness, AKA realism?
Sorry, I never understood such discussions and words, which is why I prefer Herbert's formulation, and even Bell's. And I already stated how I interpret that: we assume that the rotation of the detector doesn't affect the stream of whatever is coming towards the detector. If people call that "counterfactual definiteness", that's fine with me. It's certainly what I call "local realism" aka "no spooky action at a distance".
[..]The only point I'd concede is that he might want to qualify his remarks in all caps that "NO CONCEIVABLE LOCAL REALITY CAN UNDERLIE THE LOCAL QUANTUM FACTS." If he added "ASSUMING THAT THEY ARE INDEED FACTS, WHICH THEY SEEM TO BE", then it would be fine.
Sure - my point was that he presented non-facts as facts, and I fell into that trap.
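If the counterfactual-definiteness step discussed above still feels slippery, it can be checked mechanically: assume each pair carries a predetermined bit for each of the three angles, whether measured or not, and enumerate every possibility. This is a brute-force sketch of the reasoning, not anything from Herbert's own text:

```python
from itertools import product

# Under counterfactual definiteness each photon pair has a definite,
# predetermined outcome bit at every detector angle, measured or not.
# Check the error bound for all 2^3 possible assignments at the three
# angles (-30, 0, +30 degrees).
for a, b, c in product([0, 1], repeat=3):
    turn_a = int(a != b)   # mismatch introduced by A's 30-degree turn
    turn_b = int(b != c)   # mismatch introduced by B's 30-degree turn
    both   = int(a != c)   # mismatch with both detectors turned
    # a != c forces a != b or b != c, so the combined error can never
    # exceed the sum; averaging over any distribution of assignments
    # preserves the inequality.
    assert both <= turn_a + turn_b
print("bound holds for all 8 assignments")
```

Since the bound holds for every individual assignment, it holds for any statistical mixture of them, which is all a local hidden variable model amounts to here.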
 
  • #64
harrylin said:
Please elaborate - you seem to suggest to have spotted another flaw in Herbert's proof, but it's not clear to me what you mean.
Not a flaw in Herbert's proof. But in his interpretation of the physical meaning of his proof.
 
Last edited:
  • #65
gill1109 said:
Nobel laureate Gerard 't Hooft (whom I have talked to about this a number of times) is a superdeterminist when we are talking about the quantum world and what might be below or behind it at even smaller scales; what he apparently can't realize is that Bell's argument applies to objects in the macroscopic world, or supposed macroscopic world: actual detector clicks, and the clicks which the detectors would have made if they had been aligned differently.
Of course. And that's all that Bell's theorem applies to. As billschneider has said repeatedly.

What Bell's theorem doesn't apply to, as far as anybody can ascertain, is whatever is happening in the reality underlying instrumental behavior. So, it doesn't inform wrt whether nature is local or nonlocal in that underlying reality.

't Hooft is a superdeterminist? Interesting. I would have thought him to have a better approach to the interpretation of Bell's theorem than that.
 
  • #66
ThomasT said:
Not a flaw in Herbert's proof. But in his interpretation of the physical meaning of his proof.
His proof is a proof (or so he claims) about the physical meaning of observations.
 
  • #67
harrylin said:
OK thanks for the clarification - that looks very different! :-p

So one then has in reality for example:
- step 1: 90% mismatch
- step 2: 92.5% mismatch
- step 3: 92.5% mismatch
- step 4: 97.5% mismatch.

Based on local reality and applying Herbert's approach, I find that in case of a mismatch of 90% at step 1, the mismatch of step 4 should be <= 185%. Of course, that means <=100%.
Note: in a subsequent thread on why hidden variables imply a linear relationship, it became clear that imperfect detection is not the only flaw in Herbert's claim about facts of measurement reality; another issue that Herbert missed is the effect of data picking such as with time coincidence windows.
- https://www.physicsforums.com/showthread.php?t=589923&page=6
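The "linear relationship" from hidden variables mentioned above can be reproduced with a toy local model: each pair carries a shared hidden polarization angle, and each detector's result depends only on its own setting and that angle. This is a sketch of the standard textbook-style model; the specific rule is my illustrative choice, not a model of any actual experiment:

```python
import math
import random

def outcome(setting_deg, lam_deg):
    """Deterministic local rule: the result at one detector depends
    only on its own setting and the shared hidden angle lam."""
    return 1 if math.cos(2 * math.radians(setting_deg - lam_deg)) > 0 else 0

def mismatch_rate(a_deg, b_deg, n=100_000, seed=1):
    rng = random.Random(seed)
    mismatches = 0
    for _ in range(n):
        lam = rng.uniform(0.0, 180.0)   # shared hidden variable per pair
        if outcome(a_deg, lam) != outcome(b_deg, lam):
            mismatches += 1
    return mismatches / n

# This model gives mismatch = theta/90, linear in the relative angle:
# roughly 1/3 at 30 degrees (QM: 1/4) and 2/3 at 60 degrees (QM: 3/4),
# saturating the "errors merely add" bound rather than violating it.
print(mismatch_rate(0, 30), mismatch_rate(0, 60))
```

The simulated 60-degree mismatch comes out at twice the 30-degree one, which is exactly the additive behavior the inequality permits and the quantum sin² curve does not follow.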
 
  • #68
harrylin said:
Note: in a subsequent thread on why hidden variables imply a linear relationship, it became clear that imperfect detection is not the only flaw in Herbert's claim about facts of measurement reality; another issue that Herbert missed is the effect of data picking such as with time coincidence windows.
- https://www.physicsforums.com/showthread.php?t=589923&page=6
There are, of course, numerous experimental loopholes in current Bell tests. Herbert isn't concerned with loopholes. The point is that local determinism is fundamentally in contradiction with the empirical predictions of QM. Whether this disagreement is practically testable given current experimental limitations is, to me, beside the point.
 
  • #69
lugita15 said:
There are, of course, numerous experimental loopholes in current Bell tests. Herbert isn't concerned with loopholes. The point is that local determinism is fundamentally in contradiction with the empirical predictions of QM. Whether this disagreement is practically testable given current experimental limitations is, to me, beside the point.
Obviously we continue to disagree about what Herbert claimed to have proved; but everyone can read Herbert's claims and we have sufficiently discussed that.
 
  • #70
harrylin said:
Obviously we continue to disagree about what Herbert claimed to have proved; but everyone can read Herbert's claims and we have sufficiently discussed that.
I completely agree with you that Herbert worded his conclusion a bit too strongly, because he took for granted that QM is correct in its experimental predictions, an assumption that has overwhelming evidence backing it up, but not definitive proof due to various loopholes like detector efficiency.
 