Joy Christian, Disproof of Bell's Theorem

In summary: the article discusses Joy Christian's latest paper, which purports to disprove Bell's theorem. It summarizes the paper and its contents, but offers no verdict on whether the paper is correct or significant.
  • #106


Hehe, what's funny is that as I found this paper on the archives yesterday, my first thought was: wow, DrChinese will find that funny.

On another note, I think the strong language at the end of the abstract suggests that some people in the community are starting to get annoyed by Joy Christian's continuing crusade against Bell. I guess he should maybe try to adopt a bit more humble attitude in the future (assuming he has one :-p )
 
  • #107


Delta Kilo said:
Told you so! :mad:

Ahead of the pack is a good place to be... :smile:
 
  • #108


Zarqon said:
Hehe, what's funny is that as I found this paper on the archives yesterday, my first thought was: wow, DrChinese will find that funny.

On another note, I think the strong language at the end of the abstract suggests that some people in the community are starting to get annoyed by Joy Christian's continuing crusade against Bell. I guess he should maybe try to adopt a bit more humble attitude in the future (assuming he has one :-p )

Heh, I'm so predictable...

Yes, I think the issue is: if someone (such as Christian) really has an angle on something, why not collaborate on it rather than this process of trying to upend something which has been thoroughly studied (Bell)? Every entanglement test shows the same pattern of impossibly high correlations, which again should be a tip-off. Some mathematical sleight of hand is not going to do it, there is going to need to be something very convincing - something like a new testable prediction.
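For anyone wanting a number to attach to "impossibly high": the CHSH combination of correlations is bounded by 2 for any local hidden variable model, while the quantum singlet prediction E(a,b) = -cos(a-b) reaches 2√2 at the standard angle choices. A quick numeric sketch (my own illustration of the textbook CHSH setup, not anything from the papers under discussion):

```python
import math

# QM prediction for the spin-1/2 singlet correlation
# at analyzer angles a, b (radians)
def E(a, b):
    return -math.cos(a - b)

# Standard CHSH angle choices that maximize the quantum value
a1, a2 = 0.0, math.pi / 2
b1, b2 = math.pi / 4, 3 * math.pi / 4

S = E(a1, b1) - E(a1, b2) + E(a2, b1) + E(a2, b2)
print(abs(S))  # ≈ 2.828, i.e. 2*sqrt(2), above the local bound of 2
```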
 
  • #109


DrChinese said:
Posted today by Richard Gill, of the Mathematical Institute:

http://arxiv.org/abs/1203.1504

Abstract:

"I point out a simple algebraic error in Joy Christian's refutation of Bell's theorem. In substituting the result of multiplying some derived bivectors with one another by consultation of their multiplication table, he confuses the generic vectors which he used to define the table, with other specific vectors having a special role in the paper, which had been introduced earlier. The result should be expressed in terms of the derived bivectors which indeed do follow this multiplication table. When correcting this calculation, the result is not the singlet correlation any more. Moreover, curiously, his normalized correlations are independent of the number of measurements and certainly do not require letting n converge to infinity. On the other hand his unnormalized or raw correlations are identically equal to -1, independently of the number of measurements too. Correctly computed, his standardized correlations are the bivectors - a . b - a x b, and they find their origin entirely in his normalization or standardization factors; the raw product moment correlations are all -1. I conclude that his research program has been set up around an elaborately hidden but trivial mistake. "

--------------------------------------------

It is interesting to add this note, addressed to those who suggest Jaynes is the only person who properly understands how probability applies to Bell's Theorem, entanglement, etc: Gill is also an expert in statistical theory, and has done extensive research in this area (including the application of Bayes). He apparently does not see the issue Jaynes does. Gill frequently collaborates with the top scientists in the study of entanglement, so I think it is safe to say this area has been well considered and has not been overlooked somehow.
I thought at first that Christian might be on to something, because I intuited a connection between his approach and mine. But, after further consideration, imho, his stuff is just too mathematically circuitous to be considered. I've read his papers and his replies to various discussions, and in none of it is there a clear explanation of why his stuff should be considered a local realistic model of quantum entanglement.
 
  • #110


Richard Gill's refutation is not a new critique. It is essentially the same as one of the critiques advanced by a certain Florin Moldoveanu in the fall last year to which Joy Christian has already replied (http://arxiv.org/abs/1110.5876). It originates from a misunderstanding of Joy's framework which admittedly is not very easy to understand especially for those who have blinders of one kind or another.

Gill thinks Joy is using a convoluted, more difficult method to do a calculation, and prefers a different method which ultimately leads him to a different result, not realizing/understanding that the calculation method Joy used is demanded by his framework. This is hardly a serious critique, not unlike his failed critique of Hess and Philipp. He should at least have read Joy's response to Moldoveanu, which he apparently did not, since he does not cite or mention it. It's been available since October 2011, one month after Moldoveanu posted his critique.

I remember Florin came here to boast about his critique and I pointed out his misunderstanding at the time in this thread: https://www.physicsforums.com/newreply.php?do=newreply&noquote=1&p=3806400

... you are missing the point because Joy Christian is not using handedness as a convention but as the hidden variable itself.
This is the same error Gill has made. See section (II) of Joy's response to Moldoveanu.
 
Last edited by a moderator:
  • #111
  • #112


bohm2 said:
More on this from Joy Christian and I don't understand any of it:

Refutation of Richard Gill's Argument Against my Disproof of Bell's Theorem
http://lanl.arxiv.org/pdf/1203.2529.pdf

Oh-ho, here we go again. No, Joy, measurement outcomes are not bivectors from the unit sphere, they are the numbers { -1, +1 }. That's how they are defined in Bell's paper and that is also the way they come out of experiments. Their mean is 0 and their standard deviation is 1. Not bivectors, just the numbers -1 and +1.

Joy Christian said:
with [itex]\sigma(A) = (−I · \textbf{a} )[/itex] and [itex]\sigma(B) = (+I · \textbf{b} )[/itex], respectively, being the standard deviations in the results A and B.
I can't be bothered anymore, but if you substitute [itex]I[/itex] and [itex]\textbf{a}[/itex] from definitions elsewhere in his paper, you will get [itex]\sigma(\textbf{a})=\sum a_{j}\beta_{j}[/itex] where [itex]a_{j}[/itex] are coefficients of unit vector [itex]\textbf{a}[/itex] and [itex]\beta_{j}[/itex] are "basis bivectors". Brain ruptures at this point...
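The point about outcomes being plain numbers is easy to check numerically: equiprobable ±1 outcomes have mean 0 and standard deviation 1, and a raw product-moment correlation is just an average of products, with no normalization tricks. A toy sketch (the perfectly anticorrelated partner list is my own illustrative choice, not anyone's model):

```python
import random

random.seed(0)
n = 100_000
# Alice's outcomes: plain numbers, -1 or +1 with equal probability
A = [random.choice([-1, 1]) for _ in range(n)]
B = [-a for a in A]  # a perfectly anticorrelated partner, for illustration

mean_A = sum(A) / n
var_A = sum((x - mean_A) ** 2 for x in A) / n
corr = sum(a * b for a, b in zip(A, B)) / n  # raw product-moment correlation

print(round(mean_A, 2), round(var_A, 2), corr)  # ≈ 0.0, 1.0, exactly -1.0
```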
 
  • #113


billschnieder said:
Richard Gill's refutation is not a new critique. It is essentially the same as one of the critiques advanced by a certain Florin Moldoveanu in the fall last year to which Joy Christian has already replied (http://arxiv.org/abs/1110.5876). It originates from a misunderstanding of Joy's framework which admittedly is not very easy to understand especially for those who have blinders of one kind or another.

Gill thinks Joy is using a convoluted, more difficult method to do a calculation, and prefers a different method which ultimately leads him to a different result, not realizing/understanding that the calculation method Joy used is demanded by his framework. This is hardly a serious critique, not unlike his failed critique of Hess and Philipp. He should at least have read Joy's response to Moldoveanu, which he apparently did not, since he does not cite or mention it. It's been available since October 2011, one month after Moldoveanu posted his critique.

I remember Florin came here to boast about his critique and I pointed out his misunderstanding at the time in this thread: https://www.physicsforums.com/newreply.php?do=newreply&noquote=1&p=3806400 This is the same error Gill has made. See section (II) of Joy's response to Moldoveanu.

It's true that Moldoveanu had earlier seen the same error, in a sense ... but Joy's definitions have not remained constant over the years, so it's a moot point whether the error in one of the earlier, long accounts is the same error as in Joy's beautiful and simple one-page paper. Florin's focus was not the one-page paper, but the whole corpus of work at that point.

Joy and Bill Schnieder may find it legitimate, when one has freedom to make an arbitrary choice of "handedness", to make different and mutually contradictory choices at different locations in the same computation, but to my mind this is just license to get any result one likes by use of poetry.

Joy's one page paper and my refutation are exercises in simple algebra. I suggest that Bill Schnieder and others work through my algebra and through Joy's algebra, themselves.

The reference to Hess and Philipp is also amusing. Not many people actually read through all the details of Hess and Philipp's "counterexample" to Bell's theorem. Somewhere in the midst of it, a variable which had three indices suddenly only had two. This is where a joint probability distribution is being factored into a marginal and the product of two conditionals. Because of the notational slip-up, the normalization factor was wrong. All rather sad.
 
Last edited by a moderator:
  • #114


DrChinese referred to Jaynes. Jaynes (1989) thought that Bell was incorrectly performing a routine factorization of joint probabilities into marginal and conditional. Apparently Jaynes did not understand that Bell was giving physical reasons (locality, realism) why it was reasonable to argue that two random variables should be conditionally *independent* given a third. When Jaynes presented his resolution of the Bell paradox at a conference, he was stunned when someone else gave a neat little proof using Fourier analysis that the singlet correlations could not be reproduced using a network of classical computers, whose communication possibilities "copy" those of the traditional Bell-CHSH experiments. I have written about this in quant-ph/0301059. Jaynes is reputed to have said "I am going to have to think about this, but I think it is going to take 30 years before we understand Stephen Gull's results, just as it has taken 20 years before we understood Bell's" (the decisive understanding having been contributed by E.T. Jaynes).
 
  • #115


PS, Bill Schnieder thinks that I prefer a different route to get Joy Christian's result because it gives a different answer, but this means he has not read my paper carefully. I discovered a short route, and it appeared to give Joy's answer. I showed this proudly to Joy. He pointed out that I was making a mistake, there was a missing term. I went back and looked more closely at his longer route, and discovered that they both gave the same answer. With the missing term.
 
  • #116


Just curious. Doesn't the new PBR theorem reach the same conclusion as Bell's, making Joy Christian's refutation of Bell's theorem (even if it were conceivable) a moot point, at least with respect to arguing for a local realistic model:
Thus, prior to Bell’s theorem, the only open possibility for a local hidden variable theory was a psi-epistemic theory. Of course, Bell’s theorem rules out all local hidden variable theories, regardless of the status of the quantum state within them. Nevertheless, the PBR result now gives an arguably simpler route to the same conclusion by ruling out psi-epistemic theories, allowing us to infer nonlocality directly from EPR.
Quantum Times Article on the PBR Theorem
http://mattleifer.info/2012/02/26/quantum-times-article-on-the-pbr-theorem/

The quantum state cannot be interpreted statistically
http://lanl.arxiv.org/pdf/1111.3328v1.pdf
 
  • #117


gill1109 said:
DrChinese referred to Jaynes. Jaynes (1989) thought that Bell was incorrectly performing a routine factorization of joint probabilities into marginal and conditional. Apparently Jaynes did not understand that Bell was giving physical reasons (locality, realism) why it was reasonable to argue that two random variables should be conditionally *independent* given a third. When Jaynes presented his resolution of the Bell paradox at a conference, he was stunned when someone else gave a neat little proof using Fourier analysis that the singlet correlations could not be reproduced using a network of classical computers, whose communication possibilities "copy" those of the traditional Bell-CHSH experiments. I have written about this in quant-ph/0301059. Jaynes is reputed to have said "I am going to have to think about this, but I think it is going to take 30 years before we understand Stephen Gull's results, just as it has taken 20 years before we understood Bell's" (the decisive understanding having been contributed by E.T. Jaynes).

Thanks so much for taking time to share this story. For those interested, here is the direct link to your paper:

http://arxiv.org/abs/quant-ph/0301059

I like your example of Luigi and the computers. I would recommend this paper to anyone who is interested in understanding the pros AND cons of various local realistic positions - and this is a pretty strong roundup!
 
  • #118


Thanks, Bohm2 and thanks DrChinese.

Regarding PBR: I have to admit that I haven't got the point of PBR. PBR argue that the quantum state is not statistical, but real. That argument depends on the definitions of those two words, "statistical" and "real". My own opinion about quantum foundations is summarized by the statements that (1) the real world is real, and its past is now fixed, (2) the future of the real world is random, (3) the quantum state is what you need to know about the past in order to determine the probability distribution of the future (so it's just as real as the real world, if you like, since the past real world is real and the probability distribution of the future is real too). This point of view is argued in http://arxiv.org/abs/0905.2723, which is actually just an attempt to explain the ideas which I got from V.P. Belavkin. But you could also say that this is just a rigorous Copenhagen approach in which we don't talk about things which we don't need to, and in which we admit the necessity of defining quantum physics on a platform of naive classical physics.
 
  • #119


gill1109 said:
DrChinese referred to Jaynes. Jaynes (1989) thought that Bell was incorrectly performing a routine factorization of joint probabilities into marginal and conditional. Apparently Jaynes did not understand that Bell was giving physical reasons (locality, realism) why it was reasonable to argue that two random variables should be conditionally *independent* given a third. When Jaynes presented his resolution of the Bell paradox at a conference, he was stunned when someone else gave a neat little proof using Fourier analysis that the singlet correlations could not be reproduced using a network of classical computers, whose communication possibilities "copy" those of the traditional Bell-CHSH experiments. I have written about this in quant-ph/0301059. Jaynes is reputed to have said "I am going to have to think about this, but I think it is going to take 30 years before we understand Stephen Gull's results, just as it has taken 20 years before we understood Bell's" (the decisive understanding having been contributed by E.T. Jaynes).
Thanks for giving your opinion on this matter which happens to be the discussion topic of a parallel thread:
https://www.physicsforums.com/showthread.php?t=581193
I can copy your comment there, but it would be nicer if you would do it yourself. :smile:
 
  • #120


harrylin said:
Thanks for giving your opinion on this matter which happens to be the discussion topic of a parallel thread:
https://www.physicsforums.com/showthread.php?t=581193
I can copy your comment there, but it would be nicer if you would do it yourself. :smile:

I copied my comment + reference over there, which has the effect of including the above.
 
  • #121


DrChinese said:
I copied my comment + reference over there, which has the effect of including the above.
Looking at the time stamp, we had the same idea at the same time. :-p
 
  • #122


Delta Kilo said:
Oh-ho, here we go again. No, Joy, measurement outcomes are not bivectors from the unit sphere, they are the numbers { -1, +1 }. That's how they are defined in Bell's paper and that is also the way they come out of experiments. Their mean is 0 and their standard deviation is 1. Not bivectors, just the numbers -1 and +1.

I can't be bothered anymore, but if you substitute [itex]I[/itex] and [itex]\textbf{a}[/itex] from definitions elsewhere in his paper, you will get [itex]\sigma(\textbf{a})=\sum a_{j}\beta_{j}[/itex] where [itex]a_{j}[/itex] are coefficients of unit vector [itex]\textbf{a}[/itex] and [itex]\beta_{j}[/itex] are "basis bivectors". Brain ruptures at this point...

So that pretty much destroys Joy's response to the argument against his original paper?
 
  • #123


Joy Christian has now responded to Richard Gill's purported refutation:

http://arxiv.org/abs/1203.2529

I identify a number of errors in Richard Gill's purported refutation of my disproof of Bell's theorem. In particular, I point out that his central argument is based, not only on a rather trivial misreading of my counterexample to Bell's theorem, but also on a simple oversight of a freedom of choice in the orientation of a Clifford algebra. What is innovative and original in my counterexample is thus mistaken for an error, at the expense of the professed universality and generality of Bell's theorem.
 
  • #124


Thanks, Bill Schnieder. Joy has changed his postulates to patch the error. The new postulates are mutually contradictory. So first there was a model and a mistake, now there's no mistake but no model either. Vanished in a puff of smoke.
 
  • #125


I posted that paper in this thread above but I gave up trying to understand the debate. It is a very long and not too friendly one, and it can be followed more fully in this FQXi blog:

Disproofs of disproofs of disproofs of disproofs...
http://www.fqxi.org/community/forum/topic/1247
 
  • #126


bohm2 said:
Just curious. Doesn't the new PBR theorem reach the same conclusion as Bell's, making Joy Christian's refutation of Bell's theorem (even if it were conceivable) a moot point, at least with respect to arguing for a local realistic model:

Thus, prior to Bell’s theorem, the only open possibility for a local hidden variable theory was a psi-epistemic theory. Of course, Bell’s theorem rules out all local hidden variable theories, regardless of the status of the quantum state within them. Nevertheless, the PBR result now gives an arguably simpler route to the same conclusion by ruling out psi-epistemic theories, allowing us to infer nonlocality directly from EPR.

Quantum Times Article on the PBR Theorem
http://mattleifer.info/2012/02/26/quantum-times-article-on-the-pbr-theorem/

The quantum state cannot be interpreted statistically
http://lanl.arxiv.org/pdf/1111.3328v1.pdf

PBR places strong constraints on psi-epistemic interpretations rather than ruling them out.
 
Last edited:
  • #127


yoda jedi said:
PBR places strong constraints on psi-epistemic interpretations rather than ruling them out.
My question really wasn't about this point. Joy Christian's preservation of local realism relies on refutation of Bell's. Even if that could be done, my question was whether non-locality can be inferred directly via PBR without Bell's theorem. Matt Leifer in his blog answered in a post:

Question by poster:
Hi Matt, Do you still believe that PBR directly implies non-locality, without Bell’s as I think you argued in a section of Quantum Times article?
“It (PBR) provides a simple proof of many other known theorems, and it supercharges the EPR argument, converting it into a rigorous proof of nonlocality that has the same status as Bell’s theorem. ”
Matt's reply:
Yes, but this requires the factorization assumption used by PBR. At the time of writing, I was hopeful that we could prove the PBR theorem without factorization, but now I know that this is not possible. Therefore, the standard Bell-inequality arguments are still preferable as they involve one less assumption.
Quantum Times Article on the PBR Theorem
http://mattleifer.info/2012/02/26/q...-the-pbr-theorem/comment-page-1/#comment-2877
 
  • #128
Last edited:
  • #129


yoda jedi said:
I understand; in the same manner as your question about "Loophole-free demonstration of nonlocality".

Exactly.
 
  • #130


My 2-form's worth on the subject: it's been well known since the work of Philippe Eberhard and Arthur Fine in the 70s and 80s that a model satisfies Bell's inequalities if and only if it is equivalent to a "local hidden variable theory". The term "local hidden variable theory" has a precise mathematical definition: it doesn't simply mean any model that has quantities unknown to the observer that determine all outcomes; the quantities have to have sufficient structure to allow probabilities to be calculated via the mechanisms of formal probability theory (sigma algebras etc.). Some of the people who have commented in this topic seem to have a vague grasp of this when they describe Joy Christian's example as "unrealistic", but they are missing the point: Fine and Eberhard showed that any such example is "unrealistic" in this sense, while what Christian and others show is that such "unrealistic" behaviour is not really unrealistic, and it manifests in some fairly simple models of EPR experiments, like the one Christian manages to present in a single page.
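One direction of the Fine/Eberhard equivalence is easy to verify by brute force: whenever a genuine joint probability distribution over all four counterfactual outcomes exists, the CHSH expectation cannot exceed 2 in absolute value. A randomized sanity check (my own sketch, not from either author's papers):

```python
import random
from itertools import product

random.seed(1)
# All 16 joint assignments of the four counterfactual outcomes (a, a', b, b')
assignments = list(product([-1, 1], repeat=4))

worst = 0.0
for _ in range(1000):
    # A random joint probability distribution over the 16 assignments
    w = [random.random() for _ in assignments]
    total = sum(w)
    p = [x / total for x in w]
    # CHSH expectation under that joint distribution
    S = sum(pi * (a * b + a * b2 + a2 * b - a2 * b2)
            for pi, (a, a2, b, b2) in zip(p, assignments))
    worst = max(worst, abs(S))

print(worst <= 2.0)  # a joint distribution can never beat the bound
```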
 
  • #131


Unfortunately Christian's single page contains a glaring error in the algebra, as well as being conceptually completely misguided, see http://arxiv.org/abs/1203.1504

On the other hand, no technically advanced parts of formal probability theory are needed to derive Bell inequalities from a local hidden variables model. No sigma algebras or whatever. They follow from absolutely elementary logical reasoning, elementary arithmetic, elementary (counting) probability, see for instance http://arxiv.org/abs/1207.5103
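The elementary counting argument can be made concrete in a few lines: for every ±1 assignment to the four outcomes, the CHSH combination is exactly ±2, so no average over a hidden variable can push its expectation past 2. A brute-force check (my own illustration, not taken from the paper):

```python
from itertools import product

# For every ±1 assignment to the four outcomes a, a', b, b',
# the CHSH combination a*b + a*b' + a'*b - a'*b' is exactly ±2,
# because one of (b + b'), (b - b') is 0 and the other is ±2.
values = {a * b + a * b2 + a2 * b - a2 * b2
          for a, a2, b, b2 in product([-1, 1], repeat=4)}
print(sorted(values))  # [-2, 2]
```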
 
  • #132


I'll take a look at the links you provide. The derivation of Bell's Inequality doesn't need any knowledge of sigma algebras etc., but standard probability structure is implicit when one takes various averages to produce the inequality and assumes them to be meaningful. If you look at Fine's work in the 80s, the key to what's going on is that local hidden variable theories always have well defined joint probability distributions for pairs of variables which really don't exist according to QM. Taking Eberhard's work into consideration, local hidden variable theories have this problem because they implicitly assume counterfactual definiteness.

Although a lot of people have an issue with denying counterfactual definiteness, they shouldn't: it's something that fails even in mundane examples, such as asking what's in your fist when your palm is flat, or what's on your lap when you are standing. ("Fist contents" and "flat palm orientation" are incompatible observables and do not have well defined joint probabilities.) In layman's terms, what's going wrong in hidden variable theories is that they insist you can talk about having something in your fist at the same time as your palm is flat, and as a result they can never produce the correct correlations.

Now maybe Joy Christian's example is erroneous (I will check the refutation), but there have been many others who have come up with correct similar examples - and all they are really doing is constructing something which, loosely speaking, amounts to the "fist-palm" example using some sort of non-distributive lattice or tensor algebra of some kind. The reason why such examples are valuable is that they show people that they have to be careful in jumping to conclusions about non-locality or denial of philosophical realism simply because a standard hidden variable theory cannot produce QM correlations. They also show that there are mundane cases where you can't treat a pair of variables as having a well defined joint probability distribution (or, without going into such detail, where you can't just average stuff and assume the average makes sense).

The main problem with Joy Christian's work is his attitude - the very title "Disproof of Bell's Theorem" is troublesome/cranky because it isn't disproving Bell or finding an error in Bell's reasoning; it's merely trying to come up with an example of something we've basically known about since the 70s.
 
  • #133


At the heart of the debate over Bell, local realism, and such, is an implicit assumption that events originate with emission and end with measurement. More specifically, the applicability of Bell's Theorem to nature, and certain conclusions about CHSH experiments, depend critically on the veracity of the assumption that entanglement involves the coincident emission of two particles that can then be identified, and thus have their entanglement tested, by a coincident detection. This assumption may not be correct.

I've shown that direct particle-to-particle interactions, or relationships, that share information at the speed of light for a non-zero duration are adequate to generate "quantum" results in CHSH experiments.

https://docs.google.com/open?id=0BxBGJRkQXyjweXR2R3ExTlEyNm8
 
Last edited:
  • #135


Ok, I read and thoroughly enjoyed Gill's refutation paper but have still to read Christian's rebuttal. I have to say at this point I don't actually understand Christian's calculations; my initial understanding of the notation, which seemed to confirm his result, appears to be wrong. (Another regret I'm having: back in the day I remember thinking "quaternions? I'll never need this, I can always look them up one day if I do" ... well, the day finally came :D )

Regarding the concluding discussion in Gill's paper, "There is no limitation in Bell's theorem on the space in which the hidden variables live." Well, yes, true, as long as we recognize the subsequent bit about realism and locality, in particular realism. Gill says: ""Realism", however it is defined, comes down, effectively, to the mathematical existence of the outcomes of unperformed experiments, alongside of those which were actually performed." Well, local models that do violate Bell's Inequality work precisely by encoding the lack of well defined outcomes of unperformed experiments, thus preventing the type of averaging or counting that is needed in the derivation of Bell's Theorem. I initially assumed that Christian was trying the same sort of thing, but the algebra was over my head.

What for me is an important point is that such models are not really "non-realist" in the true philosophical sense. Although Bell's "realism" seems sensible on the surface, it isn't sensible at all, because it amounts to saying that we can talk sensibly about a particle being in a position eigenstate which we didn't measure and don't know, at the same time that we did measure and do know that it is in a particular momentum eigenstate - and that is just plain nonsense. That it is nonsense is just a consequence of how Fourier transforms work (or more generally how change of eigenbases works), and there is nothing philosophically non-realist about it - so it is very unfortunate that Bell and others dubbed this "realism". The failure of this kind of realism is no worse than failing to have a lap when standing up; I'm not a spooky subjective entity because my lap disappeared when I stood up, but the fact that Bell and others seem to imply that it is (or alternatively try to imply that faster-than-light signalling exists) is a bit of crankiness on their part, and it's precisely what sets off the anti-Bell cranks.
 
  • #136


BTW, by the sort of stuff I'm talking about when I speak of local models that violate Bell's Theorem, I mean the sort of things that Rovelli, Omnes, Hartle etc. have come up with. They all reject non-local communication but have models consistent with ordinary QM correlations, not with Bell's inequalities, and they manage this by failing to be "realist" in Bell's narrow sense while nevertheless still being "realist" in a philosophical sense.
 
  • #137


Mathematech said:
BTW, by the sort of stuff I'm talking about when I speak of local models that violate Bell's Theorem, I mean the sort of things that Rovelli, Omnes, Hartle etc. have come up with. They all reject non-local communication but have models consistent with ordinary QM correlations, not with Bell's inequalities, and they manage this by failing to be "realist" in Bell's narrow sense while nevertheless still being "realist" in a philosophical sense.

Thanks for this! Is there a good source for a precise definition of Bell's meaning of realism, as well as "consensus" definitions?
 
  • #138


Off hand I can't think of any good sources, although I have read many confusing ones. When it comes to Bell's Theorem, "realism" means counterfactual definiteness. Counterfactual definiteness is typically poorly explained in texts, often with some statement along the lines that if a different experiment had been performed (e.g. position measured instead of momentum) then it would have produced a definite result. Now that isn't quite what counterfactual definiteness is about, as all interpretations of QM agree that if you do an experiment you get a definite result; even raw Copenhagen says that. What counterfactual definiteness really says is that such a counterfactual outcome is still statistically meaningful at a time when I have performed a different experiment that is quantum mechanically incompatible.

To give a real world non-QM analogy, assuming counterfactual definiteness in a QM situation amounts to counting how many times your fist contained some unspecified item, according to a guess, without having checked it but instead having seen to the contrary that your palm was open and that your hand wasn't even clenched in a fist at all - such counts are obviously meaningless nonsense. Similarly, refinements of Copenhagen QM avoid Bell's Theorem without the need for non-locality by considering the calculations in the derivation of Bell's Theorem to be meaningless sums.
 
  • #139


The point is that counterfactual definiteness was never a problem in physics till QM came along. Secondly, the perfect anticorrelations predicted by the singlet state make it a very natural assumption (when you measure the two particles in the same way you get equal and opposite results - hard to imagine except by supposing the different measurement outcomes for different settings are already "fixed" for the two particles at the source). (cf EPR argument for "elements of reality").

Finally, the fact remains that it is impossible to generate violations of Bell inequalities in a rigorously regulated experiment by non-quantum means. (Rigorously regulated means: no post-selection, no non-detections; random settings; proper space-time separation, so that Alice's measurement is finished before Bob's setting could become available and vice versa.) When the definitive experiment is done in a year or two (several experimental groups are getting very close) we'll know for sure that nature - quantum reality - is non-classical. Nature is not deterministic but irreducibly stochastic.
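The perfect anticorrelations mentioned above are a one-line consequence of the singlet prediction E(a,b) = -cos(a-b) (spin-1/2 convention; for photons the argument is doubled): the probability that the two outcomes agree is (1+E)/2 = sin²(θ/2), which vanishes at equal settings. A small sketch (my own illustration):

```python
import math

def p_same(theta):
    # Probability that the two singlet outcomes agree when the
    # analyzers are an angle theta apart: (1 + E)/2 with E = -cos(theta)
    return (1 - math.cos(theta)) / 2  # = sin^2(theta/2)

print(p_same(0.0))                    # 0.0: equal settings, always opposite
print(round(p_same(math.pi / 2), 2))  # 0.5: orthogonal settings, a coin flip
```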
 
  • #140


Mathematech said:
Although a lot of people have an issue with denying counterfactual definiteness, they shouldn't its something that fails even in mundane examples such as asking what's in your fist when your palm is flat or what's on your lap when you are standing. ("Fist contents" and "flat palm orienation" are incompatible observables and do not have well defined joint probabilities)
Joint probabilities are irrelevant: "fist contents", as you've defined it, is ill defined all on its own.
 
