Local realism ruled out? (was: Photon entanglement and )

In summary, the conversation discussed the possibility of starting a new thread on a physics forum to discuss evidence for a specific perspective. The topic of the thread was related to the Bell theorem and its potential flaws on both theoretical and experimental levels. The original poster mentioned that their previous posts on this topic had been criticized, but their factual basis had not been challenged until recently. They also noted that the measurement problem in quantum mechanics is a well-known issue and cited a paper that they believed supports the idea that local realism has not been ruled out by existing experiments. The other participant in the conversation disagreed, stating that the paper did not rule out local realism, and provided additional quotes from experts in the field. Ultimately, the conversation concluded with both parties holding differing views.
  • #771
jambaugh said:
...but rather a framework of material phenomena = reality
......:zzz:
 
  • #772
Ilja said:
I disagree. Philosophy can, without any problem, go back to its state of 1900. At that time, no philosopher had a problem with the speed of Newtonian gravity being greater than c.
There is a colossal difference between "greater than c" (which conforms with "LR") and infinite speed (which does not conform with "LR"). I searched but could not find a claim by Newton that the speed of gravity is infinite (I can imagine that he overlooked it, or that he assumed it to be so great as not to matter), and I don't know of any philosopher of around 1900 who did think that such a proposition is no problem. Can you cite one? (I only ask for one, no need for all).
It is the naming convention "local realism" which is highly misleading here. It strongly suggests that one should give up realism. Then, the second alternative, named "local", also sounds as if something which was always assumed to be true has to be given up [..]
I find Einstein's own formulation, "spooky action at a distance", much better: for it relates to the implausible infinite speed of QM as well as to the implausible infinite and unfailing working range of QM, independent of distance. Can De Broglie's model account for that??
akhmeteli said:
[..] My understanding is that there is no such proof yet, as there is no evidence of faster-than-light communication or loophole-free evidence of violations of the Bell inequalities. [..].
Yes, I agree. All experiments that I looked into so far may be understood to work by exploiting loopholes (or even big holes). By now I regard a "loophole free" experiment in the same way as an experiment that "breaks the PoR" - I won't wait for it.
jtbell said:
Uh... Ilja, did you notice the dates on the posts that you replied to?
It remains a hot topic; but perhaps this thread is getting too long... Need for an index! :biggrin:
 
Last edited:
  • #773
audioloop said:
......:zzz:

Just because you can use (at the classical scale) a reality model as your framework of material phenomena doesn't mean it is the best/only framework.

Phenomena = acts and actions (including e.g. observations, measurements, dynamic evolutions, system projection/preparation, interaction, et al.). Only when the algebra of actions is categorical is an object model appropriate.

The point to understand is that there is a class of languages broader than nominative languages. You can use Whitehead's nomenclature of object language and process language, but I prefer to say action language. We use an object language classically but we can recognize quantum mechanics as an action language. The semantic atoms are acts rather than objects.

Now there is a clear means of expressing objects in an action language by means of identities (in the sense of category theory). But to express actions in an object language you must restrict yourself to actions mapping object to object. This is a problem (as with Zeno's paradox). Action languages are a proper superset of object languages. There are more meaningful statements you can make in an action language (at the same level of abstraction) than you can make in an object language. (One can always construct an object language one higher level of abstraction up, objectifying the reference to acts, e.g. "running" as a noun. Note that this is what happens when we reify the wave function and it is a form of Platonism.)

What this boils down to (and where it is relevant to this topic) is that recognizing quantum mechanics as an action language we need no further interpretation when we interpret the symbolism in terms of phenomenological acts. "prepare system, dynamically evolve, selectively detect, measure", "compose actions" <--> "bra" "U" "ket", "X", "AB=C".
It is the person trapped in object language thinking that must insist that these actions only terminate in objective states.

The "reality" in QM is relativized. You can work in a perfectly valid classical logic of states if you restrict your actions to only include a commuting subset of observables. Changing to a distinct subset of observables defines an alternative transformed "reality frame", just as you transform between moving observers and get a different "now" frame in SR. Just as with SR the transition in thinking from absolute to relative is difficult and many become intransigent insisting it is just wrong because they can't let go of their intuition of absolutes.

You see this in the various "disproofs of Einstein" which demonstrate that we can get Einstein's predictions by overlaying a whole set of unobservable structure, namely the aether.
Einstein rejected the aether because the predictive theory showed it to be fundamentally unobservable.

Reality qua reality is fundamentally unobservable. We observe through acts of measurement not immediate intimate clairvoyance into the state of reality as it is. This barrier is the same as the barrier in SR of observing distant events only through propagating intermediate causal phenomena (light or gravity or bouncing balls or something). In imagining a bird's-eye view of distant phenomena we imagine and develop an intuition of absolute simultaneity when we must deduce simultaneity from the phenomena of causal signals. Similarly we must deduce states of reality (at the classical scale) from the phenomena of measurements we make. When we push it to the extreme, observables significantly fail to commute, just as in SR the boosts fail to commute and we can no longer think of velocities as additive (+ being a commuting product).

When in an action language actions fail to commute we lose the presumption that the quantities they change add. In QM it is the probabilities that cease to be an additive measure over a state space. You get Bell inequality violation which can only be resolved satisfactorily by rejecting the underlying absolute structure. With SR it was the fiber bundle of spatial universe fibers over the base of times (the spatial universes being the set of simultaneous events in the universe), and we reject the underlying absolute structure of universal time. In QM you lose the fiber-bundle structure of logical values over the base of possible states of reality. In both cases you get a relativity principle and a unified composite structure. In SR it is unified space-time and in QM it is a unified manifold of possibilities. In SR we see mixing of space and time as we change frames and in QM we see the mixing of "reality" and "logic" that becomes probability and is thought of as quantum uncertainty. It is a mixing of "reality" and "possibility" which manifests as probability.

What you must then learn to deal with is a limited "relative reality" and a whole lot more which doesn't fit into any one reality frame, just as in SR you have to deal with relative simultaneity and a whole lot of events that are neither unambiguously in the past nor in the future of a given event. In SR one ultimately stops thinking of "simultaneous" as meaningful and rather speaks of space-time separation (time-like, space-like, and light-like). In QM one should ultimately stop thinking of "real" as meaningful and rather speak of forbidden transitions, assured transitions and the in-between probabilistic transitions. Those transitions are not transitions between states of reality but transitions between equivalence classes of acts of determination.

OK, got on a tear. My overall theme is that one is in error holding onto the old ontological language of classical science. It is obstructive (not just in understanding QM). I assert the same error is made in social sciences where the subject under study is incorrectly objectified. One assumes people have "states of mind" and sees the explosion of diagnosed syndromes in psychology and psychiatry. One tries to act on "the economy" as if one need only move the observables around to match their values when it was "in a good state".

These are systems which, like physical systems in QM, cannot be reduced to objective states, because the nature of the acts of observation is, either intrinsically or as a matter of pragmatic necessity, intimate. The act of measurement is an interaction. The act of observing affects the observed as well as the observer.
 
  • #774
jambaugh said:
If you look at the unitary evolution of a composite system wherein two components interact in such a way as to become correlated (entangled), and then consider the partial trace over one component system, the density operator for the other system will appear to have evolved non-unitarily. Indeed it will have shown an entropy change. The whole system evolved unitarily and yet when you consider only part of the system you must use a non-unitary description of the part. This is by no means a "contradiction" or a mystery. The act of observing a system is an act of interacting with the system via an episystemic element not represented in the dynamics; the observer mechanism is by its nature correlated with the system (the physical record of the measurement is correlated (entangled) with the system itself). What is more, the nature of measurement is thermodynamic: there is of necessity entanglement of other variables with a heat dump.

There is no "gotcha!" contradiction in the disparate descriptions of unitary evolution between measurements and non-unitary description of measurements.
It looks like your logic can be used equally well to prove that there is no contradiction between classical mechanics and thermodynamics. Nevertheless, classical mechanics is reversible, and thermodynamics is not. You may say that this contradiction is not practically important, but this is still a contradiction. In the same way, unitary evolution cannot produce irreversibility or turn a pure state into a mixture, and the projection postulate does just that. A contradiction is a contradiction.
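For concreteness, here is a minimal worked illustration of the reduced-density-operator point being discussed (my own choice of example, a two-qubit singlet, not taken from the post above):

$$|\Psi\rangle = \tfrac{1}{\sqrt{2}}\bigl(|0\rangle_A|1\rangle_B - |1\rangle_A|0\rangle_B\bigr), \qquad \rho_A = \mathrm{Tr}_B\,|\Psi\rangle\langle\Psi| = \tfrac{1}{2}\,\mathbb{1}.$$

The joint state is pure and evolves unitarily (its entropy stays zero), yet subsystem A on its own is maximally mixed, with von Neumann entropy $S(\rho_A)=\ln 2$; once A also interacts with an unmodelled apparatus or environment, $\rho_A$ generally evolves non-unitarily and its entropy can change.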
jambaugh said:
As far as rejecting local realism is concerned, there is a tendency to assume rejecting "reality" is somehow a great leap when it is in fact the acceptance of reality which is the leap, the extraordinary assertion requiring extraordinary proof. Specifically, the acceptance of the assumption of an objective state of reality.

The alternative is not a nihilistic unreality but rather a framework of material phenomena which does not require the platonic idealism of a universe of objects. Things actually happen whether we are here to see them or not. Since we are here we describe them as phenomena, formulate a theory of cause and effect, utilize probabilistic descriptions of outcomes given the impossibility of our omniscience, and update our descriptions (discontinuously) when we update our knowledge (discontinuously) through observation...
and when phenomena occur in sufficiently large aggregate form we can build a functionally useful symbolic model we call objective reality. Indeed our brains evolved the means to imagine a state of reality because it was functionally useful for day to day interactions at our scale of experience. But it is hubris in the extreme to insist that the actuality around us must conform in its fundamental nature to the objective reality we model in our minds.

Start with the definition of science as an epistemological discipline and you will see what is fundamental, the empirical observation not the objective state. Now formulate a theory of nature based on this fundamental action, what we may observe and how observations correlate. Maximize our ability to represent knowledge, even partial knowledge by expressing probabilistic correlations between outcomes of measurements. Quantum mechanics is such a theory. There is no contradiction nor ambiguity of meaning in this framework.

It is only when one insists that we can go beyond science and give meaning to objective models beyond their link to scientific knowledge in the form of successful prediction that one finds contradiction. There is a fork in the road, one way leads to classical reality with its infinite precision and the other way leads to more accurate predictions of quantum actuality. Pick your path but do not insist that the roads must meet up further down the line.

The format of a theory which best fits observed phenomena and best holds to the principles of science is one of local actuality, QM with Copenhagen ["lack of further ontological"] Interpretation. Reality be damned!
I try to avoid discussing philosophy here – first, it requires a lot of time, second, it is not very appropriate here. Let me just note that this thread is not about local realism (LR) being correct or wrong, it’s about LR being or not being ruled out. I am not trying to convince anybody that there is reality out there. I am trying to argue that LR is logically and experimentally possible right now, however prevalent the opposite opinion is.
 
  • #775
Ilja said:
That's about a completely different issue - it is the "without a mediation" which is problematic there, not the speed of the mediation.
I tend to agree with harrylin’s comment (post 777 in this thread). In my book, mediation with infinite speed is no mediation.

Ilja said:
If you want to wait for loophole-free experimental tests of Bell inequalities - fine, I have no problem with this. In this point, I'm in agreement with the mainstream and don't wait myself for them, but that's clearly a personal choice.
If you don’t “want to wait for loophole-free experimental tests of Bell inequalities - fine, I have no problem with this.”:-)
Ilja said:
So the only thing worth discussing is the consequences of violations of Bell inequalities. If you agree that in this case Einstein causality has to be given up, fine. If not, explain.
If loophole-free evidence of violations of the Bell inequalities appears tomorrow, I’ll certainly have to reconsider the entire situation. As I wrote in this thread though, I cannot promise that I won’t choose superdeterminism, for example:-). But I do think it is a bit early to speculate “what if”. We exist here and now, and I discuss the current situation in this thread. Mind you, I am not asking you about the consequences of possible absence of violations in loophole-free experiments:-)

Ilja said:
I think the majority does not like it because it destroys Lorentz covariance, which is highly valued by the majority. If one accepts that it is dead on the fundamental level, and survives only for observables, if Bell's inequalities are violated, then the situation looks quite different. Then the most non-beautiful element seems to be that the whole wave function is part of the ontology.
In my book, there is no positive evidence of violations. Let me note however that the majority both believes in violations and does not like de Broglie – Bohm interpretation (dBB), and this does not bode well for universal acceptance of dBB:-) But again, while I am not enthusiastic about dBB, I am not its enemy either. Furthermore, my results have applications to dBB.
 
  • #776
bohm2 said:
I'm sort of confused. Irrespective of Bell's, doesn't PBR rule out any Einstein-type psi-epistemic model? The PBR theorem does make a few assumptions (e.g. no superdeterminism, etc.). So what, specifically, makes you disagree with Leifer's point below regarding the implications of PBR? Do you think this PBR no-go theorem also has some unreasonable assumptions?

PBR, EPR, and all that jazz
http://www.aps.org/units/gqi/newsletters/upload/vol6num3.pdf

I am not sure but I suspect we have two different things in mind. Einstein causality means the speed of causal influence is limited. On the other hand, there is also EPR's reasoning, which I do not accept - for example, I don't question the uncertainty principle and am not enthusiastic about noncontextual hidden variables. One more thing. Let us consider the EPR-Bohm experiment. If initially we have two particles in a singlet state and then the spin projection of one of the particles is measured to be +1, the projection postulate demands that the two particles be in such a state that the spin projection of the first particle is +1, but that contradicts unitary evolution, so I have to reject either unitary evolution or the projection postulate as a precise law. My choice is to reject the projection postulate as a precise law.
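To spell out the example (a minimal sketch in standard notation, nothing beyond textbook quantum mechanics):

$$|\psi_{\rm singlet}\rangle = \tfrac{1}{\sqrt{2}}\bigl(|{+}\rangle_1|{-}\rangle_2 - |{-}\rangle_1|{+}\rangle_2\bigr).$$

If the spin projection of particle 1 is measured to be +1, the projection postulate replaces this state by the product state $|{+}\rangle_1|{-}\rangle_2$. Unitary evolution of the two particles together with the apparatus, by contrast, gives a superposition of the form $\tfrac{1}{\sqrt{2}}\bigl(|{+}\rangle_1|{-}\rangle_2|A_+\rangle - |{-}\rangle_1|{+}\rangle_2|A_-\rangle\bigr)$, where $|A_\pm\rangle$ are the corresponding apparatus states; it never singles out one branch by itself.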

If the above does not answer your question, let me know, and sorry about that - I did not try to study PBR in detail as I don't mind their conclusion:-)
 
  • #777
harrylin said:
Yes, I agree. All experiments that I looked into so far may be understood to work by exploiting loopholes (or even big holes). By now I regard a "loophole free" experiment in the same way as an experiment that "breaks the PoR" - I won't wait for it.

Thank you. Let me just note that it is a mainstream opinion that there has been no loophole-free experimental evidence of violations so far. I gave quotes confirming that in this and other threads.
 
  • #778
harrylin said:
There is a colossal difference between "greater than c" (which conforms with "LR") and infinite speed (which does not conform with "LR").
I agree, and thank you for the evidence that the naming "local realism" is misleading. What can be proven by loophole-free experiments is always only "greater than v", with some velocity v, which may be much greater than c, but not infinite. But, since there are no interesting known theories with critical velocities v>c, the only interesting point is that such observations can rule out "greater than c". A maximal speed v < ∞ can never be ruled out by observation. Thus, your notion of LR is simply a wrong one - not the one used by the mainstream. But I agree that it is suggested by naming it local realism instead of Einstein-causal realism.

I searched but could not find a claim by Newton that the speed of gravity is infinite
I didn't even search, the point is that it is obvious. The gravitational force is defined by the positions of all other masses at the same moment of time.

I find Einstein's own formulation, "spooky action at a distance", much better: for it relates to the implausible infinite speed of QM as well as to the implausible infinite and unfailing working range of QM, independent of distance. Can De Broglie's model account for that??
It contains it. As in Newtonian theory, the speed of a particle depends on the positions of all other particles of the universe.

jambaugh said:
Just because you can use (at the classical scale) a reality model as your framework of material phenomena doesn't mean it is the best/only framework.
Yes, but it is a very simple one, so Ockham's razor is waiting for the alternatives. And you should obtain this simple model in the classical limit (a criterion which rules out MWI and similar fantasies if taken seriously).

The point to understand is that there is a class of languages broader than nominative languages. You can use Whitehead's nomenclature of object language and process language, but I prefer to say action language. We use an object language classically but we can recognize quantum mechanics as an action language. The semantic atoms are acts rather than objects.
I would suggest naming this the "Orwellian interpretation" - changing the language so that one can no longer talk about reality :smile: Ok, you argue that it is a superset, that one can formally construct an object language on top. But looking at your example, "running" as a noun, it is either artificial or trivial: I use "existing" as a noun.

What this boils down to (and where it is relevant to this topic) is that recognizing quantum mechanics as an action language we need no further interpretation when we interpret the symbolism in terms of phenomenological acts. "prepare system, dynamically evolve, selectively detect, measure", "compose actions" <--> "bra" "U" "ket", "X", "AB=C".
It is the person trapped in object language thinking that must insist that these actions only terminate in objective states.
Sorry, but I'm not trapped at all. I recognize very well that the minimal interpretation of QM is a useful instrumental device. Realism is, in my opinion, a restriction for physical theories, a restriction which, like any other restriction (existence of a Lagrange formalism, conservation laws, and so on), is useful because it adds predictive power.

That it really adds predictive power we have seen: realism + Lorentz invariance gives the Bell inequalities; without realism it does not.

The "reality" in QM is relativized. You can work in a perfectly valid classical logic of states if you restrict your actions to only include a commuting subset of observables. Changing to a distinct subset of observables defines an alternative transformed "reality frame", just as you transform between moving observers and get a different "now" frame in SR. Just as with SR the transition in thinking from absolute to relative is difficult and many become intransigent insisting it is just wrong because they can't let go of their intuition of absolutes.
Again, reality is relativized only in particular interpretations of QM. In dBB it exists absolutely in full beauty.

The preference for relativised approaches in modern physics is a nasty side effect of positivism. Since we cannot observe the absolutes, but observe only the relative things, positivists prefer relativistic theories, even if they usually are much more complicated.

Here we have to use Ockham's razor. The cost for relativizing absolute simultaneity is to introduce a whole spacetime into existence, where in classical theory future and past "exist" only in a much weaker sense, and it is sufficient to consider only what exists now as really existing.

In gauge theory the relativised existence becomes even more problematic, and you have to use factor spaces or artificial loop spaces already for the appropriate definition, where in the absolute version a simple gauge potential is sufficient.

And in gravity the problem of the relative approach becomes so complex that there is not even a quantum theory for it.

You see this in the various "disproofs of Einstein" which demonstrate that we can get Einstein's predictions by overlaying a whole set of unobservable structure, namely the aether.
Einstein rejected the aether because the predictive theory showed it to be fundamentally unobservable.
The "disproofs of Einstein" show something different: Uneducated people with normal intuitions consider the Lorentz ether as being much more reasonable. Being uneducated, they have only an intuitive feeling that the justification for the spacetime interpretation is wrong, and their arguments against it, partially provoced by the standard argumentation ("the preferred frame is unobservable", as if this really matters) appear nonsensical.

Reality qua reality is fundamentally unobservable. We observe through acts of measurement not immediate intimate clairvoyance into the state of reality as it is.
Correct. And this is, and should be, adequately described by the language. So the language has to represent reality as the fundamental hypothesis, and the actions (of measurement and so on) as derived, usually in a quite complex way, from these fundamentals.

This is how a realistic theory is presented. We start with defining what "is", what "exists". Only after this, we define the basic actions of these entities - how they move, how they interact. And only in the last step do we care about the much more complex observations. And, what is essential, there is no step where we have to "derive" something from observations - the basic nonsense of positivism.

In imagining a bird's-eye view of distant phenomena we imagine and develop an intuition of absolute simultaneity when we must deduce simultaneity from the phenomena of causal signals.
I wrote it down, and in the next line you present an example. :smile: No, there is no necessity at all for deduction of scientific theories. That's positivism. Scientific theories are hypotheses, they are not and cannot be deduced from observations.

When in an action language actions fail to commute we lose the presumption that the quantities they change add.
I would say we lose a very reasonable criterion for distinguishing scientific theories from nonsense like astrology. What is the main difference between science and astrology? Ok, positivists tend to tell us that it is that the predictions of science are really accurate, but the predictions made by astrology are not. My grandmother possibly disagrees, and knows a lot of examples where astrological predictions have been successful. And I think we all remember a lot of examples where scientific statistics have miserably failed.

I nonetheless agree that, in the long run, it is decisive that scientists are able to present the better predictions. But I also think there is a reason for this - and this reason is the other, far more obvious difference between science and astrology. Science presents theories which not only predict, but also explain why.

You get Bell inequality violation which can only be resolved satisfactorily by rejecting the underlying absolute structure.
Feel free to prefer such mystical interpretations of QM, but the "only" is wrong - it is your free decision to reject the straightforward realistic interpretation of QM - dBB.

BTW, about the analogy between relativism vs. realism in SR vs. the Lorentz ether and Copenhagen vs. dBB, we are in agreement. I would simply summarize this in another way: it is the same error of positivistic relativization.

What you must then learn to deal with is a limited "relative reality" and a whole lot more which doesn't fit into any one reality frame, just as in SR you have to deal with relative simultaneity and a whole lot of events that are neither unambiguously in the past nor in the future of a given event. In SR one ultimately stops thinking of "simultaneous" as meaningful and rather speaks of space-time separation (time-like, space-like, and light-like). In QM one should ultimately stop thinking of "real" as meaningful and rather speak of forbidden transitions, assured transitions and the in-between probabilistic transitions.
"Stop thinking" - the Orwellian approach to modern science. SCNR.



akhmeteli said:
If loophole-free evidence of violations of the Bell inequalities appears tomorrow, I’ll certainly have to reconsider the entire situation. As I wrote in this thread though, I cannot promise that I won’t choose superdeterminism, for example:-).
Superdeterminism is a stupid choice. If you took it seriously, you could even reject a working FTL phone line as evidence against Einstein causality. With superdeterminism, no falsification of Einstein causality is possible.

In other words, if you would accept an FTL phone line between Earth and Mars as a falsification of Einstein causality, you should reject superdeterminism.

But I do think it is a bit early to speculate “what if”. We exist here and now, and I discuss the current situation in this thread. Mind you, I am not asking you about the consequences of possible absence of violations in loophole-free experiments:-)
I don't think it is early. Last but not least, the only theory we have in the quantum domain predicts it. I would be happy if we found evidence violating quantum theory; that would open the door to subquantum theory. But there is not even a reasonable candidate for a general subquantum theory which would not predict a violation of Bell's inequalities but would still predict the observed outcomes of all the experiments which have already been done.

Let me note however that the majority both believes in violations and does not like de Broglie – Bohm interpretation (dBB), and this does not bode well for universal acceptance of dBB:-)
That's indeed the great mystery of modern science.

At the current moment, I have only sociological explanations for this: Special relativity can be understood already in school, and those who start studying physics are, therefore, people fascinated by it. What they learn in the university does not change this. So relativity is already, from a sociological point of view, a belief as deep as religious beliefs, and experience tells us that people who change their religious beliefs are only rare exceptions.

One would hope, of course, that the situation in science is different. But there is yet another sociological problem with modern science - that it no longer supports independent thinking sociologically. In the past, scientists were university teachers, and even without any scientific success they had a safe job as a teacher. Today they have short-term jobs connected with grants, and have to search for a new job every two to three years, and their chances of finding such a job depend on their accepted publications in mainstream journals. From a sociological point of view I would name this an extreme mainstream dependence.

Sorry for the off-topic, SCNR.
 
  • #779
Ilja said:
Superdeterminism is a stupid choice. If you took it seriously, you could even reject a working FTL phone line as evidence against Einstein causality. With superdeterminism, no falsification of Einstein causality is possible.

In other words, if you would accept an FTL phone line between Earth and Mars as a falsification of Einstein causality, you should reject superdeterminism.
I don’t need to defend superdeterminism. I just tried to give an honest answer to your question.

Ilja said:
I don't think it is early. Last but not least, the only theory we have in the quantum domain predicts it. I would be happy if we found evidence violating quantum theory; that would open the door to subquantum theory. But there is not even a reasonable candidate for a general subquantum theory which would not predict a violation of Bell's inequalities but would still predict the observed outcomes of all the experiments which have already been done.

I am not sure standard quantum theory truly predicts violations, as in order to “predict” them, it uses its mutually contradictory components – unitary evolution and the theory of measurement (e.g., the projection postulate). That’s not what I call a “prediction”. I’d say this is another reason it may be too early to discuss the consequences of loophole-free violations – I think standard quantum theory should sort out its own problems first. Let me also note that the LR theories of my work reproduce the unitary evolution of quantum field theories.

Ilja said:
That's indeed the great mystery of modern science.

At the current moment, I have only sociological explanations for this: Special relativity can be understood already in school, and those who start studying physics are, therefore, people fascinated by it. What they learn in the university does not change this. So relativity is already, from a sociological point of view, a belief as deep as religious beliefs, and experience tells us that people who change their religious beliefs are only rare exceptions.

One would hope, of course, that the situation in science is different. But there is yet another sociological problem with modern science - that it no longer supports independent thinking sociologically. In the past, scientists were university teachers, and even without any scientific success they had a safe job as a teacher. Today they have short-term jobs connected with grants, and have to search for a new job every two to three years, and their chances of finding such a job depend on their accepted publications in mainstream journals. From a sociological point of view I would name this an extreme mainstream dependence.

One can always find some sociological explanations, but we are still left with the fact that the majority does not think dBB is as “nice” as you think. Your conclusion seems to be that we need a better majority; I suspect we need a better dBB as well.
 
  • #780
jambaugh said:
[..] We observe through acts of measurement not immediate intimate clairvoyance into the state of reality as it is. This barrier is the same as the barrier in SR of observing distant events only through propagating intermediate causal phenomena (light or gravity or bouncing balls or something). In imagining a bird's-eye view of distant phenomena we imagine and develop an intuition of absolute simultaneity when we must deduce simultaneity from the phenomena of causal signals. Similarly we must deduce states of reality (at the classical scale) from the phenomena of measurements we make. When we push it to the extreme, observables significantly fail to commute, just as in SR the boosts fail to commute and we can no longer think of velocities as additive (+ being a commuting product).
That has been discussed several times in the relativity forum. What you probably meant was that we may no longer confound velocity transformations with velocity differences. SR may seem like magic for some (as it used to for me), but in fact it contains no magic. QM still seems like magic to me, but if you can propose a non-magical interpretation of the related phenomena then I'm all ears. :smile:
When in an action language actions fail to commute we lose the presumption that the quantities they change add. In QM it is the probabilities that cease to be an additive measure over a state space. You get Bell inequality violation which can only be resolved satisfactorily by rejecting the underlying absolute structure. [..] In QM you lose the fiber-bundle structure of logical values over the base of possible states of reality. In both cases you get a relativity principle and a unified composite structure. In [..] QM it is a unified manifold of possibilities. In SR we see mixing of space and time as we change frames and in QM we see the mixing of "reality" and "logic" that becomes probability and is thought of as quantum uncertainty. It is a mixing of "reality" and "possibility" which manifests as probability. [..]
The interpretation of mixing up conceptually different things has no appeal to me - that is just the kind of "magic" that I reject. For SR phenomena it is not necessary and I expect that it will also be found unnecessary for QM phenomena. At least it has not been experimentally disproved (insofar as I understand the experiments!).
in QM [..] the nature of the acts of observation is, either intrinsically or as a matter of pragmatic necessity, intimate. The act of measurement is an interaction. The act of observing affects the observed as well as the observer.
Surely everyone agrees on that!
 
  • #781
Ilja said:
[..] I didn't even search, the point is that it is obvious. [..]
The contrary is for me obvious, because I deem Newton as having been sound of mind, based on what he did express; and since neither of us has proof of what Newton really thought on this matter, we only have our personal estimations about his thinking on this. :wink:
[De Broglie's theory] contains [the "implausible infinite and unfailing working range of QM, independent of distance"]. As in Newtonian theory, the speed of a particle depends on the positions of all other particles of the universe.
I'm afraid that you did not understand my question which is not about speed. The usual discussions are only about half of the "spookiness". How does De Broglie's theory explain that an action on a particle at one end of the universe can have an undiminished effect on another particle at the other end of the universe? What physical mechanism did he propose for that? Note that if it requires a long answer, I'll start it as a new topic. :-p
[..] I would suggest naming this the "Orwellian interpretation" - changing the language so that one can no longer talk about reality :smile: [..]
I just (finally) read 1984 - and it is even more applicable to some of these discussions than I had imagined before I read it. Indeed, there is too much Newspeak going on.
 
  • #782
akhmeteli said:
I am not sure standard quantum theory truly predicts violations, as in order to “predict” them, it uses its mutually contradictory components – unitary evolution and the theory of measurement (e.g., the projection postulate). That’s not what I call a “prediction”.
I'm sure, because I use the dBB interpretation, and in the dBB interpretation there is no such contradiction.

The collapse of the wave function in dBB is described by the unitary evolution of the wave function of the object itself together with the apparatus, and the evolution of the object and the apparatus themselves (by the guiding equation). One can combine the full wave function ψ_full(o,a,t) with the trajectory of the apparatus a(t) to define an effective wave function of the object ψ_o(o,t) = ψ_full(o,a(t),t). The evolution equation for this effective wave function is, during the measurement, not unitary, because unitary evolution holds only for closed systems, or systems which at least are not interacting with their environment. Before and after the measurement, that is, when there is no longer any interaction of o with anything else, it is unitary. This easily follows from the unitary evolution of the full system.
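A minimal sketch of how this works (the product decomposition below is my illustrative assumption for a measurement-type interaction): after the interaction the full wave function takes the form

$$\psi_{\rm full}(o,a,t) = \sum_i c_i\,\phi_i(o,t)\,\chi_i(a,t),$$

where the apparatus packets $\chi_i$ have macroscopically disjoint supports. The actual apparatus configuration a(t) lies in the support of only one of them, say $\chi_k$, so the effective wave function

$$\psi_o(o,t) = \psi_{\rm full}(o,a(t),t) \propto \phi_k(o,t)$$

looks exactly like a collapsed wave function, even though the full wave function never collapsed and always evolved unitarily.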

One can always find some sociological explanations, but we are still left with the fact that the majority does not think dBB is as “nice” as you think. Your conclusion seems to be that we need a better majority; I suspect we need a better dBB as well.
There is room for improvement in the presentation of dBB - it is quite typical to use many particles, while, in the light of QFT, it would be much more reasonable to use a general configuration space, which can just as well be a field.

But the main reason for not liking dBB is obvious - it is the strong belief in fundamental relativity. And here improvements are impossible - any realistic interpretation of QM has to violate fundamental relativity.

This is not a problem of physics - effective relativity is not a problem at all for dBB; the first model for the EM field was part of the first paper by Bohm. It is a problem of philosophy - the belief in fundamental relativity, or the spacetime interpretation, in comparison with effective relativity, which is compatible with the Lorentz ether.
 
  • #783
Ilja said:
I'm sure, because I use the dBB interpretation, and in the dBB interpretation there is no such contradiction.

The collapse of the wave function in dBB is described by the unitary evolution of the wave function of the object itself together with the apparatus, and the evolution of the object and the apparatus themselves (by the guiding equation). One can combine the full wave function ψ_full(o,a,t) with the trajectory of the apparatus a(t) to define an effective wave function of the object ψ_o(o,t) = ψ_full(o,a(t),t). The evolution equation for this effective wave function is, during the measurement, not unitary, because unitary evolution holds only for closed systems, or systems which at least are not interacting with their environment. Before and after the measurement, that is, when there is no longer any interaction of o with anything else, it is unitary. This easily follows from the unitary evolution of the full system.

As far as I know, it is impossible to prove violations in dBB without using some assumptions beyond unitary evolution; otherwise such a proof could be transferred to standard quantum theory. If you disagree, could you please give a reference to such a proof?


Ilja said:
There is room for improvement in the presentation of dBB - it is quite typical to use many particles, while, in the light of QFT, it would be much more reasonable to use a general configuration space, which can just as well be a field.

But the main reason for not liking dBB is obvious - it is the strong belief in fundamental relativity. And here improvements are impossible - any realistic interpretation of QM has to violate fundamental relativity.

This is not a problem of physics - effective relativity is not a problem at all for dBB; the first model for the EM field was part of the first paper by Bohm. It is a problem of philosophy - the belief in fundamental relativity, or the spacetime interpretation, in comparison with effective relativity, which is compatible with the Lorentz ether.

I gave my reasons to think that fundamental relativity has not been ruled out - absence of loophole-free demonstrations of violations and absence of contradiction-free proof of violations in quantum theory.
 
  • #784
harrylin said:
The contrary is for me obvious, because I deem Newton as having been sound of mind, based on what he did express; and since neither of us has proof of what Newton really thought on this matter, we only have our personal estimations about his thinking on this. :wink:
My point was not about Newton's thinking, but about the equations. The link http://plato.stanford.edu/entries/Newton-philosophy/#ActDis has already been posted here and shows that Newton was aware that there is an action at a distance in the equations, and considered the lack of mediation a problem.

I'm afraid that you did not understand my question which is not about speed. The usual discussions are only about half of the "spookiness". How does De Broglie's theory explain that an action on a particle at one end of the universe can have an undiminished effect on another particle at the other end of the universe? What physical mechanism did he propose for that? Note that if it requires a long answer, I'll start it as a new topic. :-p
dBB does not give any answer, and does not even try to give one. So the situation is quite similar to Newtonian gravity, where the formulas do not tell us anything about an explanation for gravity.

And, similarly, I think this is an interesting open problem and can be a hint for developing some subquantum theories. A theory which, for example, restricts the maximum speed of this spooky action should violate quantum theory.

This would be a second hint for subquantum theory, the first being that QM fails for very small values of ψ. That's because around ψ(q)=0 the dBB velocity becomes infinite, even if only in a quite harmless way (increasingly fast rotation around the 0).
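For reference, a minimal sketch of why the velocity diverges at nodes (single particle, the standard guiding equation):

$$v(q,t) = \frac{\hbar}{m}\,\mathrm{Im}\,\frac{\nabla\psi(q,t)}{\psi(q,t)},$$

so wherever $\psi(q,t)\to 0$ while $\nabla\psi \neq 0$ the denominator vanishes and $|v|\to\infty$; for a generic node the trajectory circulates around the nodal point ever faster rather than escaping to infinity, which is the "harmless" behaviour mentioned above.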
 
  • #785
Ilja said:
And, similarly, I think this is an interesting open problem and can be a hint for developing some subquantum theories. A theory which, for example, restricts the maximum speed of this spooky action should violate quantum theory.
Not sure if Gisin's experiment was posted in this thread, but his group suggested that the speed of this non-local connection must be at least 10,000 times the speed of light:
For instance, if such a privileged reference frame exists and is such that the Earth's speed in this frame is less than 10⁻³ that of the speed of light, then the speed of this spooky influence would have to exceed that of light by at least 4 orders of magnitude.
Testing spooky action at a distance
http://arxiv.org/pdf/0808.3316v1.pdf
 
Last edited:
  • #786
bohm2 said:
Not sure if Gisin's experiment was posted in this thread, but his group suggested that the speed of this non-local connection must be at least 10,000 times the speed of light:

Testing spooky action at a distance
http://arxiv.org/pdf/0808.3316v1.pdf
Nice find! I'll read it. :smile:
Note that your Arxiv link is a version of a reviewed publication:
http://www.nature.com/nature/journal/v454/n7206/full/nature07121.html

PS. a quick question: I quickly looked over it but I could not immediately "get" the idea behind it.
What, in a nutshell, is their method for determining the minimal speed of "spooky action at a distance"? They mention two-photon interference, which sounds somewhat like MMX (even more like KTX). Where is "Bell" in all that? :confused:
 
Last edited by a moderator:
  • #787
bohm2 said:
Not sure if Gisin's experiment was posted in this thread, but his group suggested that the speed of this non-local connection must be at least 10,000 times the speed of light:

Testing spooky action at a distance
http://arxiv.org/pdf/0808.3316v1.pdf

Let me just note that the article in question does not claim simultaneous elimination of both the detection and the locality loopholes (probably, the detection loophole still exists, as is customary for all experiments with photons), so, strictly speaking, their experiment does not even demonstrate violations of the Bell inequalities.
 
  • #788
harrylin said:
Nice find! I'll read it. :smile:
Note that your Arxiv link is a version of a reviewed publication:
http://www.nature.com/nature/journal/v454/n7206/full/nature07121.html

PS. a quick question: I quickly looked over it but I could not immediately "get" the idea behind it.
What, in a nutshell, is their method for determining the minimal speed of "spooky action at a distance"? They mention two-photon interference, which sounds somewhat like MMX (even more like KTX). Where is "Bell" in all that? :confused:

The idea is that if there is another, greater limiting speed, say 100c, then there has to be a corresponding superlight cone and there will be space-like separated event pairs for this superlight cone too. And for such event pairs the Bell inequalities should hold. So one has to test the violation of Bell inequalities for large enough sets of event pairs so that there will be no place for the 100 c superlight cone.

The next idea was that there is a reasonable hypothesis for the place of the superlight cone - one can guess that the time of the rest frame for the background radiation will be time-like in the superlight cone too. So there is no need to rule out all those skew superlight cones, and all one needs to care about is the much smaller set of superlight cones compatible with the background radiation rest frame. So, one only has to look for event pairs which have approximately equal time in the CMBR frame.
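A back-of-the-envelope version of how such a bound comes out (the numbers below are illustrative round figures, not the exact values of the paper): if the two detection events are separated by a distance d and, in the candidate preferred frame, by a time difference of at most δt, then any hidden influence connecting them must travel at

$$v \gtrsim \frac{d}{\delta t}.$$

For instance, with $d \approx 18\ \mathrm{km}$ and the events aligned to within $\delta t \approx 6\ \mathrm{ns}$ in that frame, this gives $v \gtrsim 3\times 10^{12}\ \mathrm{m/s} \approx 10^4\,c$, the order of magnitude quoted above.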
 
Last edited by a moderator:
  • #789
Ilja said:
The idea is that if there is another, greater limiting speed, say 100c, then there has to be a corresponding superlight cone and there will be space-like separated event pairs for this superlight cone too. And for such event pairs the Bell inequalities should hold. So one has to test the violation of Bell inequalities for large enough sets of event pairs so that there will be no place for the 100 c superlight cone.

The next idea was that there is a reasonable hypothesis for the place of the superlight cone - one can guess that the time of the rest frame for the background radiation will be time-like in the superlight cone too. [..]
Ok, thanks - it's starting to dawn on me now. :smile:
I have the impression that in that last paper they tried to be so general as not to need the CMBR hypothesis.

Anyway, my hunch that such an interferometer experiment is unlikely to yield anything "spooky" now finds support in an old discussion on this forum (the keyword I searched with was "Franson"):

https://www.physicsforums.com/showthread.php?t=229393

However, there is not enough explanation there for me ...
 
Last edited:
  • #790
This has been posted here before but was just published in Nature Physics. It argues that barring loopholes, if the non-local effects observed in Bell-type experiments propagate at any finite speed, then non-locality could be exploited for superluminal communication:
The new hidden influence inequality shows that the get-out won't work when it comes to quantum predictions. To derive their inequality, which sets up a measurement of entanglement between four particles, the researchers considered what behaviours are possible for four particles that are connected by influences that stay hidden and that travel at some arbitrary finite speed. Mathematically (and mind-bogglingly), these constraints define an 80-dimensional object. The testable hidden influence inequality is the boundary of the shadow this 80-dimensional shape casts in 44 dimensions. The researchers showed that quantum predictions can lie outside this boundary, which means they are going against one of the assumptions. Outside the boundary, either the influences can't stay hidden, or they must have infinite speed.
Looking Beyond Space and Time to Cope With Quantum Theory
http://www.sciencedaily.com/releases/2012/10/121028142217.htm

Quantum non-locality based on finite-speed causal influences leads to superluminal signalling
http://www.nature.com/nphys/journal/vaop/ncurrent/full/nphys2460.html

Full article posted in arxiv:
http://arxiv.org/pdf/1110.3795v1.pdf
 
Last edited by a moderator:
  • #791
I know this isn't likely to sway any opinions, but for completeness and future reference, this is a follow-up piece by Gisin to the J. D. Bancal et al. Nature Physics paper linked above that was just posted on arxiv:
We investigate possible explanations of quantum correlations that satisfy the principle of continuity, which states that everything propagates gradually and continuously through space and time. In particular, following [J.D. Bancal et al, Nature Physics 2012], we show that any combination of local common causes and direct causes satisfying this principle, i.e. propagating at any finite speed, leads to signalling. This is true even if the common and direct causes are allowed to propagate at a supraluminal-but-finite speed defined in a Newtonian-like privileged universal reference frame. Consequently, either there is supraluminal communication or the conclusion that Nature is nonlocal (i.e. discontinuous) is unavoidable.
Quantum correlations in Newtonian space and time: arbitrarily fast communication or nonlocality
http://lanl.arxiv.org/pdf/1210.7308.pdf
 
Last edited:
  • #792
Lecture from lead author JD Bancal from Perimeter Institute:
The experimental violation of Bell inequalities using spacelike separated measurements precludes the explanation of quantum correlations through causal influences propagating at subluminal speed. Yet, it is always possible, in principle, to explain such experimental violations through models based on hidden influences propagating at a finite speed v>c, provided v is large enough. Here, we show that for any finite speed v>c, such models predict correlations that can be exploited for faster-than-light communication. This superluminal communication does not require access to any hidden physical quantities, but only the manipulation of measurement devices at the level of our present-day description of quantum experiments. Hence, assuming the impossibility of using quantum non-locality for superluminal communication, we exclude any possible explanation of quantum correlations in terms of finite-speed influences.
http://pirsa.org/displayFlash.php?id=11110145
 
  • #793
  • #794
ZapperZ said:
The point here is that this thread appears to indicate that even IF all the loopholes are closed (and I will make MY prediction here that in the near future, say within 3 years, ALL the loopholes will be closed in one single experiment), the intrinsic nature of the theory will STILL not falsify local realism.

Three years have passed. As far as I know, the loopholes have still not all been closed in one single experiment. For example, reporting some further progress in a recent article http://arxiv.org/abs/1212.0533 , Zeilinger et al. still admit that “The realization of an experiment that is free of all three assumptions – a so-called loophole-free Bell test – remains an important outstanding goal for the physics community”.

I am writing this without any Schadenfreude. I do appreciate that a loophole-free experiment can be performed any moment now. It looks like the race to conduct the first experiment of this kind is really fierce. E.g., the following quote is interesting (SCIENCE, VOL 331, P. 1380 (2011)): “Zukowski thinks the race to close all the loopholes simultaneously will soon be over. “Conservatively, it could take another 5 years to complete, but it could also be done tomorrow,” he says. “We’re at the stage where everyone is scared to read their competitors’ papers, in case they find they have been beaten. The only real question is: Who will win?””

I also had this impression of a fierce race listening to talks on quantum foundations experiments at several conferences last year. On the other hand, some experimentalists admitted (typically, not in their official talks:-) ) that they encounter some formidable challenges.

So I am just trying to say that these three years since the start of this thread have demonstrated again that it is extremely difficult to demonstrate violations of the genuine Bell inequalities. Will they be demonstrated by the fiftieth anniversary of Bell’s article next year? Or ever? My prediction is “no”. But I may be mistaken.
 
  • #795
Gordon Watson said:
The point is that particles in a singlet state have, both theoretically and experimentally, a higher correlation than you seem to allow (or expect) in your work.

I guess you're just bashing your response out, so this is not intended to be a substantive criticism, but it's not a correlation that's higher; it's the sum of absolute values of a sum and of a difference of four different correlations, $|A-B|+|C+D|$ (at least, that's what it is in the CHSH formulation; in no case is the difficulty for local realism that a correlation coefficient simpliciter is too high).
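For the record, the CHSH quantity I have in mind (standard notation, with $E(a,b)$ the correlation for analyzer settings $a$ and $b$):

$$S = |E(a,b) - E(a,b')| + |E(a',b) + E(a',b')| \le 2$$

for any local hidden-variable model, while the singlet state gives $S = 2\sqrt{2}$ for optimally chosen settings. Each individual correlation still satisfies $|E|\le 1$ in both cases; it is only the combination that discriminates.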
 
  • #796
Gordon Watson said:
The point is that particles in a singlet state deliver, both theoretically and experimentally, a higher expectation value* than you seem to allow (or expect) in your work.

Dear Gordon Watson,

I tried to explain in my post 753 in this thread why I cannot agree with you.
 
  • #797
Gordon Watson said:
OK; disagreeing with me is no big deal. BUT I'm NOT aware of any rational quantum physicist that agrees with you:

"... that there are some reasons to believe Bell inequalities cannot be violated either in experiments or in quantum theory." [Cited above.]​

So Santos, Marshall, nightlight are not rational quantum physicists, in your book. They are in mine. Let me add that I gave those "reasons" and properly published them (although I just repeated other people's arguments, as far as those "reasons" are concerned).

Another thing: "the foundations of quantum mechanics ... remain hotly debated in the scientific community, and no consensus on essential questions has been reached." (Schlosshauer, Kofler, Zeilinger, http://arxiv.org/abs/1301.1069 ). So disagreement is a "new normal" for quantum foundations.
 
  • #798
Gordon Watson said:
Using your terms to conclude re my position (vis-a-vis yours), I am satisfied that:

1. Bell inequalities are repeatedly violated by experiment.

2. Bell inequalities are certainly violated by quantum theory.

3. Except for their motivation toward better experiments, the remaining loopholes are of no consequence.
..
Agree. 1 and 2 are facts, and 3 seems to me to be a much more reasonable position than that taken by the loophole people.
 
  • #799
Gordon Watson said:
Using your terms to conclude re my position (vis-a-vis yours), I am satisfied that:

You are satisfied, I'm not.

Gordon Watson said:
1. Bell inequalities are repeatedly violated by experiment.

Not unless you ignore the loopholes.

Gordon Watson said:
2. Bell inequalities are certainly violated by quantum theory.

Not unless you use as assumptions mutually contradictory postulates of standard quantum theory, such as unitary evolution and the projection postulate. However, if you use mutually contradictory assumptions, you can get any conclusion, however absurd.

Gordon Watson said:
3. Except for their motivation toward better experiments, the remaining loopholes are of no consequence.

This is an opinion, not a fact.
 
  • #800
nanosiborg said:
[..] position [..] taken by the loophole people.
Who are such "loophole people"? It is suggestive of people who stick to an opinion against all odds, and I would be surprised if anyone here identifies with such a position - in which case it's just a strawman (it's a derogative term, used to indicate a means of escape or evasion).
 
Last edited:
  • #801
harrylin said:
Who are such "loophole people"?
People who think that a loophole-free test will change the current situation, which is that qm predictions are in line with results and violate BI, and lhv predictions are not in line with results.
 
  • #802
nanosiborg said:
People who think that a loophole-free test will change the current situation, which is that qm predictions are in line with results and violate BI, and lhv predictions are not in line with results.

Whether I am one of those "loophole people" or not, I respectfully disagree with your assessment of the current situation, e.g., with the following phrase: "lhv predictions are not in line with results." As long as there are loopholes in experiments (and there have been no loophole-free experiments so far), the results of the experiments cannot rule out all lhv theories, so at least some lhv theories' predictions are in line with the results.
 
  • #803
akhmeteli said:
As long as there are loopholes in experiments (and there have been no loophole-free experiments so far), the results of the experiments cannot rule out all lhv theories ...
Strictly speaking, this is correct. But I think the evidence is overwhelming that if a loophole-free test were done, then qm would correctly predict the results and lhv would not.

akhmeteli said:
... so at least some lhv theories' predictions are in line with the results.
Assumptions are required because of the inability to close all loopholes in the same test. So far, given the (reasonable, imo) assumptions used by the testers, qm agrees with experiment and lhv doesn't.

The incompatibility between qm and lhv has been mathematically proven. They necessarily predict a different correlation between θ and the rate of coincidence detection. So, if qm is correct, then (Bell) lhv models of quantum entanglement are ruled out.
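As a minimal sketch of that difference (spin-1/2 singlet, θ the angle between the two analyzer settings): quantum mechanics predicts the correlation

$$E_{\rm QM}(\theta) = -\cos\theta,$$

whereas any Bell-local model with perfect anticorrelation at equal settings must satisfy Bell's original inequality $|E(\mathbf{a},\mathbf{b}) - E(\mathbf{a},\mathbf{c})| \le 1 + E(\mathbf{b},\mathbf{c})$; with settings at 0°, 60° and 120° the quantum values give 1 on the left and 1/2 on the right, so the cosine curve cannot be reproduced by such a model.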

I'm betting that qm will continue to be confirmed, even in a loophole-free test.
 
  • #804
nanosiborg said:
Strictly speaking, this is correct. But I think the evidence is overwhelming that if a loophole-free test were done, then qm would correctly predict the results and lhv would not.

This is an opinion, not a fact. Somebody believes the evidence is overwhelming, somebody believes there is no evidence, as there have been no loophole-free experiments. How do you like the following "overwhelming evidence" that planar Euclidean geometry (PEG) is wrong (I already offered it in this thread)? PEG predicts that the sum of angles of any triangle is 180 degrees, whereas experiments demonstrate with high confidence that the sum of angles of a quadrangle and the sum of angles of a triangle on a sphere are not equal to 180 degrees. The obvious "loopholes" will certainly be closed simultaneously in future experiments:-)

nanosiborg said:
Assumptions are required because of the inability to close all loopholes in the same test. So far, given the (reasonable, imo) assumptions used by the testers, qm agrees with experiment and lhv doesn't.

My question is: which assumption is more reasonable, local realism or, say, fair sampling? Apparently, you'd vote for the latter; I would vote for the former. So who's right? I believe so far this is just a matter of opinion.

nanosiborg said:
The incompatibility between qm and lhv has been mathematically proven. They necessarily predict a different correlation between θ and the rate of coincidence detection. So, if qm is correct, then (Bell) lhv models of quantum entanglement are ruled out.

I agree: the Bell theorem proves incompatibility between standard quantum theory and local realism. I argue, though, that this is not a problem for local realism, as, strictly speaking, standard quantum theory is incompatible with itself (I have in mind the notorious problem of measurements in quantum theory), so it cannot be completely correct. To prove incompatibility of standard quantum theory and local realism, you need to prove that the Bell inequalities can be violated in quantum theory. To this end, you need to use two mutually contradictory postulates of standard quantum theory: unitary evolution and, say, the projection postulate.


nanosiborg said:
I'm betting that qm will continue to be confirmed, even in a loophole-free test.

Strictly speaking, this phrase just attests to the strength of your opinion, not to its correctness. I don't believe local realism will be ruled out in loophole-free experiments, but again, this is just my opinion, not a fact. Maybe we should just wait and see.
 
  • #805
Gordon Watson said:
Andy, while we're waiting, please, would you mind spelling out what you mean by "local realism"?

Accepting that by "local" you mean "Einstein-local", maybe we could just focus on what "realism" means to you, please?

Locality, in my book, means that no effect can have its cause anywhere beyond its past light-cone. I guess this is what you call "Einstein-local".

Realism, in my book, is not the realism of the EPR article, i.e. I don't believe that, say, a particle has definite values of coordinates, momentum, spin projections, etc., whether the relevant observables are measured or not. You may say that I do not seek noncontextual hidden variables. The version of realism that I accept is contextual: any state can be described by some set of parameters that uniquely define the state's evolution. I would not call those parameters "hidden variables", as, say, in the models of my articles, they are not exactly hidden: they are the potentials of the electromagnetic fields and their derivatives. This version is "contextual" as the set of parameters must describe the relevant instruments as well.
 