The Time Symmetry Debate in Quantum Theory

In summary: Quantum Mechanics is incomplete in the sense that it does not provide a more complete description, although this does not preclude its success.
  • #106
Jano L. said:
I think it will be easier to explain this by example. Consider a random process that produces a series of red points somewhere in a unit disk with uniform probability density. The probability of the event that the next point will coincide with any given point A of the disk is equal to 0.

However, after the event occurs, some point of the disk will be red. At that instant, an event with probability 0 has happened.

Actually, all events that happen in such a random process are events that have probability 0.

So "event has probability 0" does not mean "impossible event".

Similarly, "probability 1" does not mean "certain event". Consider the probability that the red point will land at a point with both coordinates irrational. This can be shown to be equal to 1 in standard measure theory. However, there are still infinitely many points that have rational coordinates, and these can happen - they are part of the disk.

In the language of abstract theory, all this is just a manifestation of the fact that equal measures do not imply that the sets are equal.

lugita15 said:
It's true that if an event will definitely occur, then it must have probability 1. But it's not the case that if an event has probability 1, it will definitely occur. See this wikipedia page.

Good points that simply go to support Jano L.'s posts #71, #74, #78... IMO they show that Bill's reliance on Gleason's theorem is misplaced: it cannot be used in the general case for what he thinks it can, but only for discretized, lattice models of physical systems - a very strong assumption in light of what we know. At least I think most physicists still favor a continuous picture of nature, as exemplified by successful theories like GR.
bhobba said:
What do you mean by devised? Historically - probably - but so?
Not "probably" - certainly. You just have to read Born's original 1926 paper.


bhobba said:
We now know it follows from much more general considerations having nothing to do with particles eg Gleason's theorem.
I wouldn't be so sure we know that. See above.
 
  • #107
Jano L. said:
I think it will be easier to explain this by example. Consider a random process that produces a series of red points somewhere in a unit disk with uniform probability density. The probability of the event that the next point will coincide with any given point A of the disk is equal to 0.

However, after the event occurs, some point of the disk will be red. At that instant, an event with probability 0 has happened.

Actually, all events that happen in such a random process are events that have probability 0.

So "event has probability 0" does not mean "impossible event".

That's certainly true, mathematically. On the other hand, in the real world, we never measure real-valued observables to infinite precision. We never really observe "The particle's momentum is P"; we observe something like "The particle's momentum is somewhere in the range [itex]P \pm \Delta P[/itex]". For this reason, if we have two states [itex]| \psi \rangle [/itex] and [itex]|\phi \rangle[/itex] such that [itex]\langle \psi | \phi \rangle = 1[/itex], they are considered the same state, as far as quantum mechanics is concerned. Adding or subtracting a set of measure zero does nothing.
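To make the measure-zero point concrete, here is a minimal Monte Carlo sketch of Jano L.'s disk example (plain Python with numpy; the target point, sample size and tolerances are arbitrary choices of mine, not anything from the thread). The probability of landing within ε of a fixed point A is ε² (a ratio of areas), which vanishes as ε → 0 - yet every run still produces some point, an outcome that itself "had probability 0".

[code]
import numpy as np

rng = np.random.default_rng(0)

def sample_disk(n):
    """Sample n points uniformly in the unit disk via rejection sampling."""
    pts = rng.uniform(-1.0, 1.0, size=(3 * n, 2))
    pts = pts[np.sum(pts**2, axis=1) <= 1.0]  # keep points inside the disk
    return pts[:n]

A = np.array([0.3, 0.4])        # a fixed target point well inside the disk
pts = sample_disk(1_000_000)

for eps in (0.1, 0.01, 0.001):
    hits = np.sum(np.sum((pts - A) ** 2, axis=1) <= eps**2)
    # For small eps the eps-ball lies inside the disk, so the exact
    # probability is the area ratio (pi * eps^2) / pi = eps^2.
    print(f"eps={eps}: empirical={hits / len(pts):.2e}, exact={eps**2:.2e}")
[/code]

The limit of these shrinking events is "the next point is exactly A", which has probability 0 but is not impossible - exactly Jano L.'s distinction between "probability 0" and "impossible".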
 
  • #109
stevendaryl said:
That's certainly true, mathematically. On the other hand, in the real world, we never measure real-valued observables to infinite precision. We never really observe "The particle's momentum is P"; we observe something like "The particle's momentum is somewhere in the range P ± ΔP".
Yes, but we were really discussing the theoretical difference between probabilistic and deterministic descriptions. I think the limitations of observation have no bearing on that argument.

...if we have two states |ψ⟩ and |ϕ⟩ such that ⟨ψ|ϕ⟩=1, they are considered the same state, as far as quantum mechanics is concerned. Adding or subtracting a set of measure zero does nothing.

It does nothing to the probability. That was my point - in a probabilistic theory, the probability is 1 both for a certain event and an almost certain event. We cannot adequately describe the difference between the two in such a theory. Ergo a deterministic theory is not just a special case of a probabilistic theory. They are different kinds of theories, constructed for different purposes.
 
  • #110
T0mr said:
The electron in orbit was given as an example of a possible path. I have read about this argument before: a classically orbiting electron should emit radiation, presumably because it is a charged object and an accelerating charge (one changing direction) emits electromagnetic radiation. Yet if you were to put opposite charges on two spheres, one light and one heavy, and then set the lighter in orbit (in space) around the heavier, would the two spheres not act just as in the two-body problem for the gravitational force?

... but electromagnetism is 10^39 times stronger than gravitation ...

I think the Bohr model is pretty dead: there are incompatibilities with the empirical spectral lines, it violates the uncertainty principle, and even if you could magically fix all that - where is your single localized particle in the double-slit experiment?

It doesn’t work...
 
  • #111
bhobba said:
This is in fact the defining property of an inertial frame - the Earth isn't exactly inertial but for many practical purposes such as this experiment it is.

I’m sorry bhobba, I’m completely lost... are you saying that the inertial frame of Earth has anything to do with the double-slit experiment??
 
  • #112
bhobba said:
QM does not insist on analogies with a classical particle model. All it assumes is position is an observable - which is a fact.

bhobba said:
In QM the symmetries are in the quantum state and observables - in classical mechanics its in the Lagrangian.

stevendaryl said:
I don't see that, at all. To me, the "measurement problem" is the conceptual difficulty that on the one hand, a measurement has an abstract role in the axioms of quantum mechanics, as obtaining an eigenvalue of a self-adjoint linear operator, and it has a physical/pragmatic/empirical role in actual experiments as a procedure performed using equipment. What is the relationship between these two notions of measurement? The axioms of quantum mechanics don't make it clear.

I don't see that it has anything particularly to do with particles.

TrickyDicky said:
Well, if you reduce the discussion to abstract observables without attributing them to any particular object - be it a particle, a field or whatever - you don't have a way to connect them with the physical/pragmatic side, so no measurement problem for you; but, as Ballentine warned in the quote posted by devil, then you don't really have a physical theory, just a set of abstract axioms with no connection to experiment.

stevendaryl said:
I think that's what I intended to say: that the "measurement problem" is about the connection between the notion of "observable" that is a primitive in the quantum theory, and the "observable" that is something that requires a measurement apparatus and a measurement procedure. But I don't see how that supports the claim that the measurement problem has anything to do, intrinsically, with classical properties of particles.

Jano L. said:
We should really distinguish these two rules. The first one is easy and does not depend on the measurement problem, and is gauge-invariant.

The second is difficult to understand, because it is connected to measurements and is gauge-dependent - if we choose a different gauge to calculate ##\psi##, we get different ##p_k##.


Guys, it’s very interesting to read this discussion, and this stuff is always hard to talk about. Still, let me give you something to chew on while the ‘battle’ continues. :wink:

http://arxiv.org/abs/0707.0401
J.S. Bell's Concept of Local Causality said:
“The beables of the theory are those elements which might correspond to elements of reality, to things which exist. Their existence does not depend on ‘observation’. Indeed observation and observers must be made out of beables.”

Or as he explains elsewhere,

“The concept of ‘observable’ ... is a rather woolly concept. It is not easy to identify precisely which physical processes are to be given the status of ‘observations’ and which are to be relegated to the limbo between one observation and another. So it could be hoped that some increase in precision might be possible by concentration on the beables ... because they are there.”

Bell’s reservations here (about the concept “observable” appearing in the fundamental formulation of allegedly fundamental theories) are closely related to the so-called “measurement problem” of orthodox quantum mechanics, which Bell encapsulated by remarking that the orthodox theory is “unprofessionally vague and ambiguous” in so far as its fundamental dynamics is expressed in terms of “words which, however legitimate and necessary in application, have no place in a formulation with any pretension to physical precision” – such words as “system, apparatus, environment, microscopic, macroscopic, reversible, irreversible, observable, information, measurement.” As Bell elaborates,

“The concepts ‘system’, ‘apparatus’, ‘environment’, immediately imply an artificial division of the world, and an intention to neglect, or take only schematic account of, the interaction across the split. The notions of ‘microscopic’ and ‘macroscopic’ defy precise definition. So also do the notions of ‘reversible’ and ‘irreversible’. Einstein said that it is theory which decides what is ‘observable’. I think he was right – ‘observable’ is a complicated and theory-laden business. Then the notion should not appear in the formulation of fundamental theory.”

As Bell points out, even Bohr (a convenient personification of skepticism regarding the physical reality of unobservable microscopic phenomena) recognizes certain things (for example, the directly perceivable states of a classical measuring apparatus) as unambiguously real, i.e., as beables.

[...]

The unprofessional vagueness and ambiguity of orthodox quantum theory, then, is related to the fact that its formulation presupposes these (classical, macroscopic) beables, but fails to provide clear mathematical laws to describe them. As Bell explains,

“The kinematics of the world, in [the] orthodox picture, is given by a wavefunction ... for the quantum part, and classical variables – variables which have values – for the classical part... [with the classical variables being] somehow macroscopic. This is not spelled out very explicitly. The dynamics is not very precisely formulated either. It includes a Schrödinger equation for the quantum part, and some sort of classical mechanics for the classical part, and ‘collapse’ recipes for their interaction.”

There are thus two related problems. First, the posited ontology is rather different on the two sides of (what Bell calls) “the shifty split” – that is, the division between “the quantum part” and “the classical part.” But then, as a whole, the posited ontology remains unavoidably vague so long as the split remains shifty – i.e., so long as the dividing line between the macroscopic and microscopic remains undefined. And second, the interaction across the split is problematic. Not only is the account of this dynamics (the “collapse” process) inherently bound up in concepts from Bell’s list of dubious terms, but the very existence of a special dynamics for the interaction seems to imply inconsistencies with the dynamics already posited for the two realms separately. As Bell summarizes,

“I think there are professional problems [with quantum mechanics]. That is to say, I’m a professional theoretical physicist and I would like to make a clean theory. And when I look at quantum mechanics I see that it’s a dirty theory. The formulations of quantum mechanics that you find in the books involve dividing the world into an observer and an observed, and you are not told where that division comes... So you have a theory which is fundamentally ambiguous...”

The point of all this is to clarify the sort of theory Bell had in mind as satisfying the relevant standards of professionalism in physics.

Don’t know why I love this paper, but I do – it’s ‘crisp & clear’...
 
  • #113
DevilsAvocado said:
I’m sorry bhobba, I’m completely lost... are you saying that the inertial frame of Earth has anything to do with the double-slit experiment??

It has nothing to do with it per se.

My comment was in relation to the claim that the measurement problem had something to do with QM holding the particle picture as fundamental. QM doesn't do that - the dynamics are, just like in Classical Mechanics, determined by symmetry arguments. There is no particle assumption other than that position is an observable, which is an experimentally verified fact.

For many practical purposes the Earth can be considered to have these symmetry properties - that was my point.

Thanks
Bill
 
  • #114
Jano L. said:
Consider the probability that the red point will land at a point with both coordinates irrational.

Well, since the rationals have Lebesgue measure zero, and there is no way to observationally tell the difference between a rational and an irrational point - that would require infinite measurement precision - it's not a well-defined problem physically.

If you seriously doubt that a probability of 1 means a dead cert, then I think this thread is not the appropriate place to discuss it. The Set Theory, Logic, Probability and Statistics subforum is more appropriate, so I will be doing a post there.

Thanks
Bill
 
  • #115
TrickyDicky said:
Gleason's theorem cannot be used in the general case for what he thinks it can, but only for discretized, lattice models of physical systems - a very strong assumption in light of what we know. At least I think most physicists still favor a continuous picture of nature, as exemplified by successful theories like GR.

Gleason's theorem holds for infinite dimensional Hilbert spaces:
http://kof.physto.se/theses/helena-master.pdf

I have zero idea why you would think otherwise.

It even holds for non-separable spaces - not that that is of any value to QM.

The issue with Gleason's theorem is that its physical basis is a bit unclear. Mathematically, what's going on is well understood: the theorem depends on non-contextuality, and, again mathematically, contextuality is a bit of an ugly kludge; but exactly why, from a physical point of view, you should require non-contextuality is open to debate. This is the exact out Bohmian Mechanics uses, and it's a valid theory. But the Hilbert space formalism is ugly if you don't assume non-contextuality - you can't define a unique probability measure, so the question becomes: what use is a Hilbert space to begin with? And indeed, for BM the usual formulation is secondary in that interpretation.

My point is that Born's rule is not dependent on a particle model - its basis is non-contextuality in the usual formulation, or specific assumptions in other formulations like BM.

Thanks
Bill
 
  • #116
stevendaryl said:
Adding or subtracting a set of measure zero does nothing.

Exactly. This is bog-standard stuff from more advanced probability texts that take a rigorous approach. Finding probabilities associated with determining rational or irrational numbers is not a well-defined problem, since the rationals have Lebesgue measure zero.

I think a discussion on exactly what probabilities 0 and 1 mean is best done on the probability subforum and I will do a post there.

Thanks
Bill
 
  • #117
DevilsAvocado said:
Guys, it’s very interesting to read this discussion, and this stuff is always hard to talk about. Still, let me give you something to chew on while the ‘battle’ continues. :wink:

http://arxiv.org/abs/0707.0401


Don’t know why I love this paper, but I do – it’s ‘crisp & clear’...

Because it brings us Bell in his own deep and intelligent words, contrary to the tradition of misinterpreting him that abounds in the QM literature :devil:.
:smile:
 
  • #118
bhobba said:
The issue with Gleason's theorem is that its physical basis is a bit unclear. Mathematically, what's going on is well understood: the theorem depends on non-contextuality, and, again mathematically, contextuality is a bit of an ugly kludge; but exactly why, from a physical point of view, you should require non-contextuality is open to debate. This is the exact out Bohmian Mechanics uses, and it's a valid theory. But the Hilbert space formalism is ugly if you don't assume non-contextuality - you can't define a unique probability measure, so the question becomes: what use is a Hilbert space to begin with? And indeed, for BM the usual formulation is secondary in that interpretation.

My point is that Born's rule is not dependent on a particle model - its basis is non-contextuality in the usual formulation, or specific assumptions in other formulations like BM.

Thanks
Bill
Bill, I agree with the quoted part.
Non-contextuality is a strong assumption IMO, though yes, it makes the Hilbert formalism "ugly" not to adopt it. But Gleason's theorem assumes non-contextuality, and that was the sense of my comment about the theorem's lack of generality: there are QM interpretations that don't assume non-contextuality (you mentioned BM, but there are also the modal interpretations and others).
I have a doubt about this, because I've seen quantum non-contextuality defined in two ways that I guess are equivalent; maybe you can help me connect them: as independence of an outcome's probability from the measurement arrangement, and as basis independence of the probability assigned to a vector.
 
  • #119
bhobba said:
I think a discussion on exactly what probabilities 0 and 1 mean is best done on the probability subforum and I will do a post there.

I am looking forward to it. However, the argument was about something different: whether a deterministic theory is a special kind of probabilistic theory. I am quite interested in what others think about this.
 
  • #120
TrickyDicky said:
Bill, I agree with the quoted part.
Non-contextuality is a strong assumption IMO, though yes, it makes the Hilbert formalism "ugly" not to adopt it. But Gleason's theorem assumes non-contextuality, and that was the sense of my comment about the theorem's lack of generality: there are QM interpretations that don't assume non-contextuality (you mentioned BM, but there are also the modal interpretations and others).
I have a doubt about this, because I've seen quantum non-contextuality defined in two ways that I guess are equivalent; maybe you can help me connect them: as independence of an outcome's probability from the measurement arrangement, and as basis independence of the probability assigned to a vector.


http://arxiv.org/pdf/1207.1952v1.pdf

..."The concept of contextuality states that the outcomes of measurement may depend on what measurements are performed alongside"...
 
  • #121
TrickyDicky said:
Bill, I agree with the quoted part.
Non-contextuality is a strong assumption IMO, though yes, it makes the Hilbert formalism "ugly" not to adopt it. But Gleason's theorem assumes non-contextuality, and that was the sense of my comment about the theorem's lack of generality: there are QM interpretations that don't assume non-contextuality (you mentioned BM, but there are also the modal interpretations and others).

I did a Google search on the phrase "non-contextuality" and although it gets many hits, I still don't really understand what it means. Can someone give a real definition, and briefly explain why it's relevant in interpretations of quantum mechanics?
 
  • #122
stevendaryl said:
I did a Google search on the phrase "non-contextuality" and although it gets many hits, I still don't really understand what it means. Can someone give a real definition, and briefly explain why it's relevant in interpretations of quantum mechanics?

Conjugate values that do not depend on the context - that is, values independent of their antecedents.

http://plato.stanford.edu/entries/kochen-specker/
https://www.physicsforums.com/showpost.php?p=4401195&postcount=12

audioloop said:
Really, you have to define locality/non-locality as subsets of contextuality/non-contextuality; contextuality is broader and subsumes locality/nonlocality. Every state that is contextual with respect to the defined test of contextuality is nonlocal per the CHSH test (Clauser, Horne, Shimony, Holt), but the converse is not true. Or, as I like to ask:

Every state that is contextual is nonlocal.
...and inversely: is every state that is nonlocal contextual?

Measured values (attributes, characteristics, properties) exist in a context - that is all "context" refers to. But being real goes beyond properties: the possibility of values requires pre-existing objects or processes. Without objects, there is no possibility of properties (values).
 
  • #123
audioloop said:
Conjugate values that do not depend on the context - that is, values independent of their antecedents.

http://plato.stanford.edu/entries/kochen-specker/


https://www.physicsforums.com/showpost.php?p=4401195&postcount=12

That leaves me a little puzzled, still. Maybe somebody can give examples of a toy model that is contextual?

What I do understand is this: the argument that it is impossible for a hidden-variables theory to reproduce the predictions of quantum mechanics in an EPR-type twin-particle experiment assumes that the hidden variable is unaffected by choices made at the measuring devices. But such an effect is ruled out by locality, so I don't see why "contextuality" matters in that case.
 
  • #124
stevendaryl said:
That leaves me a little puzzled, still. Maybe somebody can give examples of a toy model that is contextual?

What I do understand is this: the argument that it is impossible for a hidden-variables theory to reproduce the predictions of quantum mechanics in an EPR-type twin-particle experiment assumes that the hidden variable is unaffected by choices made at the measuring devices. But such an effect is ruled out by locality, so I don't see why "contextuality" matters in that case.

An example of a contextual model can be seen if a context is considered to reside in the future. If Alice and Bob can signal from the future to the past as to what they are planning to measure, then entangled-state correlations are easier to explain. Nothing needs to propagate faster than c for such a mechanism to operate (and to properly describe any existing experiments).

So here we have locality respected while non-contextuality is not, which is essentially what you are looking for. Most contextual models seem "strange" as in counter-intuitive. I personally don't see them as any stranger than non-local ones.
 
  • #125
stevendaryl said:
That leaves me a little puzzled, still. Maybe somebody can give examples of a toy model that is contextual?

Spekkens' toy model takes the epistemic view and has local and non-contextual variables (= it fails to reproduce violations of Bell inequalities); here's a short introduction and here's the arXiv paper.

Jan-Åke Larsson has made a contextual extension of this toy model:

http://arxiv.org/abs/1111.3561
A contextual extension of Spekkens' toy model said:
Quantum systems show contextuality. More precisely, it is impossible to reproduce the quantum-mechanical predictions using a non-contextual realist model, i.e., a model where the outcome of one measurement is independent of the choice of compatible measurements performed in the measurement context. There have been several attempts to quantify the amount of contextuality for specific quantum systems, for example, in the number of rays needed in a KS proof, or the number of terms in certain inequalities, or in the violation, noise sensitivity, and other measures. This paper is about another approach: to use a simple contextual model that reproduces the quantum-mechanical contextual behaviour, but not necessarily all quantum predictions. The amount of contextuality can then be quantified in terms of additional resources needed as compared with a similar model without contextuality. In this case the contextual model needs to keep track of the context used, so the appropriate measure would be memory. Another way to view this is as a memory requirement to be able to reproduce quantum contextuality in a realist model. The model we will use can be viewed as an extension of Spekkens' toy model [Phys. Rev. A 75, 032110 (2007)], and the relation is studied in some detail. To reproduce the quantum predictions for the Peres-Mermin square, the memory requirement is more than one bit in addition to the memory used for the individual outcomes in the corresponding noncontextual model.

stevendaryl said:
What I do understand is this: The argument that it is impossible for a hidden-variables theory that reproduces the predictions of quantum mechanics in an EPR-type twin particle experiment assumes that the hidden variable is unaffected by choices made at the measuring devices. But such an effect is ruled out by locality, so I don't see why "contextuality" matters in that case.

I agree. I always thought that contextuality means that the entire measurement setup has to be taken into consideration as the context of the outcome, i.e. if Alice puts her polarizer orthogonal to Bob's, this will have an effect on the outcome of photon B = non-locality...
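Since the quoted abstract turns on the Peres-Mermin square, here is a small numerical sketch of why that square rules out non-contextual value assignments (plain Python with numpy; the construction is the standard one, the variable names are mine). Every row of the square multiplies to +I, and so do the first two columns, but the third column multiplies to -I; pre-assigned ±1 values would make the product of all six constraints equal +1 and -1 at once.

[code]
import numpy as np

I2 = np.eye(2, dtype=complex)
X = np.array([[0, 1], [1, 0]], dtype=complex)
Y = np.array([[0, -1j], [1j, 0]])
Z = np.array([[1, 0], [0, -1]], dtype=complex)

# The Peres-Mermin square: nine two-qubit observables with eigenvalues +-1.
square = [
    [np.kron(Z, I2), np.kron(I2, Z), np.kron(Z, Z)],
    [np.kron(I2, X), np.kron(X, I2), np.kron(X, X)],
    [np.kron(Z, X),  np.kron(X, Z),  np.kron(Y, Y)],
]

def sign_of_product(ops):
    """Return '+I' or '-I' for the product of three commuting observables."""
    P = ops[0] @ ops[1] @ ops[2]
    return '+I' if np.allclose(P, np.eye(4)) else '-I'

for i, row in enumerate(square):
    print(f"row {i}:", sign_of_product(row))                        # all +I
for j in range(3):
    print(f"col {j}:", sign_of_product([square[i][j] for i in range(3)]))
    # columns 0 and 1 give +I, column 2 gives -I

# A non-contextual assignment v(O) = +-1 would have to satisfy all six
# constraints. Multiplying them, each v(O) appears twice (once in its row,
# once in its column), so the left side is +1, while the right side is
# (+1)^5 * (-1) = -1. Contradiction - hence contextuality.
[/code]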
 
  • #126
DrChinese said:
An example of a contextual model can be seen if a context is considered to reside in the future. If Alice and Bob can signal from the future to the past as to what they are planning to measure, then entangled-state correlations are easier to explain. Nothing needs to propagate faster than c for such a mechanism to operate (and to properly describe any existing experiments).

I’m lost DrC... how could “Alice and Bob signal from the future” without an FTL-mechanism that is forbidden by both QM and SR?
 
  • #127
Jano L. said:
I am looking forward to it. However, the argument was about something different: that deterministic theory is a special kind of probabilistic theory. I am quite interested what others think about this.

Yea - but in discussing that you raised the issue. I still believe it is. Indeed, if it isn't, then Kochen-Specker is in deep trouble, because that's what it assumes - namely, that for QM to be deterministic you need to be able to define a probability measure taking only the values 0 and 1.

I have started a thread over in that sub-forum about it, and already there are some interesting replies.

Thanks
Bill
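A small sketch of the 0/1-measure point may help here (plain Python with numpy; the construction is mine, chosen only to illustrate the claim). By Gleason's theorem, in dimension ≥ 3 every probability measure on projectors has the form μ(P) = ⟨u|P|u⟩ for a pure state (or Tr(ρP) in general), and one can always find a projector on which such a measure takes a value strictly between 0 and 1 - so no Gleason measure is dispersion-free.

[code]
import numpy as np

rng = np.random.default_rng(1)
dim = 3

# A random pure state u defines the Gleason measure mu(P) = <u|P|u>.
u = rng.normal(size=dim) + 1j * rng.normal(size=dim)
u /= np.linalg.norm(u)

# Build a unit vector v at 45 degrees to u: take a unit vector w
# orthogonal to u and set v = (u + w)/sqrt(2).
w = rng.normal(size=dim) + 1j * rng.normal(size=dim)
w -= np.vdot(u, w) * u          # remove the component along u
w /= np.linalg.norm(w)
v = (u + w) / np.sqrt(2)

P = np.outer(v, v.conj())       # rank-1 projector onto v
mu = np.vdot(u, P @ u).real
print(mu)                       # 0.5 - strictly between 0 and 1
[/code]

So the measure cannot take only the values 0 and 1, no matter which state generates it.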
 
  • #128
Here I go again...
From Age of Entanglement:

"...But what if we let relativity enter the game even deeper? What if the detectors are in relative motion such that each detector in its own reference frame analyzes its photon before the other?...

"...once one assumes that the collapse is a real phenomenon, and once one considers specific models, then the conflict is real and testable"...if both measurements happen before the other, then the quantum correlation should disappear, however large the speed of the spooky action!

"Once the engineering was made feasible, "this experiment was also performed in Geneva in the spring of 1999", reported Gisin. "The two-photon interferences were still visible, independently of the relative velocity between Alice and Bob's reference frames." Alice, in her reference frame, measures her photon first; from Bob's point of view, he has measured his photon first; yet, the correlation is still present..."
 
  • #129
Charles Wilson said:
From Age of Entanglement:

Ah! :!) Thank you for reminding me. I must get that book, NOW!
 
  • #130
stevendaryl said:
I did a Google search on the phrase "non-contextuality" and although it gets many hits, I still don't really understand what it means. Can someone give a real definition, and briefly explain why it's relevant in interpretations of quantum mechanics?

Conceptually it's very simple. Suppose you have some observable A = y1|b1><b1| + y2|b2><b2| + y3|b3><b3|, where |b3> means outcome |b1> or |b2> did not occur. Outcome |b1> occurs with probability |<u|b1>|^2 from the Born rule, where |u> is the state. Now consider the observable C = y1|b1><b1| + c2|c2><c2| + c3|c3><c3|. From the Born rule, outcome |b1> will occur with exactly the same probability even though the other outcomes are different. This is known as non-contextuality, because a property does not depend on what else you happen to be measuring with it. It allows a probability measure to be uniquely defined regardless of which basis a projector is part of, ie regardless of the other possible outcomes of an observation. Now it turns out, due to Gleason's theorem, that the assumption of non-contextuality all by itself is enough to prove Born's rule. In fact it would be a pretty silly choice of Hilbert space as the formalism for the states of QM if it wasn't true. This is what's meant by non-contextuality being unnatural and counter-intuitive.

But now look at it physically, forgetting the Hilbert space formalism. We have zero reason to believe that changing what else you measure will not affect the other things you measure at the same time - after all, you have a different apparatus. This is what's meant by the physical basis being unclear. And indeed interpretations of QM such as Bohmian Mechanics exist that are contextual.

Thanks
Bill
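For what it's worth, bhobba's example is easy to check numerically. A minimal sketch (plain Python with numpy; the state |u> and the bases are arbitrary choices of mine): observables A and C share the outcome projector |b1><b1| but differ in the rest of their eigenbases, and the Born probability of outcome b1 is the same number in both measurement contexts.

[code]
import numpy as np

# Arbitrary normalized state |u> in C^3.
u = np.array([1.0, 1.0, 1.0], dtype=complex) / np.sqrt(3)

# Observable A has eigenbasis {b1, b2, b3}.
b1 = np.array([1, 0, 0], dtype=complex)
b2 = np.array([0, 1, 0], dtype=complex)
b3 = np.array([0, 0, 1], dtype=complex)

# Observable C shares b1 but has different remaining eigenvectors.
c2 = np.array([0, 1, 1], dtype=complex) / np.sqrt(2)
c3 = np.array([0, 1, -1], dtype=complex) / np.sqrt(2)

def born(state, vec):
    """Born rule: probability of the outcome projecting onto |vec>."""
    return abs(np.vdot(vec, state)) ** 2

# The probability of outcome b1 depends only on the projector |b1><b1|,
# not on which other outcomes accompany it - non-contextuality:
p_b1 = born(u, b1)
print(p_b1)                       # 1/3, whether measuring A or C
print(born(u, b2) + born(u, b3))  # remaining A outcomes: 2/3
print(born(u, c2) + born(u, c3))  # remaining C outcomes: 2/3
[/code]

Gleason's theorem runs this logic in reverse: demanding that the measure assigned to |b1><b1| be independent of the rest of the basis forces it to take the Born form.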
 
  • #131
DevilsAvocado said:
I agree. I always thought that contextuality means that the entire measurement setup has to be taken into consideration as the context of the outcome, i.e. if Alice puts her polarizer orthogonal to Bob's, this will have an effect on the outcome of photon B = non-locality...

It means measurements are not affected by what else you happen to be measuring at the same time. If you have an observable A = sum ai |bi><bi|, the probability of outcome |b1> does not depend on the other |bi>. What you mention is just one example.

Thanks
Bill
 
  • #132
Charles Wilson said:
"...once one assumes that the collapse is a real phenomenon, and once one considers specific models, then the conflict is real and testable"...if both measurements happen before the other, then the quantum correlation should disappear, however large the speed of the spooky action!

But that is precisely what many interpretations, such as the ensemble interpretation, specifically deny - ie that collapse is a real phenomenon. In that interpretation the state is simply something that tells us about the probabilistic behavior of a conceptual ensemble of systems. Collapse is simply selecting an element of the ensemble - nothing occurred in a real sense. Since such interpretations are possible, I am at a loss to understand stuff like the above - all you are arguing for is interpretations where collapse is not real, not that QM has any issues.

Thanks
Bill
 
  • #133
Yeesh!
I'll admit the pull quote is not rigorous. I hate using the word "collapse" because I think it's loaded from Bohr's Metaphysics. "But everybody else uses it..."
"So if everybody sez they're going to jump off a cliff, are you going to follow 'em?"

Well...ummm...no. I thought it was valuable to add to the discussion, however, since it pushes on the tension between SR and QM. I thought it was interesting that the naive-realist position (not necessarily G. E. Moore's) would state that the interference patterns should disappear - and they do not. QM wins again!

But SR still believes, in their reference frame, that they won.

That's why I posted.

CW
 
  • #134
DevilsAvocado said:
... but electromagnetism is 10^39 times stronger than gravitation ...

I think the Bohr model is pretty dead: there are incompatibilities with the empirical spectral lines, it violates the uncertainty principle, and even if you could magically fix all that - where is your single localized particle in the double-slit experiment?

It doesn’t work...

How does the Bohr model violate the uncertainty principle? The Bohr model does correctly predict the spectral lines of atomic hydrogen - that is something, at least. I get the impression that we feel very confident we have exhausted all mechanical analogies for things like the double-slit experiment, when we cannot possibly have: there is an infinitude of possible mechanical analogies we could use to model a particle like an electron (though all but one will be wrong). I don't know if anyone has seen the experiments with silicone droplets, but I saw them a while back and found them very interesting:

http://www.youtube.com/watch?v=nmC0ygr08tE

Also Morgan Freeman narrates:
http://www.youtube.com/watch?v=sGCtMKthRh4
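On the uncertainty-principle question above: the standard argument is that a Bohr orbit assigns the electron a definite radius and a definite momentum at the same time, which no quantum state allows (Δx Δp ≥ ħ/2), and the orbit's own scales are already of order ħ. A rough sketch of the arithmetic (plain Python; CODATA constants, the comparison at the end is my choice):

[code]
import math

hbar = 1.054571817e-34    # J s
m_e  = 9.1093837015e-31   # kg
e    = 1.602176634e-19    # C
eps0 = 8.8541878128e-12   # F/m

# Bohr model ground state (n = 1): a definite radius and a definite momentum.
a0 = 4 * math.pi * eps0 * hbar**2 / (m_e * e**2)  # Bohr radius ~ 5.29e-11 m
p0 = hbar / a0                                    # m*v from L = hbar, ~ 2.0e-24 kg m/s

# A sharp orbit means delta_r = delta_p = 0, violating delta_r*delta_p >= hbar/2.
# Even granting spreads as large as the orbit itself, the product is only ~ hbar:
print(a0 * p0 / hbar)     # exactly 1, i.e. right at the quantum scale
[/code]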
 
  • #135
Charles Wilson said:
*"...But what if we let relativity enter the game even deeper? What if the detectors are in relative motion such that each detector in its own reference frame analyzes its photon before the other?...

*"...once one assumes that the collapse is a real phenomenon, and once one considers specific models, then the conflict is real and testable"

*it will be done

Fundamental quantum optics experiments conceivable with satellites -- reaching relativistic distances and velocities
http://arxiv.org/abs/1206.4949
the line up:
David Rideout, Thomas Jennewein, Giovanni Amelino-Camelia, Tommaso F. Demarie, Brendon L. Higgins, Achim Kempf, Adrian Kent, Raymond Laflamme, Xian Ma, Robert B. Mann, Eduardo Martin-Martinez, Nicolas C. Menicucci, John Moffat, Christoph Simon, Rafael Sorkin, Lee Smolin, Daniel R. Terno.

Super Physics Smackdown: Relativity vs Quantum Mechanics...In Space

Read more: http://www.technologyreview.com/view/428328/super-physics-smackdown-relativity-v-quantum-mechanicsin-space/#ixzz2UyZfdG1L
From MIT Technology Review

*it will be done
(objective reduction models)

Observation of a kilogram-scale oscillator near its quantum ground state
http://iopscience.iop.org/1367-2630/11/7/073032/pdf/1367-2630_11_7_073032.pdf

http://eprints.gla.ac.uk/32707/1/ID32707.pdf

the line up:
B Abbott, R Abbott, R Adhikari, P Ajith, B Allen, G Allen, R Amin, S B Anderson, W G Anderson, M A Arain, M Araya, H Armandula, P Armor, Y Aso, S Aston, P Aufmuth...
 
  • #136
stevendaryl said:
That leaves me a little puzzled, still. Maybe somebody can give examples of a toy model that is contextual?

What I do understand is this: the argument that it is impossible for a hidden-variables theory to reproduce the predictions of quantum mechanics in an EPR-type twin-particle experiment assumes that the hidden variable is unaffected by choices made at the measuring devices. But such an effect is ruled out by locality, so I don't see why "contextuality" matters in that case.

http://digital.library.pitt.edu/u/ulsmanuscripts/pdf/31735033440391.pdf
 
  • #137
DevilsAvocado said:
I’m lost DrC... how could “Alice and Bob signal from the future” without an FTL-mechanism that is forbidden by both QM and SR?

I thought the same, I still don't know what to make of that answer.
 
  • #138
DevilsAvocado said:
I’m lost DrC... how could “Alice and Bob signal from the future” without an FTL-mechanism that is forbidden by both QM and SR?

I'm not sure if this is what he meant, but there are "time-symmetric" formulations of wave propagation in which the future affects the present in the same way the past does. It's not FTL, in the sense that the propagation speed is always <= c, although the propagation can be into the past as well as into the future. This is consistent with SR in the weak sense that there is no violation of Lorentz invariance.
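To illustrate stevendaryl's point, here is a minimal sympy sketch (my own construction, not anything from the thread): the wave equation itself is time-symmetric, so the "advanced" solution obtained by reversing time is just as legitimate as the "retarded" one, and both propagate at speed c.

[code]
import sympy as sp

x, t = sp.symbols('x t', real=True)
c = sp.symbols('c', positive=True)
f = sp.Function('f')

# Both f(x - c*t) and its time-reverse f(x + c*t) solve u_tt = c^2 u_xx,
# with propagation speed exactly c in either time direction.
for u in (f(x - c * t), f(x + c * t)):
    residual = sp.diff(u, t, 2) - c**2 * sp.diff(u, x, 2)
    print(sp.simplify(residual))  # prints 0 for both
[/code]

This is the weak sense in which time-symmetric formulations (e.g. the absorber-theory picture mentioned just below) stay within Lorentz-invariant physics.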
 
  • #139
bhobba said:
Conceptually it's very simple. Suppose you have some observable A = y1|b1><b1| + y2|b2><b2| + y3|b3><b3|, where |b3> means outcome |b1> or |b2> did not occur. Outcome |b1> occurs with probability |<u|b1>|^2 from the Born rule, where |u> is the state. Now consider the observable C = y1|b1><b1| + c2|c2><c2| + c3|c3><c3|. From the Born rule, outcome |b1> will occur with exactly the same probability even though the other outcomes are different. This is known as non-contextuality, because a property does not depend on what else you happen to be measuring with it. It allows a probability measure to be uniquely defined regardless of which basis a projector is part of, ie regardless of the other possible outcomes of an observation. Now it turns out, due to Gleason's theorem, that the assumption of non-contextuality all by itself is enough to prove Born's rule. In fact it would be a pretty silly choice of Hilbert space as the formalism for the states of QM if it wasn't true. This is what's meant by non-contextuality being unnatural and counter-intuitive.

But now look at it physically, forgetting the Hilbert space formalism. We have zero reason to believe that changing what else you measure will not affect the other things you measure at the same time - after all, you have a different apparatus. This is what's meant by the physical basis being unclear. And indeed interpretations of QM such as Bohmian Mechanics exist that are contextual.

Thanks
Bill
I guess you meant "This is what's meant by contextuality being unnatural and counter-intuitive."

Bill, I find this answer quite reasonable.
Maybe you can help me understand this better, especially the part where the physical basis is unclear, which connects with the QM completeness/incompleteness issue.
When you say that the Hilbert formalism is silly if we don't assume non-contextuality, I take you to mean that non-contextuality brings an independent probabilistic picture, and that this independence fits well with the superposition principle and therefore with vector spaces.
Going back to Gleason's theorem and why I was associating a point-particle model with the Born rule: the theorem proves that the Born rule for the probability of obtaining specific results from a given measurement follows naturally from the structure formed by the lattice of events in a real or complex Hilbert space. Now, lattices are discrete subgroups, formed by points (zero-dimensional, discrete topology), which can be physically interpreted as point particles. But the key starting point is the Hilbert space: its linearity allows the Born rule to be interpreted as following naturally from the lattice of point events.

As you rightly say, forgetting for a moment the Hilbert formalism, we don't have compelling reasons to rule out contextuality; but the only popular contextual interpretation seems to be BM, probably because the Hilbert formalism tightly limits what one can do with a contextual interpretation.
 
  • #140
stevendaryl said:
I'm not sure if this is what he meant, but there are "time-symmetric" formulations of wave propagation in which the future affects the present in the same way the past does. It's not FTL, in the sense that the propagation speed is always <= c, although the propagation can be into the past as well as into the future. This is consistent with SR in the weak sense that there is no violation of Lorentz invariance.

Something like Wheeler-Feynman absorber theory? But that was refuted many years ago; among other things, it assumed no self-interaction of particles.
 
