Murray Gell-Mann on Entanglement

In summary: most physicists working in this field agree that when you measure one of the photons, it does something to the other one; that doesn't mean they reject non-locality. It's a little more subtle than "non-local means measurement-dependent".
  • #281
A. Neumaier said:
The moon need only to have a mean trajectory, given by the expectation of the center of mass of the position operators of its atoms. Its standard deviation is far below the radius of the moon and hence negligible.

Yes. If there were actually a proof that the laws of quantum mechanics imply that macroscopic objects have negligible standard deviation in their position, then there wouldn't be a measurement problem. But it doesn't seem to me that there could be such a proof. Imagine an isolated system consisting of an experimenter, a Stern-Gerlach device, and a source of electrons. The experimenter puts an electron into a state of spin-up in the x-direction, then later measures the spin in the z-direction. If it's spin-up, he goes to Rome, and if it's spin-down, he goes to Paris. It seems to me that the quantum mechanical evolution of the entire system would result in a 50% probability of the experimenter going to Rome, and a 50% probability of the experimenter going to Paris. The standard deviation of his position would be huge.
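To make this concrete (a sketch, writing [itex]|R\rangle[/itex] and [itex]|P\rangle[/itex] for the macroscopically distinct "experimenter in Rome" and "experimenter in Paris" states), unitary evolution would give

[tex]\frac{1}{\sqrt{2}}\left(|\uparrow_z\rangle + |\downarrow_z\rangle\right)|ready\rangle \;\longrightarrow\; \frac{1}{\sqrt{2}}\left(|\uparrow_z\rangle|R\rangle + |\downarrow_z\rangle|P\rangle\right),[/tex]

and in that final state the standard deviation of the experimenter's position is of the order of the Rome-Paris distance, roughly 1000 km.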
 
  • #282
stevendaryl said:
Yes. If there were actually a proof that the laws of quantum mechanics imply that macroscopic objects have negligible standard deviation in their position, then there wouldn't be a measurement problem.
For properly normalized extensive macroscopic properties (and this includes the center of mass operator), there is such a proof in many treatises of statistical mechanics. It is the quantum analogue of the system size expansion for classical stochastic processes. For example, see Theorem 9.3.3 and the subsequent discussion in my online book. But you can find similar statements in all books on stochastic physics where correlations are discussed in a thermodynamic context if you care to look, though usually for different, thermodynamically relevant variables.

This property (essentially a version of the law of large numbers) is indispensable for the thermodynamic limit that justifies thermodynamics microscopically, since in this limit all uncertainties disappear and classical thermodynamics and hydromechanics appear as effective theories.
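The scaling is worth displaying (a schematic version, assuming only short-range correlations among the [itex]N[/itex] constituents): for the normalized center of mass [itex]\bar X = \frac{1}{N}\sum_i x_i[/itex],

[tex]\sigma^2(\bar X) = \frac{1}{N^2}\sum_{i,j}\mathrm{cov}(x_i, x_j) = O\!\left(\frac{1}{N}\right),[/tex]

so for a body like the moon, with [itex]N[/itex] of order [itex]10^{49}[/itex] atoms, the standard deviation is far below any observable scale.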

The measurement problem appears only because people mistake the highly idealized von Neumann measurement (treated in introductory texts) - which applies only to very specific collapse-like measurements such as that of electron spin - for the general notion of a measurement, and are therefore led to interpret the reading from a macroscopic instrument in these terms, inventing for it a collapse that has no scientific basis.

And unfortunately, physics education today is so fragmented and research so specialized that people working on resolving issues in the quantum foundations typically never had an in-depth education in statistical mechanics. As a consequence they believe that the textbook foundations are the real ones...

As for your thought experiment, the experimenter cannot travel if the system you describe is truly isolated. But once it is not isolated, your argument breaks down.
 
  • #283
stevendaryl said:
Imagine an isolated system consisting of an experimenter, a Stern-Gerlach device, and a source of electrons. The experimenter puts an electron into a state of spin-up in the x-direction, then later measures the spin in the z-direction. If it's spin-up, he goes to Rome, and if it's spin-down, he goes to Paris. It seems to me that the quantum mechanical evolution of the entire system would result in a 50% probability of the experimenter going to Rome, and a 50% probability of the experimenter going to Paris. The standard deviation of his position would be huge.
A. Neumaier said:
As for your thought experiment, the experimenter cannot travel if the system you describe is truly isolated. But once it is not isolated, your argument breaks down.
Yes, I think Arnold has a point here. The closest we come to an isolated system in this case is the Earth itself, and the experimenter going to Rome or Paris would not influence the Earth's center of gravity trajectory, nor its standard deviation.
 
  • #284
Demystifier said:
But one of the reasons it [QM] hasn't failed so far is because it remained agnostic on many interesting questions.

A. Neumaier said:
on many interesting questions that can be checked experimentally? What would be an example?

Demystifier said:
What orientation of the Stern-Gerlach apparatus will the experimentalist freely choose in the next experimental run? :biggrin:

That's not a good example! :biggrin:

The canonical example of an interesting question that can be checked experimentally, and that QM is agnostic on, is simply: what value will this measurement give? For instance, consider a particle with definite z-spin. When measured in the x-direction, it will be spin-up or spin-down, 50/50. QM is agnostic regarding which of these will happen. Indeed, standard QM says it's impossible to predict; but that's an unprovable over-statement. You may think this is trivial, but it's not. It's the key difference between QM and classical physics.
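For definiteness, the 50/50 is just the Born rule: with the particle prepared in [itex]|\uparrow_z\rangle[/itex] and [itex]|\uparrow_x\rangle = \frac{1}{\sqrt 2}(|\uparrow_z\rangle + |\downarrow_z\rangle)[/itex],

[tex]P(\uparrow_x) = |\langle \uparrow_x|\uparrow_z\rangle|^2 = \frac{1}{2}.[/tex]

QM supplies this distribution and nothing more; it is silent on which outcome occurs in any single run.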

Note that according to QM we could predict the result perfectly IF we had access to info outside the particle's past light cone. In particular, if we could access the next second of its future light cone. The typical Bell-type experimental situation is similar. If Bob had access to Alice's measurement, outside his past light cone, he could predict his own measurement (perfectly, if at the same angle). From this point of view we can say that the essential peculiarity of QM, compared to classical, is that in order to completely predict results, info beyond the past light cone is required.
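(For the standard singlet case, with measurement directions [itex]\mathbf a[/itex] and [itex]\mathbf b[/itex] at angle [itex]\theta[/itex], QM predicts the correlation

[tex]E(\mathbf a, \mathbf b) = -\,\mathbf a \cdot \mathbf b = -\cos\theta,[/tex]

so at [itex]\theta = 0[/itex] the outcomes are perfectly anti-correlated: knowing Alice's result would fix Bob's with certainty, but that information lies outside his past light cone.)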

Anyway Demystifier's statement is justified. A traditional classical physicist - such as Einstein - considers it "cheating" for QM to simply refuse to predict (one single) experimental result. If we ever come up with a new, deeper, theory that can do that, Demystifier's (and Einstein's) point would become obvious and accepted by all. Until then, it remains rather subtle and requires some cogitation to appreciate.
 
  • #285
secur said:
the essential peculiarity of QM, compared to classical, is that in order to completely predict results, info beyond the past light cone is required.
This is a misunderstanding. In classical relativistic physics, in order to completely predict results, info beyond the past light cone of the here-and-now is also required!
 
  • #286
A. Neumaier said:
As for your thought experiment, the experimenter cannot travel if the system you describe is truly isolated. But once it is not isolated, your argument breaks down.

We've been through this before, and it still doesn't make any sense to me. There is nothing in quantum mechanics that bounds the standard deviation of a variable such as position. A single electron can be in a superposition of being here, and being 1000 miles away. A single atom can be in such a superposition. A single molecule can be in such a superposition. There is nothing in quantum mechanics that says that a macroscopic object can't be in such a superposition.

Some people say that decoherence prevents such superpositions, but the way I understand decoherence, what it really does is to rapidly cause the superposition to spread, to eventually "infect" the entire causally connected universe.
 
  • #287
Heinera said:
Yes, I think Arnold has a point here. The closest we come to an isolated system in this case is the Earth itself, and the experimenter going to Rome or Paris would not influence the Earth's center of gravity trajectory, nor its standard deviation.

The only significance of being "isolated" is that isolation is needed to be able to talk about the state of a subsystem. Because of decoherence, if you tried to place a macroscopic object into a macroscopic superposition, the superposition would rapidly spread to the entire universe. So we can't actually analyze macroscopic superpositions unless (a la many-worlds) we are willing to consider the wave function of the entire universe.

But conceptually, we can imagine a composite system consisting of an electron plus a macroscopic measuring device. If the electron being spin-up results in the measuring device going into macroscopic state U, and the electron being spin-down results in the measuring device going into macroscopic state D, then the electron being in a superposition of spin-up and spin-down would result in the measuring device going into a superposition of those two states. That's a consequence of linearity.
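Schematically (a sketch, with [itex]|R\rangle[/itex] the device's ready state): if [itex]|\uparrow\rangle|R\rangle \rightarrow |\uparrow\rangle|U\rangle[/itex] and [itex]|\downarrow\rangle|R\rangle \rightarrow |\downarrow\rangle|D\rangle[/itex], then linearity forces

[tex]\left(\alpha|\uparrow\rangle + \beta|\downarrow\rangle\right)|R\rangle \;\longrightarrow\; \alpha|\uparrow\rangle|U\rangle + \beta|\downarrow\rangle|D\rangle.[/tex]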
 
  • #288
Stevendaryl, to my understanding, decoherence is just the result of the reversibility of a system becoming extremely unlikely through chain interactions. The farther apart the states you refer to are, the more difficult it is to maintain said reversibility.
 
  • #289
Jilang said:
Stevendaryl, to my understanding, decoherence is just the result of the reversibility of a system becoming extremely unlikely through chain interactions. The farther apart the states you refer to are, the more difficult it is to maintain said reversibility.

I agree with that. But decoherence figures into discussions about macroscopic superpositions in the following way: once decoherence happens, it becomes mathematically intractable to describe the quantum state as a superposition, so it is instead described as a mixed state. But my claim is that there is nothing in quantum mechanics that would then select a single alternative out of the set of possibilities described by that mixed state.
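In density-matrix language (a sketch): tracing out the environment leaves approximately

[tex]\rho \approx |\alpha|^2\,|\uparrow, U\rangle\langle \uparrow, U| \;+\; |\beta|^2\,|\downarrow, D\rangle\langle \downarrow, D|,[/tex]

which encodes the probabilities of the two alternatives but contains no rule for selecting one of them.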
 
  • #290
stevendaryl said:
Some people say that decoherence prevents such superpositions, but the way I understand decoherence, what it really does is to rapidly cause the superposition to spread, to eventually "infect" the entire causally connected universe.
No it can't. The more things that become 'infected', the higher the probability that one of them will interact and bring it all to an end.
The natural spread extent is very small in space and time.
 
  • #291
Isn't that where the Born rule comes into play? Doesn't it just select the appropriate one for the detector?
 
  • #292
Mentz114 said:
No it can't. The more things that become 'infected', the higher the probability that one of them will interact and bring it all to an end.
The natural spread extent is very small in space and time.

This sounds like GRW.

But I agree with stevendaryl on everything here; it's unclear where the pure superposition is supposed to end.
 
  • #293
ddd123 said:
This sounds like GRW.
I'll look up GRW.
I was extending the viral analogy. It probably won't work unless there are fewer interactions that multiply than those that fix.

..., it's unclear where the pure superposition is supposed to end.
I wish I knew. Is the 'end' even defined?
 
  • #294
stevendaryl said:
But my claim is that there is nothing in quantum mechanics that would then select a single alternative out of the set of possibilities described by that mixed state.
If you allow dissipative sub-systems in QT then it is the initial conditions that decide the outcome.
 
  • #295
Mentz114 said:
I wish I knew. Is the 'end' even defined ?

In usual quantum theory, when you look at a measurement instrument's pointer it's pretty well defined at that point :D but you have Avogadro's-number-like orders of magnitude in between to narrow it down further.
 
  • #296
ddd123 said:
This sounds like GRW.
I looked up the Ghirardi-Rimini-Weber theory (GRW) and it is sort of similar to what I posted. Thanks for telling me about it.
 
  • #297
Mentz114 said:
If you allow dissipative sub-systems in QT then it is the initial conditions that decide the outcome.

Do they? That would seem to mean that if you are trying to measure the spin of an electron, then initial conditions in the measuring device determine the final measurement result. That's a kind of hidden-variable theory, except that the variable is not in the thing being measured, but in the thing doing the measurement.

I would think that that would cause problems for EPR. There, you produce a pair of correlated spin-1/2 particles. I don't see how initial conditions in the two distant measuring devices could conspire to always produce anti-correlated results.
 
  • #298
Mentz114 said:
No it can't. The more things that become 'infected', the higher the probability that one of them will interact and bring it all to an end.
The natural spread extent is very small in space and time.

An interaction doesn't reduce a superposition to a single value; it instead causes one subsystem that is in a superposition to cause a second subsystem to also be in a superposition. That's what I mean by the superposition spreading to infect the rest of the universe.
 
  • #299
Mentz114 said:
I looked up the Ghirardi-Rimini-Weber theory (GRW) and it is sort of similar to what I posted. Thanks for telling me about it.

But that theory isn't standard QM, it's a proposed alternative theory.
 
  • #300
secur said:
... the essential peculiarity of QM, compared to classical, is that in order to completely predict results, info beyond the past light cone is required.

A. Neumaier said:
This is a misunderstanding. In classical relativistic physics, in order to completely predict results, info beyond the past light cone of the here-and-now is also required!

I certainly thought that in classical relativistic physics the past light cone(s) of the objects in question (including the space, of course, with its curvature; and the stress-energy tensor) contain all info that could possibly affect the physics. And, theoretically perfect prediction is possible. (In fact given that the theory is local all you really need is "here-and-now" information - anything in contact - but that's not relevant at the moment). Can you please explain further?

[EDIT] assume there's only one inertial frame used for both observations and predictions ... I can't think of any other loopholes I might be missing
 
  • #301
stevendaryl said:
But that theory isn't standard QM, it's a proposed alternative theory.
I don't claim anything for GRW. It has a passing similarity to what I was thinking.
 
  • #302
secur said:
Anyway Demystifier's statement is justified. A traditional classical physicist - such as Einstein - considers it "cheating" for QM to simply refuse to predict (one single) experimental result. If we ever come up with a new, deeper, theory that can do that, Demystifier's (and Einstein's) point would become obvious and accepted by all. Until then, it remains rather subtle and requires some cogitation to appreciate.

I think one has to realize that this point is absolutely standard, and that one can take Bohr's or Einstein's view coherently. What is being debated here is whether the claims by vanhees71 (following Ballentine), Arnold Neumaier etc. are correct - their views do not fall into either Bohr's class or Einstein's. Certainly, they are not textbook views - one would have to believe that Bohr, Einstein, Dirac, Landau & Lifshitz, Cohen-Tannoudji, Diu, Laloe, Bell, Weinberg etc. all failed to understand quantum mechanics.
 
  • #303
secur said:
I certainly thought that in classical relativistic physics the past light cone(s) of the objects in question (including the space, of course, with its curvature; and the stress-energy tensor) contain all info that could possibly affect the physics. And, theoretically perfect prediction is possible. (In fact given that the theory is local all you really need is "here-and-now" information - anything in contact - but that's not relevant at the moment). Can you please explain further?

[EDIT] assume there's only one inertial frame used for both observations and predictions ... I can't think of any other loopholes I might be missing

Maybe he's referring to "cosmic censorship" scenarios in general relativity; it's the only example I know of where that stops being true.
 
  • #304
stevendaryl said:
Do they? That would seem to mean that if you are trying to measure the spin of an electron, then initial conditions in the measuring device determine the final measurement result. That's a kind of hidden-variable theory, except that the variable is not in the thing being measured, but in the thing doing the measurement.
You misunderstand. The initial conditions are set after preparation and before measurement.
 
  • #305
stevendaryl said:
But my claim is that there is nothing in quantum mechanics that would then select a single alternative out of the set of possibilities described by that mixed state.

Yes there is: the so-called collapse, when a measurement is made. Of course you mean, apart from that.

Mentz114 said:
If you allow dissipative sub-systems in QT then it is the initial conditions that decide the outcome.

GRW posits spontaneous collapse. Presumably that has a "passing similarity" to your "dissipative subsystems"? But stevendaryl's response applies equally well to your idea, as to GRW:

stevendaryl said:
But that theory isn't standard QM, it's a proposed alternative theory.
 
  • #306
Mentz114 said:
You misunderstand. The initial conditions are set after preparation and before measurement.

I don't see how that could work. If an electron being spin-up causes a detector to enter state [itex]UP[/itex], and an electron being spin-down causes a detector to enter state [itex]DOWN[/itex], then by the linearity of the evolution equations of quantum mechanics, an electron in a superposition of spin-up and spin-down would cause a detector to enter into a superposition of states [itex]UP[/itex] and [itex]DOWN[/itex], if there is nothing going on in the detectors other than ordinary quantum mechanics.

Of course, something macroscopic like a detector will interact with the environment, which enormously complicates things. But the same thing applies to electron + detector + environment: linearity of quantum mechanics would imply that the composite system will enter into a superposition. So, we would end up with a superposition of two different composite states: [itex]|\psi_{up}\rangle[/itex], where the electron is spin-up, the detector detects spin-up, and the environment is in whatever condition is appropriate for the environment interacting with a detector that detected spin-up, and [itex]|\psi_{down}\rangle[/itex], where all three components are in states appropriate for the electron being spin-down.

It doesn't make sense to say that details of the detector, or the environment, will cause it to shift to just one "branch". That would violate the linearity of the evolution equations. You could propose new, nonlinear corrections to quantum mechanics that might accomplish the kind of objective collapse that you're talking about, but it isn't possible in standard quantum mechanics. (Unless you consider wave function collapse to be part of standard quantum mechanics, which some people do.)
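Written out (a sketch, continuing the notation above, with [itex]|E_U\rangle, |E_D\rangle[/itex] the corresponding environment states):

[tex]|\Psi\rangle = \alpha\,|\uparrow\rangle|U\rangle|E_U\rangle + \beta\,|\downarrow\rangle|D\rangle|E_D\rangle.[/tex]

For a fixed initial state of detector and environment, no unitary (hence linear) evolution can map the superposed input to just one of these branches.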
 
  • #307
secur said:
Yes there is: the so-called collapse, when a measurement is made. Of course you mean, apart from that.

Right, the issue is whether a separate "collapse" hypothesis is needed, or whether the effect of collapse is derivable from just unitary quantum evolution.
 
  • #308
atyy said:
I think one has to realize that this point is absolutely standard, and that one can take Bohr's or Einstein's view coherently. What is being debated here is whether the claims by vanhees71 (following Ballentine), Arnold Neumaier etc. are correct - their views do not fall into either Bohr's class or Einstein's. Certainly, they are not textbook views - one would have to believe that Bohr, Einstein, Dirac, Landau & Lifshitz, Cohen-Tannoudji, Diu, Laloe, Bell, Weinberg etc. all failed to understand quantum mechanics.

I'm sorry, your comment seems orthogonal to my post. Please make the connections (which, no doubt, exist) explicit, if you like.

ddd123 said:
Maybe he's referring to "cosmic censorship" scenarios in general relativity; it's the only example I know of where that stops being true.

This brings up an interesting point. "Cosmic Censorship" - which of course is only a conjecture - proposes that a naked singularity never happens. Therefore if the evolution of some system would otherwise lead to one, it won't; instead it will do something else. On the face of it that sounds like Nature must "look ahead" to see the result of some process, and if Nature sees that it will be "censored", then it changes the (other) laws of physics in this one instance, to avoid that "illegal" outcome. That's teleology.

Ignore QM entirely for this discussion, stick to purely classical, because QM can confuse the following points I want to make.

For perspective consider the conservation laws of energy and momentum, applied to a couple of (perfectly elastic) billiard balls. As we all know you can determine how they'll bounce off each other most easily by applying those conservation laws. The two resulting simultaneous equations are easily solved. But certainly we don't normally think that Nature does such a look-ahead computation. Rather the billiard ball trajectories evolve via differential equations, "contact transformations", according to Newton's laws of motion and the law of elastic collision. Nature never "looks ahead" during this process. But it so happens that, when the collision is done and the balls are heading off to infinity, energy and momentum have been conserved.
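(For the record, a standard worked example in one dimension: conservation of momentum and kinetic energy for masses [itex]m_1, m_2[/itex] with incoming velocities [itex]v_1, v_2[/itex] yields the outgoing velocities

[tex]v_1' = \frac{(m_1 - m_2)v_1 + 2 m_2 v_2}{m_1 + m_2}, \qquad v_2' = \frac{(m_2 - m_1)v_2 + 2 m_1 v_1}{m_1 + m_2}.[/tex]

We get these by solving the two conservation equations simultaneously; Nature gets them by integrating the contact forces moment by moment, with no look-ahead.)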

There are many similar examples, e.g. various forms of the Action Principle: many places where, by solving the original dynamical differential equations, we come up with (very useful) global constraints, expressed as integral equations. Loosely we say that Nature "must obey" these. But - in the normal ontology of classical physics - we don't imagine Nature is looking ahead, beyond the past light cone, to decide what to do. The instant-by-instant diff EQs are all Nature knows about.

It's the same for Cosmic Censorship. If it's true that Nature never "allows" a naked singularity, it must happen due to ordinary physical laws (including, perhaps, currently-unknown ones) which operate only on the currently available info (past light cone) in such a way that, it turns out, naked singularity never happens.

You may be right that A. Neumaier is thinking of something like this; of course, we don't know. I was planning to give the above answer if he did respond as you suggest.

This general issue of "teleology in physics" is wandering off-topic; there's a lot more one could say about it. Bottom line, I think it should always be viewed as merely a convenient heuristic - sometimes very convenient - but Nature never really does "look ahead". Ignoring QM, where it's not so clear.

stevendaryl said:
Right, the issue is whether a separate "collapse" hypothesis is needed, or whether the effect of collapse is derivable from just unitary quantum evolution.

For what my opinion's worth, it seems very clear that mere unitary quantum evolution can't do it. You need an extra hypothesis to explain the collapse. Every alternative interpretation has one - including MWI, despite their claim that they don't.
 
  • #309
stevendaryl said:
There is nothing in quantum mechanics that bounds the standard deviation of a variable such as position. A single electron can be in a superposition of being here, and being 1000 miles away. A single atom can be in such a superposition. A single molecule can be in such a superposition. There is nothing in quantum mechanics that says that a macroscopic object can't be in such a superposition.
See the new thread https://www.physicsforums.com/threads/the-typical-and-the-exceptional-in-physics.885480/
 
  • #310
ddd123 said:
it's unclear where the pure superposition is supposed to end.
It is not there in the first place. It is an artifact of the initial idealization.
 
  • #311
secur said:
Can you please explain further?
One needs the information on a Cauchy surface, not on the past light cone, to make predictions. More precisely, to predict classically what happens at a point in the future of a given observer, the latter's present defines (at least in sufficiently nice spacetimes) a Cauchy surface where all information must be available to infer the desired information.
It is no different in quantum mechanics when one makes (probabilistic) predictions. The apex of the light cone is the point in space-time at which all information needed to do the statistics is available. See https://www.physicsforums.com/posts/5370260 and the discussion there between posts #187 and #230.
 
  • #312
stevendaryl said:
Under what circumstances does an electron measure its own spin? Never, right? So it doesn't make any sense at all to say that an isolated electron has a 50% probability of being spin-up in the z-direction. What about a pair of electrons? When does one electron measure the spin of another electron? Never, right? So for a pair of electrons, probability doesn't make any sense.

Probability only makes sense for an interaction in which one of the subsystems is a macroscopic measuring device.
You measure the spin, e.g., with a Stern-Gerlach apparatus, which leads to an entanglement between the spin component and the position of the particle, which then can be detected. All you know about the outcome of such a measurement is that with 50% probability you find the one or the other possible value of this quantity. Of course, this doesn't tell you much (in fact as little as possible in the sense of information theory) about a single spin. Probabilities in practice are relative frequencies of the occurrence of the property when you perform measurements on an ensemble of independently prepared spins in this state. I don't know why we have to repeat this all the time in our discussions. It's common practice with all experiments in all labs around the globe!
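Schematically (a sketch of the spin-position entanglement just described): the field gradient correlates the spin with the beam's spatial wave function,

[tex]\left(\alpha|\uparrow\rangle + \beta|\downarrow\rangle\right)|\psi_0\rangle \;\longrightarrow\; \alpha|\uparrow\rangle|\psi_{up}\rangle + \beta|\downarrow\rangle|\psi_{down}\rangle,[/tex]

with [itex]|\psi_{up}\rangle[/itex] and [itex]|\psi_{down}\rangle[/itex] spatially separated, so a position detection registers the spin value with probabilities [itex]|\alpha|^2[/itex] and [itex]|\beta|^2[/itex].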
 
  • #313
stevendaryl said:
To me, if you and your equipment are all described by the same physics as electrons and photons, etc., then to say that "I prepared things in such-and-such a way" means "Me and my equipment were put into such and such a macroscopic state". So there is a notion of "state" for macroscopic objects that does not depend on yet another system to prepare them in that state. They can put themselves into a particular state. But you're saying that for an electron, or a photon, or any microscopic system, the only notion of state is a preparation procedure by a macroscopic system. That seems incoherent to me. At best, it's a heuristic, but it can't possibly be an accurate description of what's going on. If macroscopic systems have properties without being observed, then why can't microscopic systems?
Common practice today disproves you. It has become more and more possible in recent decades to handle single particles and photons and prepare them in many kinds of pure and mixed states, everything in accordance with standard QT.
 
  • #314
vanhees71 said:
Common practice today disproves you.
So are you saying that a particle (a microscopic system) can acquire a definite quantum state spontaneously?
 
  • #315
No, they acquire a definite quantum state by being prepared in it. I don't know what you mean by "spontaneously".
 
