Murray Gell-Mann on Entanglement

  • #316
atyy said:
I think one has to realize that this point is absolutely standard, and that one can take Bohr's or Einstein's view coherently. What is being debated here is whether the claims by vanhees71 (following Ballentine), arnold neumaier etc. are correct: their views fall into neither Bohr's class nor Einstein's. Certainly, they are not textbook views; one would have to believe that Bohr, Einstein, Dirac, Landau & Lifshitz, Cohen-Tannoudji, Diu, Laloe, Bell, Weinberg etc. all failed to understand quantum mechanics.
According to Ballentine, it really is the case that all the men mentioned failed to understand quantum mechanics properly.

Each of them (including Ballentine) has a slightly different view of QM. Personally I like Bell's view the most, but I see some merit in all of them.
 
  • #317
vanhees71 said:
No, they require a definite quantum state by being prepared in it. I don't know what you mean by "spontaneously".
With "spontaneously" I mean property of macroscopic systems to prepare themselves in definite state as suggested by stevendaryl:
stevendaryl said:
So there is a notion of "state" for macroscopic objects that does not depend on yet another system to prepare them in that state.
 
  • #318
Demystifier said:
According to Ballentine, it really is the case that all the men mentioned failed to understand quantum mechanics properly.

Each of them (including Ballentine) has a slightly different view of QM. Personally I like Bell's view the most, but I see some merit in all of them.

That's definitely not true. My view is very conservative and minimal. Weinberg's point of view, according to his newest textbook on QM, is that the interpretation problem is unsolved. Landau & Lifshitz and Dirac are very close to my view. I've never understood Bohr, who used to write very enigmatic papers. Einstein's view is, in my opinion, ruled out by the outcome of Bell experiments. I don't know the other books mentioned well enough to say anything about their view on interpretation.
 
  • #319
zonde said:
With "spontaneously" I mean property of macroscopic systems to prepare themselves in definite state as suggested by stevendaryl:
Think about it. A macroscopic system tends to "prepare itself" in a state of (local) thermal equilibrium (let's not consider systems with long-ranged forces for the moment), but that takes time. So I still don't know what you mean by "spontaneously".
 
  • #320
A. Neumaier said:
It is nowhere there in the first place. It is an artifact of the initial idealization.

That's a very interesting statement. I'm not sure I understand your view on things, so could you clarify what you mean here?

Are you suggesting that the 'textbook' axioms are incorrect (I think you called them 'ridiculous' in another post)?

Or are you suggesting that superposition (in a quantum sense) is not a physical phenomenon?

I agree that it may be extremely difficult in practice to devise an experiment capable of testing an 'idealized' quantum system, so I can see why one might say that the idealized axioms don't apply FAPP. But is it your view that the idealized axioms are actually wrong?

Or is your view that the states, wavefunctions, and mathematical machinery of QM are nothing more than a collection of mathematical devices, divorced from 'reality', that allows us to calculate probabilities in experiments? So the maths gets us the right answers but tells us absolutely nothing about what might be 'going on'?
 
  • #321
vanhees71 said:
That's definitely not true. My view is very conservative and minimal. Weinberg's point of view, according to his newest textbook on QM, is that the interpretation problem is unsolved. Landau & Lifshitz and Dirac are very close to my view. I've never understood Bohr, who used to write very enigmatic papers. Einstein's view is, in my opinion, ruled out by the outcome of Bell experiments. I don't know the other books mentioned well enough to say anything about their view on interpretation.
Is it so hard to press the Quote button? :cool:
 
  • #322
Demystifier said:
Is it so hard to press the Quote button? :cool:
I usually only quote a message if my answer is not directly after the message I refer to. That doesn't work in this thread, because the frequency of answers is too high. zonde was quicker with his posting than I could write mine. For clarity I copied the quote into my message. Sorry for the confusion.
 
  • #323
vanhees71 said:
That's definitely not true.
What exactly is definitely not true? I really think that Ballentine thinks that most of the others have not understood QM properly.
 
  • #324
Demystifier said:
What exactly is definitely not true? I really think that Ballentine thinks that most of the others have not understood QM properly.
It's definitely not true that I think that all the "founding fathers" of QT were wrong or didn't understand their own theory. Ballentine, in my opinion, also follows just standard QT. He even emphasizes its bare physics content, and there is no contradiction between the "minimal interpretation" and the Copenhagen flavor without collapse. As I said, I never understood Bohr completely, but as far as I can see he had a pretty similar view, taking quantum states as epistemic.
 
  • #325
Simon Phoenix said:
Are you suggesting that the 'textbook' axioms are incorrect
They are appropriate for an introductory course where the emphasis is on simple, paradigmatic systems. But already a simple position measurement is not covered, since the state cannot collapse to an eigenstate: position has no normalizable eigenstates. Realistic measurement is a highly complex subject, not something appropriate for foundations.
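A quick way to see the normalizability problem (a one-line sketch, writing ##|x_0\rangle## for the would-be position eigenstate with ##\langle x|x'\rangle=\delta(x-x')##):
$$\big\|\,|x_0\rangle\,\big\|^2 = \langle x_0|x_0\rangle = \delta(0) = \infty,$$
so ##|x_0\rangle## is not a vector in the Hilbert space at all, and a "collapse onto the position eigenstate" is not even defined.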
 
  • #326
vanhees71 said:
Ballentine, in my opinion, also follows just standard QT. He even emphasizes its bare physics content, and there is no contradiction between the "minimal interpretation" and the Copenhagen flavor without collapse.
But why, then, did Ballentine make a wrong prediction about the quantum Zeno effect? Is it just a little mistake that can happen to anyone? Or is it a deep disagreement with the others?
 
  • #327
vanhees71 said:
He even emphasizes its bare physics content, and there is no contradiction between the "minimal interpretation" and the Copenhagen flavor without collapse.
Can you clarify what you mean by "without collapse"? Is it just a matter of words, calling it an update instead of a collapse, as in the earlier discussion with atyy? Or is there a difference in the approach to the calculations? How do you do without the update in all experimental scenarios?
 
  • #328
vanhees71 said:
Think about it. A macroscopic system tends to "prepare itself" in a state of (local) thermal equilibrium (let's not consider systems with long-ranged forces for the moment), but that takes time. So I still don't know what you mean by "spontaneously".

The point is that a measuring device does not have yet another measuring device measuring it. So the idea that the state of a system is only meaningful in predicting probabilities for what a measuring device would measure is not true for macroscopic systems.
 
  • #329
vanhees71 said:
Common practice today disproves you. It has become more and more possible in recent decades to handle single particles and photons and to prepare them in many kinds of pure and mixed states, everything in accordance with standard QT.

I don't understand how your remarks address what I said. For a microscopic system, the "state" is meaningful in two ways: (1) it encodes the preparation procedure needed to put the system in that state, and (2) it gives the probabilities for future measurements. But for a macroscopic system, there is a notion of state that doesn't have either of those features. A macroscopic system simply is in one definite state or another.
 
  • #330
ddd123 said:
Can you clarify what you mean by "without collapse"? Is it just a matter of words, calling it an update instead of a collapse, as in the earlier discussion with atyy? Or is there a difference in the approach to the calculations? How do you do without the update in all experimental scenarios?
In any case, there is no difference in the calculations when it comes to the physical content of quantum theory. The minimal interpretation just says that the state carries probabilistic information about the outcomes of future measurements and nothing else.
 
  • #331
Demystifier said:
But why, then, did Ballentine make a wrong prediction about the quantum Zeno effect? Is it just a little mistake that can happen to anyone? Or is it a deep disagreement with the others?
Yes, I think that's simply a mistake. Why one should deny the quantum Zeno effect on the basis of the minimal interpretation is not clear to me.
 
  • #332
Did Ballentine actually change his mind, though? The last time I read the wiki article, it seemed he was still arguing against the Zeno effect, so I have the impression something is at stake.
 
  • #333
stevendaryl said:
I don't understand how your remarks address what I said. For a microscopic system, the "state" is meaningful in two ways: (1) it encodes the preparation procedure needed to put the system in that state, and (2) it gives the probabilities for future measurements. But for a macroscopic system, there is a notion of state that doesn't have either of those features. A macroscopic system simply is in one definite state or another.
The macroscopic observables, which are an average over a vast amount of microscopic observables##^*##, appear classical. Of course, on the microscopic level a macroscopic system is described by QT. There's no contradiction between these two levels of description.

##^*## corrected due to the hint in #340
 
  • #334
ddd123 said:
Did Ballentine actually change his mind, though? The last time I read the wiki article, it seemed he was still arguing against the Zeno effect, so I have the impression something is at stake.
Can you share the link to that wiki?
 
  • #336
vanhees71 said:
The macroscopic observables, which are an average over a vast amount of microscopic states, appear classical. Of course, on the microscopic level a macroscopic system is described by QT. There's no contradiction between these two levels of description.

Claiming it doesn't make it so. If each of the microscopic states is only meaningful in that it makes predictions for future measurements, then how does a macroscopic state acquire a meaning that doesn't involve future measurements?
 
  • #337
stevendaryl said:
Claiming it doesn't make it so. If each of the microscopic states is only meaningful in that it makes predictions for future measurements, then how does a macroscopic state acquire a meaning that doesn't involve future measurements?

More specifically: why does an average over a vast number of microscopic states, each of which has meaning only in terms of future measurements, produce a macroscopic value that has meaning independent of measurements? That seems like an outlandishly improbable claim. That doesn't make it false, but it shouldn't be a default assumption without further argument supporting it.
 
  • #338
ddd123 said:
Ok, I cannot see a problem with [46]. Indeed there's no "collapse", but just the interaction between the atom (simplified to a three-level toy model) and the RF field that causes the "quantum Zeno effect". So, of course, Ballentine is not denying the measured facts.
 
  • #339
stevendaryl said:
the state of a system is only meaningful in predicting probabilities for what a measuring device would measure is not true for macroscopic systems.
The (always mixed) state of a measurement device indeed fully determines the measurement reading to a very high accuracy, not only probabilistically. That's how measurement devices are made.
 
  • #340
vanhees71 said:
The macroscopic observables, which are an average over a vast amount of microscopic states
over a vast amount of microscopic observables, not states!
 
  • #341
A. Neumaier said:
The (always mixed) state of a measurement device indeed fully determines the measurement reading to a very high accuracy, not only probabilistically. That's how measurement devices are made.

I think you misunderstood what I said. There are two states involved here: the state of the measuring device, and the state of the system being measured. The first has a meaning that does not depend on measurements by yet other measuring devices.
 
  • #342
Macroscopic state variables such as the position of the center of mass of a macroscopic object have two features that are different from microscopic state variables: (1) There are no observed interference effects between different states, and (2) they have a small standard deviation (relative to the appropriate scale for the variable; for example, the standard deviation for the position of a brick is typically small compared to the size of the brick). Decoherence explains the first effect, but not the second. Pure quantum mechanics in the minimal interpretation cannot explain why macroscopic state variables have definite (up to a small standard deviation) values.

Bohmian mechanics halfway explains it. According to that interpretation, all objects have definite positions at all times. However, in Bohmian mechanics the state, or wave function, evolves smoothly at all times, so in those cases where quantum mechanics would predict a large standard deviation, Bohmian mechanics gives (or seems to give; maybe I'm misunderstanding something) schizophrenic results: the macroscopic object, such as a brick, is well localized, since each of its constituent particles is well localized. On the other hand, the standard deviation, as computed using the wave function, may still be quite large.

Many-worlds attempts (and I'm not sure how successful the attempt is) to say that even though a macroscopic object can have a large standard deviation for its position, that is unobservable. Rather than "seeing" a brick with a large standard deviation, the state of the world splits into different branches, each of which sees the brick as localized.
 
  • #343
stevendaryl said:
I think you misunderstood what I said. There are two states involved here: the state of the measuring device, and the state of the system being measured. The first has a meaning that does not depend on measurements by yet other measuring devices.
Well, your formulation invited the misunderstanding. Anyway, whether the state of a single electron has a meaning at all is one of the controversial points in the foundations. It is generally agreed only that an ensemble of many identically prepared electrons has a state. And this automatically leads to a probabilistic framework.
 
  • #345
A. Neumaier said:
over a vast amount of microscopic observables, not states!
true! I've corrected it.
 
  • #347
stevendaryl said:
That's just incorrect. The law of large numbers is not sufficient to explain this effect. You are mistaken.

I think that this might be an insurmountable obstacle to reaching a conclusion, because to me, your [A. Neumaier's] efforts to prove that macroscopic objects have definite positions (give or take a small standard deviation) assume your conclusion. It's circular reasoning. You want to launch into the use of density matrices of a particular form that makes sense only under the assumption that you're trying to prove.

On the other side, I think I could demonstrate definitively that you are wrong by considering the pure state of an isolated system that includes macroscopic objects. You would refuse to even look at such an argument, because you insist that macroscopic systems can't have pure states.

So that's an impasse. You reject out of hand the reasoning that would prove you wrong, and I find your reasoning to be circular.
 
  • #348
But could we not consider a variant of the cat paradox in which a brick sits on a trap door and falls its full height if a nucleus decays? Then decoherence would ensure that we never observe the brick in a superposition, but the two possibilities do still occur in experiments, so we do get a large standard deviation in the brick's location.
 
  • #349
stevendaryl said:
You want to launch into the use of density matrices of a particular form that makes sense only under the assumption that you're trying to prove.
It is legitimate to start with different basic assumptions on which to erect the edifice of quantum mechanics. The only condition is that the basic assumptions are consistent with experiment. Everything else is a matter of choice, and the quality of the choice is measured by the conclusions one can draw from it and how well they fit the real world.

You start with the traditional textbook assumptions and get into all the trouble with meaningless superpositions of macroscopic objects, for which nobody has been able to give a meaning in reality. Note that the superposition principle is already known to be inconsistent with physics, as it leads to an immediate contradiction with rotational invariance when you superimpose a spin-0 and a spin-1/2 state. (Try to rotate by ##2\pi## and observe what happens to inner products of two arbitrary such superpositions.)
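To spell out the parenthetical hint (a standard sketch, with ##\alpha,\beta## the coefficients of the hypothetical superposition): a rotation by ##2\pi## leaves the spin-0 component unchanged but flips the sign of the spin-1/2 component,
$$|\psi\rangle = \alpha\,|s{=}0\rangle + \beta\,|s{=}\tfrac{1}{2}\rangle \;\longmapsto\; |\psi'\rangle = \alpha\,|s{=}0\rangle - \beta\,|s{=}\tfrac{1}{2}\rangle,$$
so ##\langle\psi|\psi'\rangle = |\alpha|^2 - |\beta|^2 \neq 1## whenever both coefficients are nonzero. A rotation by ##2\pi## should act as the identity, yet it has changed the state into a physically different one; hence the relative phase between the two sectors cannot be observable, which is the superselection rule.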

stevendaryl said:
I find your reasoning to be circular.

I start with the algebraic approach to quantum mechanics, where quantities are functions of elements of a ##C^*##-algebra (e.g. the algebra of linear operators on a Schwartz space, which encodes Dirac's bra-ket setting) and states are positive linear functionals, the natural analogue of what one has in classical stochastic physics. This is a far better starting point than the unrealistic axioms used in introductory textbooks. Nothing is circular in this approach.
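Concretely (a sketch of the standard definitions, not a full axiomatization): a state in this sense is a linear map ##\omega## from the algebra to the complex numbers satisfying
$$\omega(A^*A) \ge 0, \qquad \omega(\mathbf{1}) = 1, \qquad \langle A\rangle := \omega(A),$$
and in the familiar Hilbert-space setting every such state takes the form ##\omega(A)=\mathrm{Tr}\,(\rho A)## for a density operator ##\rho##.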

In the algebraic approach there is no superposition principle, and it naturally accounts for superselection sectors such as the one for integral/half-integral spin. Moreover, it gives a far simpler route to statistical mechanics than the standard approach. Finally, and most importantly, it leads to exactly the same predictions as the shut-up-and-calculate part of quantum mechanics and hence is a fully trustworthy foundation.

So my approach cannot be proved wrong, while the superposition principle is proved wrong by the existence of spin 1/2.
 
  • #350
stevendaryl said:
The law of large numbers is not sufficient to explain this effect.
If ##A_1,\ldots,A_N## are uncorrelated operators with the same standard deviation ##\sigma##, then ##X:=N^{-1}(A_1+\ldots+A_N)## has standard deviation ##N^{-1/2}\sigma##, as a simple calculation reveals. The arguments in statistical mechanics are similar, except that they account (in many important instances) for the typical correlations between the ##A_k##.
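The calculation, spelled out (with ##\langle\cdot\rangle## the expectation in the given state, and "uncorrelated" meaning ##\langle A_jA_k\rangle = \langle A_j\rangle\langle A_k\rangle## for ##j\neq k##):
$$\sigma_X^2 = \langle X^2\rangle - \langle X\rangle^2 = \frac{1}{N^2}\sum_{j,k=1}^{N}\Big(\langle A_jA_k\rangle - \langle A_j\rangle\langle A_k\rangle\Big) = \frac{1}{N^2}\sum_{k=1}^{N}\sigma^2 = \frac{\sigma^2}{N},$$
so ##\sigma_X = N^{-1/2}\sigma##.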

Please point out where the argument is faulty. If successful, all books on statistical mechanics must be rewritten to account for your revolutionary insight.
 