Assumptions of the Bell theorem

In summary: The aim of this thread is to make a list of all the additional assumptions that are necessary to prove the Bell theorem. A further aim is to list the assumptions that are used in some but not all versions of the theorem, and so are not really necessary. The list of necessary and unnecessary assumptions is preliminary, so I invite others to supplement and correct it.
  • #491
Demystifier said:
Coarse graining indeed explains how the interference terms disappear and how the evolution becomes irreversible. However, and this is the crucial point, it does not explain how the measurement results appear.
I wasn't really discussing decoherence. Coarse-grained macroscopic DOFs also lack interference due to effects which dominate over decoherence.

However regardless, I would say that of course quantum theory doesn't say which specific result arises because it is a probabilistic theory. This isn't really an inconsistency though.
 
  • Like
Likes vanhees71
  • #492
Kolmo said:
I wasn't really discussing decoherence. Coarse-grained macroscopic DOFs also lack interference due to effects which dominate over decoherence.
In this case I have no idea how that is supposed to explain measurement results, i.e. definite measurement outcomes.

Kolmo said:
However regardless, I would say that of course quantum theory doesn't say which specific result arises because it is a probabilistic theory. This isn't really an inconsistency though.
I don't have a problem with that. There are, for instance, stochastic versions of the "Bohmian" interpretation, which are fundamentally probabilistic too, and I'm fine with that. The question is not which specific result arises. The question is why only one specific result arises. If you don't adopt some specific interpretation (Bohm, many worlds, fundamental collapse, ...), then the answer is not clear. If you just postulate it, without an attempt to explain it from something deeper, then there are problems which I can only discuss if you specify some details of this postulate.
 
  • Like
Likes physika
  • #493
Demystifier said:
In this case I have no idea how that is supposed to explain measurement results, i.e. definite measurement outcomes.
It's just that there are effects other than decoherence that suppress interference. Decoherence wasn't even the first of these to be discovered.

Demystifier said:
The question is not which specific result arises. The question is why only one specific result arises
I find this question hard to understand. If you end up with a classical probability distribution over the outcomes, then you know an outcome occurred but not which one. QM will only tell you each outcome's probability.
 
  • Like
Likes vanhees71
  • #494
Demystifier said:
The question is not which specific result arises. The question is why only one specific result arises.
What is an example where more than one result arises?
 
  • Like
Likes vanhees71
  • #495
Kolmo said:
I find this question hard to understand. If you end up with a classical probability distribution over the outcomes, then you know an outcome occurred but not which one.
It's a bit subtle to explain where the problem is. To get this classical probability, you first need to do the coarse graining. And to do the coarse graining, you first need to decide which degrees of freedom are irrelevant. The problem is that this decision is arbitrary, subjective and anthropomorphic. For instance, who or what makes such a decision in the absence of conscious beings? How can nature itself, which by definition is the whole of nature (not an open part of it), do the coarse graining? If two agents define the coarse graining differently, does it imply that for one agent the outcome occurs and for the other the same outcome does not occur? This leads to various Wigner's-friend-type paradoxes.
 
  • #496
martinbn said:
What is an example where more than one result arises?
An example is the many-worlds interpretation. (But that's probably not what you meant by "example".)
 
  • #497
Demystifier said:
It's a bit subtle to explain where the problem is. To get this classical probability, you first need to do the coarse graining. And to do the coarse graining, you first need to decide which degrees of freedom are irrelevant. The problem is that this decision is arbitrary, subjective and anthropomorphic. For instance, who or what makes such a decision in the absence of conscious beings? How can nature itself, which by definition is the whole of nature (not an open part of it), do the coarse graining? If two agents define the coarse graining differently, does it imply that for one agent the outcome occurs and for the other the same outcome does not occur? This leads to various Wigner's-friend-type paradoxes.
The decision as to which macroscopic observables are relevant is not arbitrary but is simply specified by the system under consideration, and it is of course subject to experimental test. If you choose the wrong observables in your theoretical description, you won't be successful in describing the observed phenomenon.

Of course it can also be that you have a different resolution. Compare the CMBR measurement by the COBE satellite with those by WMAP and PLANCK, each with higher resolution ("less coarse graining") than the previous one. The resolution, "coarseness of the coarse graining", is not arbitrary or subjective but given by the macroscopic system (in this case a measurement device).
 
  • Like
Likes Kolmo
  • #498
Demystifier said:
To get this classical probability, you first need to do the coarse graining
If two agents define the coarse graining differently, does it imply that for one agent the outcome occurs and for the other the same outcome does not occur?
There's not really any ambiguity here. Macroscopic observables are represented by well defined sums of individual atomic observables (in general sums of products) resulting in an aggregate operator. In measurement models it's a lot of work, but you can show that the DOFs associated with such observables don't show interference and irreversibly store a result. Older papers show that experiments designed to demonstrate interference for such functional sum observables necessarily "melt" the device back into an atomic soup.

The functional sum of atomic observables giving the macroscopic pointer DOF is well defined given the particular measurement setup. It's not some random free-for-all with every theoretician permitted to do whatever they want.
 
  • Like
Likes vanhees71
  • #499
vanhees71 said:
The resolution, "coarseness of the coarse graining", is not arbitrary or subjective but given by the macroscopic system (in this case a measurement device).
Let us try to understand it in an example. In a system of ##10^{23}## atoms, which of them are "small open system" and which are "macroscopic environment"?
 
  • #500
Kolmo said:
There's not really any ambiguity here. Macroscopic observables are represented by well defined sums of individual atomic observables (in general sums of products) resulting in an aggregate operator.
There is an ambiguity. Show me a concrete example of such a "well defined" sum and I will explain why it is ambiguous.
 
  • #501
Demystifier said:
Let us try to understand it in an example. In a system of ##10^{23}## atoms, which of them are "small open system" and which are "macroscopic environment"?
I think you have the picture wrong.

Say we have an ##n##-site operator ##a(q_{1}, \ldots , q_{n})##, which is some operator on the Hilbert space ##\mathcal{H}_{1\ldots n}## of these ##n## particles. We can then form macroscopic operators via sums like:
$$A = \frac{1}{C}\sum_{f} a(f)$$
where ##f## is one of the collections of ##n## particles defining ##a## and the sum runs over all such partitions into groups of ##n## particles. Certain checks with relativity, materials physics, etc. can in addition show that a perfectly fine-grained measurement of ##A## is not possible, i.e. it is not physically possible to distinguish all of its eigenvalues, so for a realistic model you replace ##A## with a coarse-grained ##\bar{A}##.

One can then show that ##\bar{A}## doesn't display interference terms, since it commutes with all other such macro-observables ##\bar{B}## and with microscopic observables. A very early, simple proof is given in the first edition of Gottfried's text.

These macro-observables are then your pointer variables. Each one is given by a particular well-defined sum. Seems clear to me.

It's also just one particular method for showing this.
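As a toy illustration of the scaling behind this (my own sketch, not from the thread: ##n## spin-1/2 sites, with the macro-observables taken as normalized collective sums), the commutator of two such aggregate operators shrinks as ##1/n##:

```python
import numpy as np
from functools import reduce

# Single-site Pauli operators
sx = np.array([[0, 1], [1, 0]], dtype=complex)
sz = np.array([[1, 0], [0, -1]], dtype=complex)
I2 = np.eye(2, dtype=complex)

def site_op(op, i, n):
    """Embed a single-site operator at site i of an n-site system."""
    factors = [I2] * n
    factors[i] = op
    return reduce(np.kron, factors)

def collective(op, n):
    """Normalized collective sum (1/n) * sum_i op_i: a toy 'aggregate operator'."""
    return sum(site_op(op, i, n) for i in range(n)) / n

for n in (2, 4, 6, 8):
    A = collective(sz, n)  # e.g. collective magnetization along z
    B = collective(sx, n)  # a second macro-observable, along x
    comm_norm = np.linalg.norm(A @ B - B @ A, ord=2)  # spectral norm of [A, B]
    print(f"n = {n}: ||[A, B]|| = {comm_norm:.4f}  (exactly 2/n = {2/n:.4f})")
```

In this toy model ##[A,B] = (2i/n^2)\sum_i \sigma^{(i)}_y##, whose norm is exactly ##2/n##, so the aggregate observables effectively commute in the macroscopic limit.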
 
  • Like
Likes vanhees71
  • #502
Kolmo said:
These macro-observables are then your pointer variables. Each one is given by a particular well-defined sum. Seems clear to me.
With the theory you presented, can you explain why typical macro pointers don't distinguish a cat in the state ##|dead\rangle+|alive\rangle## from the cat in the state ##|dead\rangle-|alive\rangle##?
 
  • #503
Demystifier said:
With the theory you presented, can you explain why typical macro pointers don't distinguish a cat in the state ##|dead\rangle+|alive\rangle## from the cat in the state ##|dead\rangle-|alive\rangle##?
Yes, although it's of course just one of a few methods. Essentially there's no physical observable which fails to commute with aggregate observables like "alive" and "dead", taking them here as shorthand for more well-defined macro-quantities.

A coupling which would attempt to measure such an observable would "melt" the cat into soup via the couplings that enact it. So long as the cat remains a macroscopic body, it has pointer variables.
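A minimal numerical version of the commutation argument (my own sketch, with a toy four-level "cat" whose Hilbert space splits into a dead sector and an alive sector): any observable ##Q## that commutes with the dead/alive projectors is block-diagonal, so the cross term ##\langle dead|Q|alive\rangle## vanishes and ##Q## cannot distinguish the ##+## from the ##-## superposition.

```python
import numpy as np

rng = np.random.default_rng(0)

def random_hermitian(n):
    m = rng.normal(size=(n, n)) + 1j * rng.normal(size=(n, n))
    return (m + m.conj().T) / 2

d = 4  # toy Hilbert space: levels 0-1 form the "dead" sector, 2-3 the "alive" sector

# Q commutes with the projectors onto the two macroscopic sectors,
# i.e. it is block-diagonal in the dead/alive decomposition.
Q = np.zeros((d, d), dtype=complex)
Q[:2, :2] = random_hermitian(2)  # action within the "dead" sector
Q[2:, 2:] = random_hermitian(2)  # action within the "alive" sector

dead = np.array([1, 0, 0, 0], dtype=complex)   # representative "dead" state
alive = np.array([0, 0, 1, 0], dtype=complex)  # representative "alive" state
plus = (dead + alive) / np.sqrt(2)
minus = (dead - alive) / np.sqrt(2)

exp_plus = np.real(plus.conj() @ Q @ plus)
exp_minus = np.real(minus.conj() @ Q @ minus)
print(f"<+|Q|+> = {exp_plus:.6f}, <-|Q|-> = {exp_minus:.6f}")  # identical
```

Since every power ##Q^k## is also block-diagonal, all moments agree, not just the mean.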
 
  • Like
Likes vanhees71
  • #504
Kolmo said:
Essentially there's no physical observable which fails to commute with aggregate observables like "alive" and "dead",
Why? How do you know that there is no such observable?

Kolmo said:
A coupling which would attempt to measure such an observable would "melt" the cat into soup via the couplings that enact it. So long as the cat remains a macroscopic body, it has pointer variables.
I don't understand the second sentence. A macroscopic body melted into a soup is still a macroscopic body. A body does not need to be solid; a liquid is a body too. I know you were using a metaphor, but I don't understand the metaphor.
 
  • #505
Demystifier said:
Because the open system is a subsystem of the full closed system. Hence the properties of the open system can be derived from the properties of the closed system, and not the other way around.

Demystifier said:
Great, we finally agree that a closed system cannot resolve the measurement problem. What we disagree is that you think that an open system (which, in my understanding, is a subsystem of the full closed system) can resolve it.

Demystifier said:
I agree that there is no empirical evidence, but I think there is a logical evidence. It's the logic that if something cannot be explained by considering the full closed system (on which we agree), then it also cannot be explained by considering its open subsystem.
But an (open) subsystem does provide additional structure not present in the full closed system alone. Even that additional structure is still not enough to resolve the measurement problem. But that additional structure is implicitly present in many arguments, so highlighting the importance of open systems for the measurement problem seems reasonable. (And it makes sense to me, because Heisenberg and other founders also stressed its importance.)
 
  • Like
Likes physika and Demystifier
  • #506
Demystifier said:
Why? How do you know that there is no such observable?
It can be proven, but it's a long argument. There are shorter arguments if one uses the abstract C*-formalism, but they sacrifice ease for brevity.

Demystifier said:
I don't understand the second sentence. A macroscopic body melted into a soup is still a macroscopic body. I know you were using a metaphor, but I don't understand the metaphor.
If you try to measure observables that don't commute with ##\bar{A}##, then you necessarily reduce the device to a dispersed plasma of individual atoms and subatomic particles, which is not normally referred to as a macroscopic body or a device.

Any observable ##Q## obeying ##[Q,\bar{A}] \neq 0## is not compatible with the device remaining as a solid stable composite body.

Now there are old arguments from Ludwig in:
G. Ludwig: “Die Grundlagen der Quantenmechanik”, Springer, Berlin 1954¹
that such ##Q## in most cases probably can't be performed at all, as the coupling Hamiltonians needed to enact them aren't physical. Gottfried says something similar in Sections 18-20 of his old text.
There are parallel arguments if one follows other approaches to measurement, such as decoherence or the more abstract treatments with C*-algebras.

Either way, it's clear that a measurement gives rise to irreversible storage of an outcome, with the pointer variables being well-defined expressions, and no real breakdown of quantum theory. The only novelty compared with the classical case is the fundamental probabilism of the outcomes and the fact that we cannot ascribe a well-defined value to quantities left unmeasured (which is the non-Kolmogorov nature of QM mentioned earlier).

¹ I learned to read German just to read this, so I have fond memories of it!
 
  • Like
Likes vanhees71 and dextercioby
  • #507
Demystifier said:
Of course, I meant closed system including the measuring apparatus and the environment.
Which means that, while in principle yes, all the information is in that system, in practice most of that information is inaccessible to us. We certainly can't do quantum state tomography "from the outside" on a whole ensemble of identically prepared system + measuring apparatus + environment in order to find out exactly which pure state is being prepared.

Also, considering this closed system doesn't solve the measurement problem either: this closed system should just undergo unitary evolution all the time, since it is not interacting with anything, and therefore we end up at something like the MWI. I see that this is more or less where you ended up in your exchange with @vanhees71.
 
  • Like
Likes vanhees71
  • #508
Trying to think of a way to phrase this, but if that unitary evolution, given the observable algebra of the device, results in a classical probability distribution over the observed outcomes, isn't that all you need? Each term can just be read off as a probabilistic weighting of given values of the pointer variable(s). It's not deterministic, but I don't see the issue.
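A toy version of "reading off the weights" (my own sketch, assuming a von Neumann pre-measurement has already correlated a qubit system, a qubit pointer, and a random many-qubit environment): tracing out the environment leaves a system-pointer state that is diagonal in the pointer basis up to small terms, with the Born probabilities on the diagonal.

```python
import numpy as np

rng = np.random.default_rng(1)

def random_state(dim):
    """Random pure state for an environment branch."""
    v = rng.normal(size=dim) + 1j * rng.normal(size=dim)
    return v / np.linalg.norm(v)

alpha, beta = 0.6, 0.8   # system amplitudes, |alpha|^2 + |beta|^2 = 1
dim_env = 2 ** 10        # 10 environment qubits

# Post-coupling state: |psi> = alpha |0>_S |0>_P |E0> + beta |1>_S |1>_P |E1>
E0, E1 = random_state(dim_env), random_state(dim_env)
psi = np.zeros((4, dim_env), dtype=complex)  # rows index the system+pointer basis
psi[0] = alpha * E0  # |00> branch
psi[3] = beta * E1   # |11> branch

# Reduced system+pointer state: rho = Tr_env |psi><psi|
rho = psi @ psi.conj().T
print(np.round(np.abs(rho), 4))
# Diagonal: 0.36 and 0.64, the Born weights of the two pointer readings.
# Off-diagonal (interference) terms are ~ alpha*beta*<E1|E0>,
# of order 2**-5 for random E0, E1 in this dimension.
```

The remaining step, which outcome actually occurs, is exactly the point under dispute in this thread; the sketch only shows where the classical distribution comes from.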
 
  • Like
Likes vanhees71
  • #509
Kolmo said:
Any observable ##Q## obeying ##[Q,\bar{A}] \neq 0## is not compatible with the device remaining as a solid stable composite body.
Without going into detailed mathematical proofs, which is not my main point, would you agree that the reason for this is that your axiomatic framework considers PERFECTLY optimal inferences only?

I.e., let's say the information-theoretic summary is: "We show that QM is the optimal inference theory of a 'classical agent' (given certain conditions)." Then the above conclusion does not allow agents to have some variation AROUND the optimal value.

This seems unnatural to me, and too strong an assumption from a perspective where you consider agents to evolve and require the existence of variation. If you relax this a bit, then the observables are not impossible, just corresponding to "unstable agents", or equivalently implying that if such a crazy agent were backed up, it would act destructively towards its own environment.

Does this informal summary make sense to you, or would you disagree with it?

I certainly haven't read that book, but "probably can't be performed at all, as the coupling Hamiltonians needed to enact them aren't physical" sounds like it means just the above?

/Fredrik
 
  • #510
Fra said:
Without going into detailed mathematical proofs which is not my main point, would you agree that the reason for this, is that your axiomatic framework considers PERFECTLY optimal inferences only?
No. Models of measurement equally cover POVMs, weak measurements and so on. The Curie-Weiss model of measurement, which is the default "very detailed" measurement model, naturally produces POVMs. So it's not restricted to optimal measurements.
 
  • #511
PeterDonis said:
Which means that, while in principle yes, all the information is in that system, in practice most of that information is inaccessible to us. We certainly can't do quantum state tomography "from the outside" on a whole ensemble of identically prepared system + measuring apparatus + environment in order to find out exactly which pure state is being prepared.

Also, considering this closed system doesn't solve the measurement problem either: this closed system should just undergo unitary evolution all the time, since it is not interacting with anything, and therefore we end up at something like the MWI. I see that this is more or less where you ended up in your exchange with @vanhees71.
In addition, we cannot even prepare a macroscopic system in exactly the same microscopic (pure) state to begin with. All we can do, and all we need to do, is prepare the macroscopic system with sufficient accuracy by determining its macroscopic (relevant) observables with sufficient accuracy.
 
  • Like
Likes WernerQH and Kolmo
  • #512
Demystifier said:
With the theory you presented, can you explain why typical macro pointers don't distinguish a cat in the state ##|dead\rangle+|alive\rangle## from the cat in the state ##|dead\rangle-|alive\rangle##?
I think this is a blatant misuse of notation. Had Schrödinger replaced the poor cat by a calorimeter, could you even conceive of a coherent superposition of two states |14°C> and |15°C>? For a tiny drop of ##1\,{\rm mm}^3## of water one finds an entropy increase ##S/k = \ln W## of about ##10^{18}##, and that's just the logarithm of the tremendous factor by which the 15°C states outnumber the 14°C states. One frequently sees these fictitious kets "correctly" normalized with a factor ##1/\sqrt{2}##, but this normalization looks strikingly suspicious. You have wisely refrained from adding this factor, but I still find it hard to see how the two states could have equal weight in a superposition. I think it's meaningless to represent such states by any kind of wave function.
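For reference, the quoted ##10^{18}## can be checked with a one-line estimate (my own arithmetic, assuming water's specific heat ##c \approx 4.2\,\mathrm{J\,g^{-1}\,K^{-1}}## and ##T \approx 288\,\mathrm{K}##):
$$\frac{\Delta S}{k} = \frac{m c\, \Delta T}{k T} \approx \frac{(10^{-3}\,\mathrm{g})(4.2\,\mathrm{J\,g^{-1}\,K^{-1}})(1\,\mathrm{K})}{(1.38\times 10^{-23}\,\mathrm{J\,K^{-1}})(288\,\mathrm{K})} \approx 1\times 10^{18}.$$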
 
  • Skeptical
Likes Demystifier
  • #513
WernerQH said:
I think it's meaningless to represent such states by any kind of wave function
As mentioned above by vanhees71, and as seen in many models of measurement, the states of the device are in fact high-entropy mixed states, usually constructed by maxent methods or similar. The temperature example you give is a clear case, as a state at some temperature ##T## is a Gibbs state and not a pure state. To say nothing of other macroscopic quantities entering into the definition of the macrostate.
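A quick check that a thermal state really is mixed (my own sketch, with an arbitrary toy Hamiltonian standing in for the device):

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy Hamiltonian: a random Hermitian matrix on a small Hilbert space
d = 8
m = rng.normal(size=(d, d)) + 1j * rng.normal(size=(d, d))
H = (m + m.conj().T) / 2

def gibbs_state(H, beta):
    """Thermal state rho = exp(-beta H) / Z, built by eigendecomposition."""
    evals, evecs = np.linalg.eigh(H)
    w = np.exp(-beta * (evals - evals.min()))  # shift avoids overflow
    w /= w.sum()
    return (evecs * w) @ evecs.conj().T  # sum_i w_i |v_i><v_i|

rho = gibbs_state(H, beta=1.0)
purity = np.real(np.trace(rho @ rho))
print(f"Tr(rho^2) = {purity:.3f}")  # strictly < 1 at finite temperature: a mixed state
```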
 
  • Like
Likes vanhees71
  • #514
Kolmo said:
No. Models of measurement equally cover POVMs, weak measurements and so on. The Curie-Weiss model of measurement, which is the default "very detailed" measurement model, naturally produces POVMs. So it's not restricted to optimal measurements.
Mmmm... thanks.

Does this argument also include the definition of "weak measurements" that can give large disturbances to the observed system, but at low probability (so that the measurement is still weak "on average")? Instead of what I think is more common, that ALL outcomes give a small disturbance and extreme variability is truncated. I suspect this may influence what is meant by compatibility, i.e. are incompatibilities at low probabilities allowed?

/Fredrik
 
  • #515
WernerQH said:
I think this is a blatant misuse of notation. Had Schrödinger replaced the poor cat by a calorimeter, could you even conceive of a coherent superposition of two states |14°C> and |15°C>? For a tiny drop of ##1\,{\rm mm}^3## of water one finds an entropy increase ##S/k = \ln W## of about ##10^{18}##, and that's just the logarithm of the tremendous factor by which the 15°C states outnumber the 14°C states. One frequently sees these fictitious kets "correctly" normalized with a factor ##1/\sqrt{2}##, but this normalization looks strikingly suspicious. You have wisely refrained from adding this factor, but I still find it hard to see how the two states could have equal weight in a superposition. I think it's meaningless to represent such states by any kind of wave function.
Almost every sentence here contains a conceptual error, but let me concentrate on (what seems to me) the essence of your argument. You argue that if one state has a much larger entropy than the other, then the probabilities of those two states cannot be the same. But that's wrong. It would be right in statistical equilibrium (which maximizes entropy under given constraints), but in general we don't need to have statistical equilibrium.

For example, I can prepare a spin-1/2 particle in an eigenstate of spin in the x-direction and then measure its spin in the z-direction. If the z-spin is up, I prepare the drop in the state |14°C>. If the z-spin is down, I prepare the drop in the state |15°C>. In this way, the states |14°C> and |15°C> have the same statistical weights.
 
  • #516
Fra said:
Does this argument also include the definition of "weak measurements"
Weak measurements are just POVMs whose Kraus operators are close to the identity, possibly followed by post-selection. So yes, essentially.
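A minimal concrete instance (my own sketch, assuming a qubit and a two-outcome weak measurement of ##\sigma_z## with strength ##\epsilon##): the Kraus operators ##K_\pm = \sqrt{(I \pm \epsilon\sigma_z)/2}## are close to (a multiple of) the identity, and the post-measurement disturbance is only ##O(\epsilon^2)##.

```python
import numpy as np

eps = 0.1  # measurement strength; the measurement is weak for small eps
sz = np.array([[1, 0], [0, -1]], dtype=complex)
I2 = np.eye(2, dtype=complex)

# Kraus operators; these matrices are diagonal, so the elementwise
# square root equals the matrix square root.
K = {+1: np.sqrt((I2 + eps * sz) / 2),
     -1: np.sqrt((I2 - eps * sz) / 2)}

# Completeness: sum_r K_r^dag K_r = I  (outcome probabilities sum to 1)
assert np.allclose(sum(k.conj().T @ k for k in K.values()), I2)

psi = np.array([1, 1], dtype=complex) / np.sqrt(2)  # |+>, most sensitive to sz info

for r, k in K.items():
    prob = np.real(psi.conj() @ (k.conj().T @ k) @ psi)  # Born rule for this outcome
    post = k @ psi / np.sqrt(prob)                       # updated post-measurement state
    fidelity = np.abs(post.conj() @ psi) ** 2            # 1 - O(eps^2): barely disturbed
    print(f"outcome {r:+d}: p = {prob:.3f}, fidelity with |+> = {fidelity:.5f}")
```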
 
  • Like
Likes vanhees71 and Demystifier
  • #517
Demystifier said:
I prepare the drop in the state |14°C>
It's important, though, that there is no such pure state; it is necessarily mixed. This is an important point in detailed models of measurement.
 
  • Like
Likes vanhees71
  • #518
PeterDonis said:
closed system ... is not interacting with anything,
It is interacting with itself.
 
  • #519
Demystifier said:
For example, I can prepare a spin-1/2 particle in an eigenstate of spin in the x-direction and then measure its spin in the z-direction. If the z-spin is up, I prepare the drop in the state |14°C>. If the z-spin is down, I prepare the drop in the state |15°C>. In this way, the states |14°C> and |15°C> have the same statistical weights.
But then you are no longer talking about unitary evolution of kets, but ordinary classical statistical physics.
 
  • #520
Kolmo said:
It's important, though, that there is no such pure state; it is necessarily mixed.
That's one of the reasons why I said that almost every sentence in his post contains a conceptual error. But there is a way to associate something like a temperature with a pure state, e.g. by studying how energy is distributed in that state.
 
  • #521
WernerQH said:
But then you are no longer talking about unitary evolution of kets, but ordinary classical statistical physics.
But if the spin-1/2 particle and the drop it is entangled with are isolated from the rest of the environment, then we have a coherent superposition.
 
Last edited:
  • #522
WernerQH said:
But physicists can't agree on what constitutes a "measurement".
Correct!

Maximilian Schlosshauer/1/ clearly identifies the measurement problem in the following way:

“But what exactly is the measurement problem? I have found that everyone seems to have a somewhat different conception of the affair. One way of identifying the root of the problem is to point to the apparent dual nature and description of measurement in quantum mechanics. On the one hand, measurement and its effect enter as a fundamental notion through one of the axioms of the theory. On the other hand, there’s nothing explicitly written into these axioms that would prevent us from setting aside the axiomatic notion of measurement and instead proceeding conceptually as we would do in classical physics. That is, we may model measurement as a physical interaction between two systems called “object” and “apparatus” — only that now, in lieu of particles and Newtonian trajectories, we’d be using quantum states and unitary evolution and entanglement-inducing Hamiltonians.

What we would then intuitively expect — and perhaps even demand — is that when it’s all said and done, measurement-as-axiom and measurement-as-interaction should turn out to be equivalent, mutually compatible ways of getting to the same final result. But quantum mechanics does not seem to grant us such simple pleasures. Measurement-as-axiom tells us that the post-measurement quantum state of the system will be an eigenstate of the operator corresponding to the measured observable, and that the corresponding eigenvalue represents the outcome of the measurement. Measurement-as-interaction, by contrast, leads to an entangled quantum state for the composite system-plus-apparatus. The system has been sucked into a vortex of entanglement and no longer has its own quantum state. On top of that, the entangled state fails to indicate any particular measurement outcome.

So we’re not only presented with two apparently mutually inconsistent ways of describing measurement in quantum mechanics, but each species leaves its own bad taste in our mouth. When confronted with measurement-as-axiom, many people tend to wince and ask: “But ... what counts as a measurement? Why introduce a physical process axiomatically? What makes the quantum state collapse?” And so on. But measurement-as-interaction delivers no ready-made remedy either. As we have seen, the interaction leads to nothing that would resemble the outcome of a measurement in any conventional sense of the word.”
[bold by LJ]

/1/ M. Schlosshauer (ed.), Elegance and Enigma, The Quantum Interviews, Springer-Verlag Berlin Heidelberg 2011, pp. 141-142
 
  • Like
Likes physika, Fra and Demystifier
  • #523
Demystifier said:
That's one of the reasons why I said that almost every sentence in his post contains a conceptual error. But there is a way to associate something like a temperature with a pure state, e.g. by studying how energy is distributed in that state.
In general, for a selection of properties one can construct pure states that give results similar to Gibbs states, but many statistical-mechanical properties won't give the same results, and in general the entanglement measures are not correct.
If one is only concentrating on a small selection of properties and is not too interested in dynamics, then indeed there are a few calculational methods using pure states.

This isn't to disagree with anything you said, just more an "out of interest" thing.
 
  • #524
Lord Jestocost said:
Measurement-as-axiom tells us that the post-measurement quantum state of the system will be an eigenstate of the operator corresponding to the measured observable, and that the corresponding eigenvalue represents the outcome of the measurement. Measurement-as-interaction, by contrast, leads to an entangled quantum state for the composite system-plus-apparatus.
I kind of understand what is being said here but two points.

(a) In general the state afterward is not an eigenstate of the observable, but rather is updated via a Kraus operator. This isn't too important, since the text might be focusing purely on von Neumann-style measurements.

(b) There's nothing exceptional here that isn't in any statistical theory. I can apply the exact same arguments to the evolution of an option price under Black-Scholes. There as well, "observation" of the option price does not produce the exact same answer as evolution under the Black-Scholes equations when one includes the "price evaluator" in the model.
In fact, you get the exact same "issues" in a statistical model of classical particle interactions, or even of a dice roll if one "included the measuring device".
 
  • Like
Likes dextercioby
  • #525
Demystifier said:
But if the spin-1/2 particle and the drop it is entangled with are isolated from the rest of the environment, then we have a coherent superposition.
The spin state of an electron and the thermal state of a drop are very different things. It is beyond me how you can give meaning to empty symbolism such as |dead> + |alive>.
 
  • Skeptical
Likes Motore
