Collapse and Peres' Coarse Graining

  • Thread starter atyy
In summary, the conversation discusses the concept of collapse in quantum mechanics and its potential replacement with coarse-graining, particularly in the context of the Bell tests. Peres suggests that blurring or decoherence could serve as an alternative to collapse, but it is not explicitly stated in his work. The idea of blurring is not well-defined and it is unclear if it can be a local procedure in the Bell tests. Collapse is used as a bookkeeping device and is verified through theoretical practice, but it is not necessarily a physical phenomenon.
  • #36
Of course, you must make the assumption that your preparation procedure is sufficiently accurate to prepare a state and that the state-preparation procedure is reproducible. That's of course the assumption underlying all physics. If this were not the case, then there simply wouldn't be physics as we know it. I've given an example concerning preparation through an ideal filter measurement, using the Stern-Gerlach experiment as the simplest example. There it's just a matter of looking only at particles in a region of space where you can be sure to have a well-determined spin-z component, and you can show with quantum dynamics that it is possible to build a Stern-Gerlach apparatus that does precisely this with as high an accuracy as you like. Nowhere do I have to invoke classical approximations or even a collapse to "filter out" particles with a definite spin-z component.

However, I think our discussion runs in circles right now, and we cannot come to a conclusion, because we are not discussing hard-science issues but philosophical opinions :-(.
 
  • #37
atyy said:
I am trying to understand vanhees71's anti-collapse view.
Well, he is clearly against physical collapse (so am I), and that is logical if you think the wave function is not real. But I think we can agree that if we are not talking about that kind of collapse, it is basically a terminology issue, no?
Or do you mean something else by collapse? If you think it only makes sense after each measurement, you are giving the wave function an ontological interpretation that is at odds with any ensemble view.

I think the minimal interpretation is the most coherent, both with what observations allow and internally, if one is convinced that quantum theory is complete (within the bounds that scientific theories can be complete). I'm afraid the rest reflect a deep dissatisfaction with the theory and/or a lack of boldness to admit to themselves, or to go out and say, that it is not complete and to find a new and better theory, so they take shelter in the many other, more contrived interpretations.
Ironically the minimal interpretation seems also a good starting point if one honestly thinks the theory is incomplete.
 
  • #38
vanhees71 said:
However, I think our discussion runs in circles right now, and we cannot come to a conclusion, because we are not discussing hard-science issues but philosophical opinions :-(.
That is true. But I would also like to add that one of the branches of philosophy is logic, and to me, results obtained by logic are even more certain than those obtained by science. Therefore, when I am philosophical, I usually try to be logical. (Which should not be surprising to those who can recognize the person drawn in my avatar. ;) )
 
  • #39
vanhees71 said:
Of course, you must make the assumption that your preparation procedure is sufficiently accurate to prepare a state and that the state-preparation procedure is reproducible. That's of course the assumption underlying all physics. If this were not the case, then there simply wouldn't be physics as we know it. I've given an example concerning preparation through an ideal filter measurement, using the Stern-Gerlach experiment as the simplest example. There it's just a matter of looking only at particles in a region of space where you can be sure to have a well-determined spin-z component, and you can show with quantum dynamics that it is possible to build a Stern-Gerlach apparatus that does precisely this with as high an accuracy as you like. Nowhere do I have to invoke classical approximations or even a collapse to "filter out" particles with a definite spin-z component.

I don't believe you can do this without a collapse postulate, or something beyond {unitary evolution of the quantum state + Born rule without collapse}.

vanhees71 said:
However, I think our discussion runs in circles right now, and we cannot come to a conclusion, because we are not discussing hard-science issues but philosophical opinions :-(.

We are discussing hard science, because we are discussing whether a mathematical statement is or is not part of the postulates of quantum mechanics, and as far as I know this postulate is essential and found in almost all standard texts. We are certainly not discussing whether collapse is physical or not.
 
  • #40
I've given my argument, and I don't know how to express it differently. I still stand by my argument that for the Stern-Gerlach experiment I don't need any collapse but simply unitary time evolution (of both states and the operators representing observables, depending on the picture of time evolution I choose; the physics is of course independent of this choice) and the standard postulates of quantum theory (including Born's rule).
 
  • #41
Demystifier said:
That is true. But I would also like to add that one of the branches of philosophy is logic, and to me, results obtained by logic are even more certain than those obtained by science. Therefore, when I am philosophical, I usually try to be logical. (Which should not be surprising to those who can recognize the person drawn in my avatar. ;) )
He is BR, of course, but I didn't realize it till the logical clue.
 
  • #42
Demystifier said:
That is true. But I would also like to add that one of the branches of philosophy is logic, and to me, results obtained by logic are even more certain than those obtained by science. Therefore, when I am philosophical, I usually try to be logical. (Which should not be surprising to those who can recognize the person drawn in my avatar. ;) )
Hm, many mathematicians are so embarrassed by mathematics being called part of philosophy that they invented a third category of sciences: natural sciences, humanities (including philosophy), and "structural sciences" (including logic, math, and informatics/computer science).

I guess your avatar is Russell, right?
 
  • #43
TrickyDicky said:
Well, he is clearly against physical collapse (so am I), and that is logical if you think the wave function is not real. But I think we can agree that if we are not talking about that kind of collapse, it is basically a terminology issue, no?
Or do you mean something else by collapse? If you think it only makes sense after each measurement, you are giving the wave function an ontological interpretation that is at odds with any ensemble view.

We are not discussing terminology. vanhees71 and I have agreed that collapse is not necessarily physical. Collapse is found in almost all standard texts. As far as I know, collapse or an equivalent postulate must be added to {unitary evolution + Born rule without collapse}. I think vanhees71 is saying that collapse can be derived from {unitary evolution + Born rule without collapse}, but I don't believe this is true. I am open to the argument that one could do it in a Many-Worlds approach, but as far as I can tell, that is not what vanhees71 is doing. But if vanhees71 means that the minimal interpretation is MWI, then I can agree that what he is saying is reasonable.
 
  • #44
vanhees71 said:
I've given my argument, and I don't know how to express it differently. I still stand by my argument that for the Stern-Gerlach experiment I don't need any collapse but simply unitary time evolution (of both states and the operators representing observables, depending on the picture of time evolution I choose; the physics is of course independent of this choice) and the standard postulates of quantum theory (including Born's rule).
There is maybe a caveat with the SG experiment (and with optical polarization experiments, for that matter), and it has to do with the two-dimensionality of spin. It's much harder to show it for other kinds of observables.
 
  • #45
atyy said:
We are not discussing terminology. vanhees71 and I have agreed that collapse is not necessarily physical. Collapse is found in almost all standard texts. As far as I know, collapse or an equivalent postulate must be added to {unitary evolution + Born rule without collapse}. I think vanhees71 is saying that collapse can be derived from {unitary evolution + Born rule without collapse}, but I don't believe this is true. I am open to the argument that one could do it in a Many-Worlds approach, but as far as I can tell, that is not what vanhees71 is doing. But if vanhees71 means that the minimal interpretation is MWI, then I can agree that what he is saying is reasonable.
I DON'T say that collapse can be derived from anything. I say it doesn't exist and that it's unnecessary to assume it in the first place. It's just shorthand slang to describe that we have gained knowledge from a measurement, without any significance as a physical process.
 
  • #46
TrickyDicky said:
There is maybe a caveat with the SG experiment (and with optical polarization experiments, for that matter), and it has to do with the two-dimensionality of spin. It's much harder to show it for other kinds of observables.
In which sense? I can use an atom with spin higher than 1/2. Then the dimensionality is larger than 2. The only difference is that I then have more than two "partial beams" after the SG apparatus.
 
  • #47
vanhees71 said:
I DON'T say that collapse can be derived from anything. I say it doesn't exist and that it's unnecessary to assume it in the first place. It's just shorthand slang to describe that we have gained knowledge from a measurement, without any significance as a physical process.

Collapse is the statement of the Born rule in the form ##P(\phi) = |\langle \phi | \psi \rangle|^{2}##, which is what happens in a filtering measurement. This is Eq 2.1.7 in volume 1 of Weinberg's QFT text. I think you agreed that measurement can be used as a means of state preparation, so in that sense I thought you said that collapse exists. If collapse does not exist, then are you saying that Weinberg's Eq 2.1.7 does not exist? If collapse does exist, then it seems you are saying that collapse can be derived, ie. Weinberg's Eq 2.1.7 can be derived.
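To make the bookkeeping explicit, here is a minimal numerical sketch (Python/NumPy; the spin-1/2 state and the z-filter are illustrative choices of mine, not anything taken from Weinberg) of the Born rule in the form ##P(\phi) = |\langle \phi | \psi \rangle|^{2}## together with the projective update that the collapse postulate adds for a filtering measurement:

```python
import numpy as np

# Toy spin-1/2 filtering measurement (illustrative choices, not from any textbook).
up = np.array([1.0, 0.0], dtype=complex)      # |z+>
down = np.array([0.0, 1.0], dtype=complex)    # |z->
psi = (up + down) / np.sqrt(2)                # prepared state |x+>

# Born rule: probability that the particle passes a z+ filter.
p_up = abs(np.vdot(up, psi))**2
print("P(z+) =", p_up)                        # 0.5

# Collapse (projective update) for the sub-ensemble that passed the filter:
# project onto |z+><z+| and renormalize.
P_up = np.outer(up, up.conj())
post = P_up @ psi
post = post / np.linalg.norm(post)
print("post-filter state:", post)             # equals |z+>
```

The probability assignment alone is the Born rule; attaching the renormalized projected vector to the sub-ensemble that passed the filter is the extra step being argued about here.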
 
  • #48
atyy said:
Collapse is the statement of the Born rule in the form ##P(\phi) = |\langle \phi | \psi \rangle|^{2}##, which is what happens in a filtering measurement. This is Eq 2.1.7 in volume 1 of Weinberg's QFT text. I think you agreed that measurement can be used as a means of state preparation, so in that sense I thought you said that collapse exists. If collapse does not exist, then are you saying that Weinberg's Eq 2.1.7 does not exist? If collapse does exist, then it seems you are saying that collapse can be derived, ie. Weinberg's Eq 2.1.7 can be derived.

I tend to think you're right about that.

It's possible that an assumption equivalent to collapse might be avoided by slightly altering the minimal interpretation. Rather than saying that QM amplitudes give the probabilities (when squared) for observations given initial states, you generalize it to relative probabilities for entire histories of observations. So you ask questions along the lines of "What is the probability of observing
[itex]\hat{O}_{n+1} = \lambda_{n+1}[/itex] at time [itex]t_{n+1}[/itex], [itex]\hat{O}_{n+2} = \lambda_{n+2}[/itex] at time [itex]t_{n+2}[/itex] ... [itex]\hat{O}_{n+m} = \lambda_{n+m}[/itex] at time [itex]t_{n+m}[/itex]
given that
[itex]\hat{O}_{1} = \lambda_{1}[/itex] at time [itex]t_{1}[/itex], [itex]\hat{O}_{2} = \lambda_{2}[/itex] at time [itex]t_{2}[/itex] ... [itex]\hat{O}_{n} = \lambda_{n}[/itex] at time [itex]t_{n}[/itex]?
 
  • #49
stevendaryl said:
I tend to think you're right about that.

It's possible that an assumption equivalent to collapse might be avoided by slightly altering the minimal interpretation. Rather than saying that QM amplitudes give the probabilities (when squared) for observations given initial states, you generalize it to relative probabilities for entire histories of observations. So you ask questions along the lines of "What is the probability of observing
[itex]\hat{O}_{n+1} = \lambda_{n+1}[/itex] at time [itex]t_{n+1}[/itex], [itex]\hat{O}_{n+2} = \lambda_{n+2}[/itex] at time [itex]t_{n+2}[/itex] ... [itex]\hat{O}_{n+m} = \lambda_{n+m}[/itex] at time [itex]t_{n+m}[/itex]
given that
[itex]\hat{O}_{1} = \lambda_{1}[/itex] at time [itex]t_{1}[/itex], [itex]\hat{O}_{2} = \lambda_{2}[/itex] at time [itex]t_{2}[/itex] ... [itex]\hat{O}_{n} = \lambda_{n}[/itex] at time [itex]t_{n}[/itex]?

Yes, collapse can be avoided in a "minimal interpretation", but it still needs a postulate beyond {unitary evolution + Born rule without collapse}. For example, a simple form of the postulate that is needed, corresponding to what you are saying, is given by Laloe in Eq 37 of http://arxiv.org/abs/quant-ph/0209123. However, Laloe makes it clear that this has to be postulated, and is clearly a generalized form of the Born rule without collapse.

If one rejects collapse, then one must also reject the Schroedinger picture and the idea that measurement can be used as state preparation. (That's fine. However, both are part of standard quantum mechanics, and I just want to see this rejection stated explicitly. For example, Ballentine rejects collapse, yet he does explicitly postulate collapse in his textbook in Eq 9.28, uses the Schroedinger picture, and states that filtering measurements are a form of state preparation. Ballentine's Eq 9.30 is essentially the same as Laloe's Eq 37, but Ballentine's treatment is wrong because of his inconsistent assumptions, whereas Laloe does get the statement of the assumptions correct.)

Also, Laloe points out that rejecting collapse does not save this form of "minimal interpretation", which Laloe calls the "correlation interpretation", from nonlocality.
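For concreteness, here is a sketch of the kind of generalized rule being discussed: the chained-projector expression for the joint probability of a whole history of outcomes, ##P(a_1, t_1; \ldots ; a_n, t_n) = \mathrm{Tr}[P_n U \cdots P_1 U \rho U^\dagger P_1 \cdots U^\dagger P_n]##, which is the type of formula Laloe writes down (the two-level Hamiltonian, times and projectors below are illustrative choices of mine):

```python
import numpy as np
from scipy.linalg import expm

# Two-level toy model; H, the times and the projectors are my own choices.
sx = np.array([[0, 1], [1, 0]], dtype=complex)
H = 0.5 * sx                                          # precession about x

rho0 = np.array([[1, 0], [0, 0]], dtype=complex)      # initial preparation |z+><z+|

def U(t):
    return expm(-1j * H * t)                          # unitary evolution operator

# Projectors for the recorded history: z+ at t1, then x+ at t2.
P1 = np.array([[1, 0], [0, 0]], dtype=complex)        # |z+><z+|
P2 = 0.5 * np.array([[1, 1], [1, 1]], dtype=complex)  # |x+><x+|
t1, t2 = 1.0, 2.5

# Chained-projector formula for the joint probability of the history,
# written without ever reassigning a collapsed state by hand.
K = P2 @ U(t2 - t1) @ P1 @ U(t1)
p_history = np.trace(K @ rho0 @ K.conj().T).real
print("P(z+ at t1, x+ at t2) =", p_history)
```

Whether one reads the sandwiched projectors as collapse or as a postulated history rule is exactly the point at issue: the single-time Born rule plus unitary evolution alone does not hand you this expression.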
 
  • #50
atyy said:
Yes, collapse can be avoided in a "minimal interpretation", but it still needs a postulate beyond {unitary evolution + Born rule without collapse}. For example, a simple form of the postulate that is needed, corresponding to what you are saying, is given by Laloe in Eq 37 of http://arxiv.org/abs/quant-ph/0209123. However, Laloe makes it clear that this has to be postulated, and is clearly a generalized form of the Born rule without collapse.

Actually, in that equation, there is still a [itex]\rho(t_0)[/itex] reflecting the initial preparation, but maybe that can be replaced by an initial observation?
 
  • #51
Isn't the collapse issue just another name for the state preparation vs. observable measurement difference? I mean, states should be simple bookkeeping devices which have a probabilistic interpretation, once one goes from the abstract (rigged) Hilbert space to lab experiments.
How would one go about explaining to an experimentalist that quantum mechanics is essentially a statistical theory? I've learned QM from a mixture of (the so-called) orthodox Copenhagen formulation (Born rule separated from SEq separated from von Neumann's state reduction after measurement) and the virtual statistical ensemble approach, which used numerical probabilities for results of experiments done on an infinite number of (tricky issue coming!) "identically prepared real quantum systems". How would you reconcile the von Neumann state reduction with the virtual statistical ensemble?
 
  • #52
dextercioby said:
Isn't the collapse issue just another name for the state preparation vs. observable measurement difference? I mean, states should be simple bookkeeping devices which have a probabilistic interpretation, once one goes from the abstract (rigged) Hilbert space to lab experiments.
How would one go about explaining to an experimentalist that quantum mechanics is essentially a statistical theory? I've learned QM from a mixture of (the so-called) orthodox Copenhagen formulation (Born rule separated from SEq separated from von Neumann's state reduction after measurement) and the virtual statistical ensemble approach, which used numerical probabilities for results of experiments done on an infinite number of (tricky issue coming!) "identically prepared real quantum systems". How would you reconcile the von Neumann state reduction with the virtual statistical ensemble?

I don't understand why there is any need to reconcile state reduction with the virtual statistical ensemble.

atyy's point, which seems valid to me, is that the whole idea that a preparation procedure produces a system in a known state implies collapse of a sort. The alternative view (which I guess is equivalent) is that, instead of the orthodox picture of preparing the system in an initial state and then later measuring some observable, you could just talk about relative probabilities of histories of observations, which doesn't explicitly involve preparation or collapse.
 
  • #53
OK, then how about the conflict between the Schrödinger equation for the time evolution of states and the reduction postulate, which necessarily involves a time evolution of the state, too?
 
  • #54
stevendaryl said:
Actually, in that equation, there is still a [itex]\rho(t_0)[/itex] reflecting the initial preparation, but maybe that can be replaced by an initial observation?

Yes, there is still the initial state, which doesn't evolve in this form of the Heisenberg picture. It could be linked to the initial observation, but it would be cumbersome, since the state represents an equivalence class of different classical operations which we call preparations. I can accept that the presence of the initial quantum state is not a form of collapse, in the sense that collapse links preparations and measurements, and says that a measurement can result in two outcomes: a classical outcome and a quantum state, and that the classical outcome indicates the quantum state, so both are given by the same Born rule. For consistency, rejecting collapse means that the Schroedinger picture is invalid and that measurement cannot be used as state preparation, both of which are contrary to standard quantum mechanics, but I am willing to accept that the view is at least consistent (and thus a plausible interpretation).
 
  • #55
dextercioby said:
Isn't the collapse issue just another name for the state preparation vs. observable measurement difference?

In the orthodox Copenhagen interpretation, measurement can be used as a means of state preparation. A measurement can potentially have two outcomes: a classical outcome which is the reading of the apparatus, and a quantum state. The collapse postulate says that the quantum outcome and the classical reading are linked, and both are given by the Born rule.

So not all preparations result from measurement, but some preparations can result from measurement.

dextercioby said:
I mean, states should be simple bookkeeping devices which have a probabilistic interpretation, once one goes from the abstract (rigged) Hilbert space to lab experiments.

Yes, in the orthodox Copenhagen interpretation quantum states are just bookkeeping devices, as are unitary evolution and collapse.

dextercioby said:
How would one go about explaining to an experimentalist that quantum mechanics is essentially a statistical theory?

In the orthodox Copenhagen interpretation, quantum mechanics only makes statistical predictions by postulation. Thus for a given initial quantum state, the classical and quantum outcomes of a measurement are probabilistic and given by the Born rule.

dextercioby said:
I've learned QM from a mixture of (the so-called) orthodox Copenhagen formulation (Born rule separated from SEq separated from von Neumann's state reduction after measurement) and the virtual statistical ensemble approach, which used numerical probabilities for results of experiments done on an infinite number of (tricky issue coming!) "identically prepared real quantum systems". How would you reconcile the von Neumann state reduction with the virtual statistical ensemble?

In the orthodox Copenhagen interpretation, we can label each member of the virtual ensemble by a pure quantum state: the virtual ensemble and the quantum state are both bookkeeping devices.

dextercioby said:
OK, then how about the conflict between the Schrödinger equation for the time evolution of states and the reduction postulate, which necessarily involves a time evolution of the state, too?

In the orthodox Copenhagen interpretation, we have to divide the world into a classical portion and a quantum portion. This is subjective, but for all practical purposes, we do know what a classical measurement apparatus is, and we can time stamp our observations (as is done in experimental Bell tests). Since we know when measurements occur for all practical purposes, we can also deal with the unitary evolution between measurements, and the non-unitary evolution that occurs when a measurement is made. This division is not absolute, but each user of quantum theory must make this division. In the orthodox Copenhagen interpretation, we do not know whether there is any meaning to the "wave function of the universe".
 
  • #56
atyy said:
We are not discussing terminology. vanhees71 and I have agreed that collapse is not necessarily physical. Collapse is found in almost all standard texts. As far as I know, collapse or an equivalent postulate must be added to {unitary evolution + Born rule without collapse}. I think vanhees71 is saying that collapse can be derived from {unitary evolution + Born rule without collapse}, but I don't believe this is true.
But what would be your objection to having epistemic collapse implicit in the ensemble concept plus the matter of fact of performing and using measurements, instead of explicitly in the form of a postulate? Your initial question seemed to admit the possibility that explicit collapse could be replaced with coarse-graining (implicit collapse), making the explicit postulate unnecessary.
To me, even if such a view explicitly expressed only unitary evolution + Born rule, it would be including the non-unitary phase of evolution as well. I might very well be missing some subtlety, and that's why I ask.
 
  • #57
TrickyDicky said:
But what would be your objection to having epistemic collapse implicit in the ensemble concept plus the matter of fact of performing and using measurements, instead of explicitly in the form of a postulate? Your initial question seemed to admit the possibility that explicit collapse could be replaced with coarse-graining (implicit collapse), making the explicit postulate unnecessary.
To me, even if such a view explicitly expressed only unitary evolution + Born rule, it would be including the non-unitary phase of evolution as well. I might very well be missing some subtlety, and that's why I ask.

Demystifier pointed out that the coarse-graining in Peres's view only acts on the apparatus, not on the quantum system. Hence for the quantum system, the coarse-graining (as far as we can understand Peres) is only equivalent to decoherence. Decoherence does not do away with the need for collapse, because the {system+apparatus+environment} is still in a pure state, which presumably corresponds to an "ensemble". In order to have sub-ensembles, the individual members of the ensemble must be labelled with different labels (eg. an ensemble of identical balls has no natural sub-ensembles, but a mixture of red and green balls is an ensemble with natural sub-ensembles). If the only label that the individual members of the ensemble have is the pure state, then there are no natural sub-ensembles. If there is a label that is not the pure state, that label is a hidden variable, which is an additional postulate.

Another problem is that decoherence is not perfect. But let's suppose decoherence is perfect, in which case it can be argued that decoherence does pick a preferred basis, and thus picks natural sub-ensembles. But this would mean that an ensemble with no sub-ensembles is suddenly divisible into sub-ensembles at the moment of perfect decoherence. This sudden appearance of sub-ensembles is equivalent to collapse. It is clear that this sudden appearance of sub-ensembles needs an additional postulate.
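A toy numerical illustration of this (my own minimal model, nothing from Peres): couple one qubit to a few "environment" qubits so that the reduced density matrix of the system becomes nearly diagonal in the pointer basis, while the global state remains exactly pure, i.e. a single ensemble label with no built-in sub-ensembles:

```python
import numpy as np
from functools import reduce

# Toy decoherence model (my own construction): one system qubit plus n_env
# environment qubits.  If the system is |1>, each environment qubit is rotated
# by angle theta; if the system is |0>, the environment is left alone.
n_env = 8
theta = 2.5

env0 = np.array([1.0, 0.0])
env_rot = np.array([np.cos(theta / 2), np.sin(theta / 2)])   # Ry(theta)|0>

def kron_all(vecs):
    return reduce(np.kron, vecs)

branch0 = kron_all([np.array([1.0, 0.0])] + [env0] * n_env)     # system |0>, env untouched
branch1 = kron_all([np.array([0.0, 1.0])] + [env_rot] * n_env)  # system |1>, env rotated

psi = (branch0 + branch1) / np.sqrt(2)          # global state: still a single pure vector

# Reduced density matrix of the system: trace out the environment.
M = psi.reshape(2, 2 ** n_env)
rho_sys = M @ M.conj().T

print("norm of global pure state  =", np.linalg.norm(psi))    # 1.0 (exactly pure)
print("system coherence |rho_01|  =", abs(rho_sys[0, 1]))     # ~ 0.5*cos(theta/2)**n_env, tiny
```

The off-diagonal element is driven towards zero (decoherence), but the whole thing is still one pure vector; nothing in the dynamics has attached different labels to members of the ensemble, which is the gap an additional postulate has to fill.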

The basic way to see that an additional postulate is needed is that a measurement potentially has two outcomes - a classical reading of the apparatus and a quantum state. If the Born rule has to be specified as a postulate for the classical reading, then it also has to be specified as a postulate for the quantum state. There are of course other postulates that are equivalent, such as noncontextuality for the measurement outcomes, from which the Born rule can be derived via Gleason's theorem. Still the noncontextuality has to be specified as an additional postulate for the classical and quantum outcomes of the measurement.

In any case, I am willing to consider coarse-graining as an additional postulate. However, it is only discussed vaguely in one book, and as far as I can tell, Peres does not specify well enough how the coarse-graining is done for it to replace collapse. So it should be considered speculative research, not mainstream quantum mechanics.
 
  • #58
vanhees71 said:
Ok, I'll see that I get this done over the weekend, but I'll not use the Schroedinger picture, because that's very inconvenient in relativistic QFT, but of course, there are only free fields as usual in quantum optics. Then you only need a "wave-packet description" for the photons. The polarizer is described as ideal in terms of a projection operator located at Alice's and Bob's place. Everything works of course in the two-photon Fock space.

If the polarizer is modeled as a projection operator, one already has non-unitary time evolution.
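A two-line check of that statement (the particular projector is just an illustrative choice): a projector is idempotent but not norm-preserving, so inserting it into the time evolution already takes you outside unitary dynamics.

```python
import numpy as np

# Polarizer modeled as a projector onto, say, horizontal polarization (illustrative choice).
P = np.array([[1, 0], [0, 0]], dtype=complex)   # |H><H|
print(np.allclose(P.conj().T @ P, np.eye(2)))   # False: not unitary (does not preserve the norm)
print(np.allclose(P @ P, P))                    # True: idempotent, i.e. a projector
```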
 
  • #59
atyy said:
Demystifier pointed out that the coarse-graining in Peres's view only acts on the apparatus, not on the quantum system. Hence for the quantum system, the coarse-graining (as far as we can understand Peres) is only equivalent to decoherence.

Decoherence does not do away with the need for collapse, because the {system+apparatus+environment} is still in a pure state, which presumably corresponds to an "ensemble". In order to have sub-ensembles, the individual members of the ensemble must be labelled with different labels (eg. an ensemble of identical balls has no natural sub-ensembles, but a mixture of red and green balls is an ensemble with natural sub-ensembles). If the only label that the individual members of the ensemble have is the pure state, then there are no natural sub-ensembles. If there is a label that is not the pure state, that label is a hidden variable, which is an additional postulate.

Another problem is that decoherence is not perfect. But let's suppose decoherence is perfect, in which case it can be argued that decoherence does pick a preferred basis, and thus picks natural sub-ensembles. But this would mean that an ensemble with no sub-ensembles is suddenly divisible into sub-ensembles at the moment of perfect decoherence. This sudden appearance of sub-ensembles is equivalent to collapse. It is clear that this sudden appearance of sub-ensembles needs an additional postulate.
I don't have access to Peres' book, so I'm not really commenting on his view. But saying that quantum coarse-graining only acts on the apparatus introduces the quantum-classical distinction as something real when it is just epistemic. See below.
The basic way to see that an additional postulate is needed is that a measurement potentially has two outcomes - a classical reading of the apparatus and a quantum state. If the Born rule has to be specified as a postulate for the classical reading, then it also has to be specified as a postulate for the quantum state. There are of course other postulates that are equivalent, such as noncontextuality for the measurement outcomes, from which the Born rule can be derived via Gleason's theorem. Still the noncontextuality has to be specified as an additional postulate for the classical and quantum outcomes of the measurement.
In any case, I am willing to consider coarse-graining as an additional postulate. However, it is only discussed vaguely in one book, and as far as I can tell, Peres does not specify well enough how the coarse-graining is done for it to replace collapse. So it should be considered speculative research, not mainstream quantum mechanics.
Bohr said that there is no quantum world, just an abstract quantum description. Even though there is confusion about what he meant by that, I take it to mean that there is no real quantum-classical cut; it is just a graphic way to talk about the measurement problem or the existence of hidden variables, but if one insists on taking it literally it can be misleading. Now, the minimal interpretation is agnostic about hidden variables (I think it is the only interpretation that is) and allows one to ignore the quantum-classical cut: the wave function is not real in it, the coarse-graining acts on the quantum system as a whole since there is no quantum-classical distinction, and it gives freedom to consider the probabilities as subjective (irreversibility or non-unitary evolution). I'd say it is in this sense that the collapse postulate is redundant in this interpretation. But as you say, this might not be a strictly mainstream QM take on it, since it seems to assume hidden variables rather than being agnostic.
 
  • #60
TrickyDicky said:
Bohr said that there is no quantum world, just an abstract quantum description. Even though there is confusion about what he meant by that, I take it to mean that there is no real quantum-classical cut; it is just a graphic way to talk about the measurement problem or the existence of hidden variables, but if one insists on taking it literally it can be misleading. Now, the minimal interpretation is agnostic about hidden variables (I think it is the only interpretation that is) and allows one to ignore the quantum-classical cut: the wave function is not real in it, the coarse-graining acts on the quantum system as a whole since there is no quantum-classical distinction, and it gives freedom to consider the probabilities as subjective (irreversibility or non-unitary evolution). I'd say it is in this sense that the collapse postulate is redundant in this interpretation. But as you say, this might not be a strictly mainstream QM take on it, since it seems to assume hidden variables rather than being agnostic.

My view is that vanhees71 and Ballentine are just wrong (and that the standard textbooks like Landau and Lifshitz and Weinberg are right). (Maybe Peres is also wrong, but it is a bit vague whether he rejects collapse or not, since he seems to use it extensively in http://arxiv.org/abs/quant-ph/9906034. If you don't have his book, you can see http://arxiv.org/abs/quant-ph/9712044 and http://arxiv.org/abs/quant-ph/9906023 for his remarks on coarse graining. Peres seems to accept the classical/quantum cut and the notion that a measurement outcome must be irreversible.) My view is that the minimal interpretation is the "orthodox" Copenhagen interpretation with a classical/quantum cut and collapse (unless one means that MWI is minimal). Some Ensemble Interpretations such as bhobba's are (AFAICT) correct and equivalent to the "orthodox" Copenhagen interpretation.

In the "orthodox" flavour of Copenhagen, the enigmatic "there is no quantum world" is taken to mean that we are agnostic about whether the wave function is ontic or epistemic. In fact, the traditional wording of the "orthodox" flavour uses both conceptions of the wave functions as conceptual tools. On the one hand, by taking a classical/quantum cut, where the classical world is taken to be absolute reality, while the wave function does not have this status and is taken to be an FAPP tool, the wave function is already "epistemic" or at least "non-ontic" in some sense. The "epistemic" nature of the wave function is especially clear when one considers that in this interpretation, the classical/quantum cut is not absolute and can be shifted. On the other hand, it is acknowledged that we make no mistake in predictions if, having taken the cut, the wave function is taken to be FAPP the complete physical state of an individual system, so the wave function is also taken to be "ontic" in some sense. I like the terms "absolute reality" and "relative reality" that Tsirelson uses in his discussion of the measurement problem http://www.tau.ac.il/~tsirel/download/nonaxio.ps.
 
  • #61
atyy said:
But without collapse, how can measurement be used as a means of quantum state preparation, where we use the classical result obtained to figure out the quantum state of the selected sub-ensemble? (I do understand there is a more general collapse rule than projective measurements, but let's keep things simple here, since there is still collapse in the more general rule.) Does this mean that measurement cannot be used as a form of state preparation in the minimal interpretation?
In classical probability, a probability distribution represents alternate "possibilities". A measurement "actualizes" a sub-ensemble. Would you say there is collapse involved?
The sub-ensemble could still be described with a different probability distribution in which case you could say the measurement "prepared" the new state by selecting a subset of the possibilities. However, if probability distributions are understood as information and not physical, there is absolutely no need to invent a concept of "collapse".
 
  • #62
It seems the collapse and quantum-classical cut you refer to are just the non-commutativity of observables, the quantum hypothesis itself, so they should be present in any interpretation of QM, as they all must obey the HUP. A minimum volume is given to the classical phase space that assures the irreversibility of measured observables; whether one postulates it formally or not, the math is there.
 
  • #63
billschnieder said:
In classical probability, a probability distribution represents alternate "possibilities". A measurement "actualizes" a sub-ensemble. Would you say there is collapse involved?
The sub-ensemble could still be described with a different probability distribution in which case you could say the measurement "prepared" the new state by selecting a subset of the possibilities. However, if probability distributions are understood as information and not physical, there is absolutely no need to invent a concept of "collapse".

Yes, if it's just a subjective way to describe our information about the system, then whether you call it a collapse or not, there is nothing very weird about it. But consider an example such as EPR with anti-correlated twin pairs (electron/positron). If Alice measures spin-up for the electron, then she knows that Bob will measure spin-down for the corresponding positron. So she just acquired information, and there's nothing weird about that. But if it's only a matter of acquiring information, then one would think that Bob's particle was spin-down in the z-direction before Alice's measurement. So the view of "collapse" as being purely information would (it seems to me) imply pre-existing values for such things as spin, which is basically local hidden variables, which are ruled out by Bell's theorem.

The corresponding "collapse" in the classical case really does mean hidden variables. You have two pieces of paper, one white and one black. You put each into an envelope and mix them up, and give one to Alice to open and another to Bob to open. The second that Alice opens her envelope and finds a white piece of paper, she knows that Bob's envelope contains a black piece of paper. In that case, it is completely consistent (and perfectly natural) for Alice to assume that Bob's envelope contained a black piece of paper even before either of them opened their envelopes.
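A quick numerical contrast of the two situations (a toy sketch with my own choice of angles, not a derivation): the envelope case is reproduced exactly by pre-existing values, while the singlet correlations at the standard CHSH angles exceed what any local assignment of pre-existing values allows.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 200_000

# Classical envelopes: a pre-existing value ("hidden variable") is fixed at the
# source, and opening an envelope merely reveals it.
alice_env = rng.integers(0, 2, N)        # 0 = white, 1 = black
bob_env = 1 - alice_env                  # perfectly anti-correlated by construction
print("envelope anti-correlation:", np.mean(alice_env != bob_env))   # 1.0

# Quantum singlet: E(a, b) = -cos(a - b).  CHSH combination at the usual angles.
def E(a, b):
    return -np.cos(a - b)

a, a2 = 0.0, np.pi / 2
b, b2 = np.pi / 4, 3 * np.pi / 4
S = E(a, b) - E(a, b2) + E(a2, b) + E(a2, b2)
print("CHSH |S| for the singlet:", abs(S))   # 2*sqrt(2) ~ 2.83, above the local bound of 2
```

The first half is the envelope story: anti-correlation from pre-existing values, nothing mysterious. The second half is what the "collapse is only an information update" reading has to confront, since no local assignment of pre-existing values gives |S| > 2.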
 
  • #64
atyy said:
I like the terms "absolute reality" and "relative reality" that Tsirelson uses in his discussion of the measurement problem
I think the wave function is "possible reality" while experimental results are "actual reality". What selects one of the "possibilities" to be actualized is then what some people call "collapse", but this is obviously not restricted to quantum mechanics.
 
  • #65
TrickyDicky said:
It seems the collapse and quantum-classical cut you refer to are just the non-commutativity of observables, the quantum hypothesis itself, so they should be present in any interpretation of QM, as they all must obey the HUP.

In the minimal interpretation, the notion of measurement is fundamental. A measurement is the interaction between a classical measurement apparatus and a quantum system, resulting in a definite classical outcome. So the classical/quantum cut is fundamental to the minimal interpretation, unless the notion of measurement can be removed as fundamental in quantum mechanics.

TrickyDicky said:
A minimum volume is given to the classical phase space that assures the irreversibility of measured observables; whether one postulates it formally or not, the math is there.

There is no minimum volume or even a classical probability distribution over the classical phase space in quantum mechanics. It is true that some books such as Reif use this concept, but although it is useful, I don't think (maybe I'm wrong) it is fundamental, more a very good heuristic like wave-particle duality.
 
  • #66
stevendaryl said:
The corresponding "collapse" in the classical case really does mean hidden variables. You have two pieces of paper, one white and one black. You put each into an envelope and mix them up, and give one to Alice to open and another to Bob to open. The second that Alice opens her envelope and finds a white piece of paper, she knows that Bob's envelope contains a black piece of paper. In that case, it is completely consistent (and perfectly natural) for Alice to assume that Bob's envelope contained a black piece of paper even before either of them opened their envelopes.
It is not as easy as you make it sound. If we move away from Bertlmann's-socks-type variables like "paper color" to the more relevant space-type, dynamically changing variables, then it is impossible for Alice to say, by opening her envelope, what Bob will see. For example, let the papers' colors change dynamically between black and white (in opposite sequence between the two) at a given hidden frequency, such that once you open the envelope, the exposure to light stops the dynamics instantly and locks it to one color. Your claim that Alice will know the result of Bob's paper by opening her envelope is then false. However, we know for a fact that the two pieces of paper have perfectly anti-correlated colors. Often when we discuss these things, we easily gloss over the fact that the discussion requires Alice and Bob to open their envelopes at the exact same time. But it is impossible to test this experimentally without post-selection. You have to take time-tags on both sides and use the time at one end to filter the results at the other end. Only then will the experiment match what is actually happening. But post-selection invalidates the derivation of the Bell inequalities, since the joint probability distribution for post-selected experiments is non-factorable. It is not surprising that all experiments to date claiming violation of the inequalities employ one form of post-processing or another.
 
  • #67
@billschnieder, please discuss your postselection issue by starting another thread, not here.
 
  • #68
billschnieder said:
However, if probability distributions are understood as information and not physical, there is absolutely no need to invent a concept of "collapse".
And what if it is understood as a formal mathematical measure (http://en.wikipedia.org/wiki/Measure_(mathematics)), based on an axiomatics independent of any application, leaving aside any semantic notion (just a formal writing game)?

Patrick
 
  • #69
atyy said:
In the minimal interpretation, the notion of measurement is fundamental. A measurement is the interaction between a classical measurement apparatus and a quantum system, resulting in a definite classical outcome. So the classical/quantum cut is fundamental to the minimal interpretation, unless the notion of measurement can be removed as fundamental in quantum mechanics.
There is no minimum volume or even a classical probability distribution over the classical phase space in quantum mechanics. It is true that some books such as Reif use this concept, but although it is useful, I don't think (maybe I'm wrong) it is fundamental, more a very good heuristic like wave-particle duality.
Measurement is fundamental to any empirical science, not specifically to the minimal interpretation of QM. If you define it as an interaction between a classical apparatus and a quantum system, you are already introducing a specific heuristic or interpretation as fundamental, when the measurement problem is basically the lack of consensus about what measurement in QM entails.
Sometimes I think it would be more productive to turn to the Schrodinger for puzzlement instead of being surprised by measuring classical observables and getting classical outcomes
 
  • #70
TrickyDicky said:
Measurement is fundamental to any empirical science, not specifically to the minimal interpretation of QM. If you define it as an interaction between a classical apparatus and a quantum system, you are already introducing a specific heuristic or interpretation as fundamental, when the measurement problem is basically the lack of consensus about what measurement in QM entails.

In classical physics (Newtonian physics, special and general relativity), measurement is not a fundamental concept. Historically, Einstein did postulate measurement as fundamental in special relativity: the speed of light measured by any inertial observer is the same. However, we have removed that, and nowadays we say that special relativity means the laws have Poincare symmetry. Historically, measurement was also important in the genesis of general relativity: test particles follow geodesics. The test particle is a sort of measurement apparatus that is apart from the laws of physics because it does not cause spacetime curvature, in contrast to all other forms of matter. However, in the full formulation of general relativity, test particles are not fundamental. So quantum mechanics is different from classical physics in needing to specify measurement as a fundamental concept.

I am using a particular interpretation to define QM, but it is the minimal interpretation. The measurement problem is that we have to put this classical/quantum cut to define the minimal interpretation. The other interpretations are then approaches to solving the measurement problem by removing the need for measurement to be a fundamental concept in the mathematical specification of a theory. Examples of such interpretations are consistent histories (flavour of Copenhagen), hidden variables (generally predicting deviations from QM), or Many-Worlds.

TrickyDicky said:
Sometimes I think it would be more productive to turn to the Schrodinger for puzzlement instead of being surprised by measuring classical observables and getting classical outcomes

Another way of stating the measurement problem, is that if everything is quantum and we have a wave function of the universe, how can we make sense of such an idea? The minimal interpretation cannot make sense of such an idea, and always needs a classical/quantum cut. Bohmian Mechanics and Many-Worlds are two approaches to solving the measurement problem, in which the wave function of the universe is proposed to make sense.
 
