Can the Born Rule Be Derived in the Many Worlds Interpretation?

In summary, Sean Carroll has written a paper explaining how it is possible to derive the Born Rule in the Many Worlds Interpretation of Quantum Mechanics. I'm not sure that it's the final word on the subject, but it does grapple with the big question of how it makes sense to use probabilities to describe a universe that evolves deterministically.
  • #106
@kith and @Fredrik, a couple more thoughts on collapse in the Ensemble Interpretation. Above I outlined two ways that collapse may be avoided.

1) Couple to ancillae, and push all measurements to the end of time. This means that successive measurements are not allowed, which in itself is fine. However, Ballentine does describe successive measurements. He even says that a filtering measurement is a method of state preparation, which implies successive measurements, since one would presumably make a measurement on the state prepared by the previous measurement.

2) Use the Stinespring theorem, which shows that every completely positive map, including collapse, can be described as unitary evolution on a larger Hilbert space. I don't think this eliminates the need for a postulate equivalent to collapse, because we know that this map occurs only with some probability, given by the Born rule. Although the map can be described as unitary evolution, I don't see how its probability is obtained without the form of the Born rule that includes collapse (unless Many-Worlds works, but that's not Ballentine's Ensemble Interpretation).
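Just to make point 2 concrete, here is a minimal numerical sketch (my own toy illustration in numpy, with an arbitrary qubit state, not anything from Ballentine): the projective measurement channel is dilated to an isometry on system + ancilla, and the ancilla's populations come out equal to the Born probabilities. The dilation reproduces the statistics, but it does not by itself tell you that one definite outcome occurs with the Born probability - that is the extra input.

[code]
import numpy as np

# System state |psi> = 0.6|0> + 0.8|1> (arbitrary choice for illustration)
psi = np.array([0.6, 0.8], dtype=complex)

# Projectors of a von Neumann measurement in the computational basis
P = [np.diag([1.0, 0.0]).astype(complex), np.diag([0.0, 1.0]).astype(complex)]

# Born probabilities, computed directly
born = [np.vdot(psi, Pi @ psi).real for Pi in P]

# Stinespring-type dilation: V|psi> = sum_i (P_i |psi>) tensor |i>_ancilla
ancilla = [np.array([1.0, 0.0], dtype=complex), np.array([0.0, 1.0], dtype=complex)]
dilated = sum(np.kron(Pi @ psi, ai) for Pi, ai in zip(P, ancilla))

# Reduced state of the ancilla after this unitary/isometric interaction
rho = np.outer(dilated, dilated.conj())
rho_ancilla = np.trace(rho.reshape(2, 2, 2, 2), axis1=0, axis2=2)

print(born)                       # approximately [0.36, 0.64]
print(np.diag(rho_ancilla).real)  # the same numbers, but only as a density matrix,
                                  # not as a statement that one definite outcome has occurred
[/code]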
 
  • #107
atyy said:
@kith and @Fredrik, a couple more thoughts on collapse in the Ensemble interpretation. Above I outlined two ways that collapse may be avoided.

1) Couple to ancillae, and push all measurements to the end of time. This means that successive measurements are not allowed, which in itself is fine. However, Ballentine does describe successive measurements. He even says that a filtering measurement is a method of state preparation, which implies successive measurements, since one would presumably make a measurement on the state prepared by the previous measurement.

2) Use the Stinespring theorem, which shows that every completely positive map, including collapse, can be described as unitary evolution on a larger Hilbert space. I don't think this eliminates the need for collapse, because we know that this map occurs only with some probability, given by the Born rule. Although the map can be described as unitary evolution, I don't see how its probability is obtained without the form of the Born rule that includes collapse (unless many-worlds works, but that's not Ballentine's ensemble interpretation).
Exactly. This is exactly Belavkin's point: these mathematical tricks don't solve the logical conundrum of Schrödinger's cat. Taking Born's law and the Heisenberg cut seriously does solve it. But so far, for most practically minded physicists, this has not been an issue, since they know how to get the right answer and don't mind if there are some logical problems with the framework they are using: "it's not their problem". Bell called this the FAPP trap (FAPP = for all practical purposes). Many other practically minded physicists think that everything is solved by the MWI (many-worlds interpretation). Personally I see that as more of a smokescreen than a solution, but that is just my hang-up, being a mathematician and not a practically minded physicist.
 
  • #108
Again, I'm too naive to understand the necessity of a collapse at all! Take the "classical" situation of throwing a die. Without further knowledge about its properties, I use the maximum-entropy method to assign a probability distribution. Of course, the least-prejudiced distribution in the Shannon-Jaynes information-theoretical sense is that the probability of any particular value is [itex]p_k=1/6[/itex], [itex]k \in \{1,2,3,4,5,6 \}[/itex].

Now I throw the die once and get "3". Has my probability distribution now somehow collapsed to [itex]p_3=1[/itex] with all other [itex]p_k=0[/itex]? I don't think that anybody would argue that way.

Of course, my probability distribution was based on incomplete knowledge about the die. In this case I assumed I knew nothing about it and used the maximum-entropy principle to predict a probability distribution based on the objective principle of "least prejudice" (which of course has a well-defined meaning). Now, how do I check this prediction? Of course, I have to throw the die many times, independently and always under the same conditions, and compile statistics of how often each outcome occurs. I may then come to the conclusion that the die is somehow biased towards a different distribution, and henceforth I use a new probability distribution based on the gained (statistical) knowledge about the die. Would you now say that the probability distribution has somehow collapsed to the new one, and that this is a physical process involving the die? I don't think anybody would argue that way.

This kind of argument is made only in quantum theory. There you also have a very precise definition of how to associate probability distributions with the outcomes of measurements, given a certain (equivalence class of) preparation procedures for the (quantum) system under investigation. E.g., if you prepare a complete set of observables to have determined values, you have prepared the system in a unique pure state (represented by a ray in Hilbert space). Now I perform a measurement of some other observable. In the following, let's assume that we perform an ideal von Neumann filter measurement.

Is some collapse now occurring, in the sense of some flavors of the Copenhagen interpretation? If so, when does it occur?

Note that there are two possible scenarios: (a) I measure the quantity and filter out all systems having a certain given value of this observable. According to standard quantum theory, I have to describe such a system by a pure state, corresponding to the eigenvector given by the projection of the original state onto the eigenspace of the measured eigenvalue. Has the state now collapsed to the new pure state, and if so, when and how does this collapse occur?

(b) I measure the quantity in the sense of an ideal filter measurement but do not take notice of the measured outcome. Then, again according to standard quantum theory in the minimal interpretation, I associate a mixed state with the newly prepared system. Has the state now collapsed to the new mixed state?

I'd answer "no" to all these collapse questions. For me, the association of states is an objective association of mathematical objects, like state kets or statistical operators, with a known preparation procedure for the quantum system. This gives me certain probabilities for finding a certain value of any observable of the system, implying that the value of this observable is indeterminate if the system is not in an eigenstate of the self-adjoint operator associated with this observable. The measurement is simply an interaction with a measurement apparatus which somehow stores the value of the measured observable so that it can be read off. Nothing happens other than these physical interactions, and nothing collapses, in the Copenhagen sense of a physical process in nature.

It may be that the system can be followed further after the measurement (e.g., when an ideal filter measurement has been performed), and the interaction with the measurement apparatus (together with any associated filtering process) can be seen as a new preparation of the system, which I can continue to describe quantum mechanically in terms of a (pure or mixed) state. In this latter case there is no collapse either, just the application of well-defined rules for describing our knowledge about the system, which can be checked experimentally. That's all physics is about: a description of our possible objective knowledge about physical systems. No more and no less. Any further questions some philosophers pursue, such as an "ontology of quantum systems", are metaphysics and not my business as a physicist. This I leave to the philosophers!

More often than not you even destroy the system you measure; e.g., when photons are used as quantum systems, they usually end up absorbed by the detector. Are you then saying that a collapse must occur, even though there is no photon left to be described in any way? I don't think so. I also leave it to the philosophers to determine what the "ontology of photons" is beyond the description we physicists use in terms of QED.
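For concreteness, the two state assignments in scenarios (a) and (b) can be written down in a few lines. A minimal numpy sketch of my own (a single qubit with an arbitrary initial state, standing in for a real filter setup):

[code]
import numpy as np

# Initial pure state and the projectors of an ideal von Neumann filter measurement
psi = np.array([0.6, 0.8], dtype=complex)
rho = np.outer(psi, psi.conj())
P = [np.diag([1.0, 0.0]).astype(complex), np.diag([0.0, 1.0]).astype(complex)]

# Scenario (a): selective filter measurement -- keep only the systems with outcome "0".
# The sub-ensemble is assigned the projected, renormalized state.
p0 = np.trace(P[0] @ rho).real
rho_a = P[0] @ rho @ P[0] / p0
print(p0, np.trace(rho_a @ rho_a).real)   # probability 0.36; purity 1.0, i.e. a pure state

# Scenario (b): the same measurement, but the outcome is not looked at.
# The whole ensemble is assigned the mixed state sum_i P_i rho P_i.
rho_b = sum(Pi @ rho @ Pi for Pi in P)
print(np.trace(rho_b @ rho_b).real)       # purity about 0.54, i.e. a mixed state
print(np.round(rho_b, 3))                 # the off-diagonal terms are gone
[/code]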
 
  • #109
vanhees71 said:
Again, I'm too naive to understand the necessity of a collapse at all!

Same here.

There is no collapse axiom in QM.

The 2 axioms in Ballentine say nothing about it at all.

It's an invention of some interpretations.

In the ignorance ensemble interpretation one simply takes an improper mixture to be a proper one. No collapse involved. It has problems, as all interpretations do, but collapse isn't one of them.

I think using Gleason's Theorem clarifies this quite a bit:
https://www.physicsforums.com/showthread.php?t=758125

A state is simply a logical consequence of mapping outcomes to POVMs - it's simply an aid in calculating an outcome's probabilities.

Thanks
Bill
 
  • #110
When you didn't destroy the system you were measuring, and you measured it the von Neumann way, people usually understand that the particles associated with measurement outcome "a" are now in the pure state whose state vector is the corresponding eigenvector. That is what people call "collapse". What's in a name? The Born law tells us which proportion of particles corresponds to which particular eigenvalue. The collapse postulate says that this is in effect a way to "prepare" particles in particular pure states.

Sure, you can leave it to the philosophers to puzzle about what might actually be going on. The working physicist just needs a way to translate back and forth between things he or she can do in the lab and pieces of mathematical machinery. One of those pieces has nowadays got the name "collapse". "A rose by any other name would smell as sweet". You can call it "preparation by measurement" if you prefer.

Does Ballentine tell you how to prepare a quantum system in the pure state corresponding to the eigenstate of some observable? If so, he has a collapse postulate but he gives it another name, so as not to upset people's feelings.

The modern student of quantum information theory certainly does need to know this notion. Read Nielsen and Chuang. And most everything in there has been implemented in the lab, too. It's not just theory.

Send polarized photons through a piece of polarizing glass so that all the ones which get through are polarized in the same direction. What do you call that? You can say: the polarization of each photon was measured in a particular direction; those for which the outcome was "I'm polarized that way" were allowed through, and those for which the outcome was "I have the perpendicular polarization" were absorbed.
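For the record, the numbers behind that polarizer picture are easy to write down. A minimal Python sketch of my own (just Malus' law plus the projection rule, with an arbitrary angle): photons prepared at angle θ to the polarizer axis get through with probability cos²θ, and the ones that do get through are, by this "preparation by measurement" rule, in the polarizer's own polarization state.

[code]
import numpy as np

theta = np.pi / 6   # angle between the photon polarization and the polarizer axis (arbitrary)

# Incoming photon polarization state and the polarizer's "pass" state
photon = np.array([np.cos(theta), np.sin(theta)])
pass_state = np.array([1.0, 0.0])

# Probability of passing: Born rule / Malus' law, cos^2(theta)
p_pass = np.vdot(pass_state, photon) ** 2
print(p_pass, np.cos(theta) ** 2)   # both 0.75

# State of a photon that did pass: project and renormalize
# ("collapse", or "preparation by measurement" if you prefer that name)
projector = np.outer(pass_state, pass_state)
out = projector @ photon
out = out / np.linalg.norm(out)
print(out)   # [1. 0.] -- every transmitted photon is now polarized along the polarizer axis
[/code]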
 
  • #111
bhobba said:
In the ignorance ensemble interpretation one simply takes an improper mixture to be a proper one. No collapse involved. It has problems, as all interpretations do, but collapse isn't one of them.

That's fine. What I am saying is that if collapse is removed as a postulate, then the presentation of quantum mechanics is incomplete unless it is replaced by another postulate.

Ballentine fails to state that an improper mixture can be treated as a proper one, so he is missing a postulate.

So here you are agreeing with me, not vanhees71.
 
  • #112
gill1109 said:
The collapse postulate says that this is in effect a way to "prepare" particles in particular pure states.

The issue is it's not a postulate.

The state is simply an aid in calculating the probabilities of the outcomes of an observation, and these days it is associated with a preparation procedure. Different state - different preparation. After an observation that didn't destroy the object, it is now prepared differently - obviously it will have a different state.

Thanks
Bill
 
  • #113
bhobba said:
The issue is it's not a postulate.

The state is simply an aid in calculating the probabilities of the outcomes of an observation, and these days it is associated with a preparation procedure. Different state - different preparation. After an observation that didn't destroy the object, it is now prepared differently - obviously it will have a different state.

Thanks
Bill

The question is: can you derive that an improper mixture is equivalent to a proper mixture?

In a selective measurement, the state |ψ> collapses to some state |j>. This is a completely positive map, so by Stinespring's theorem you can think of it as unitary evolution on a larger Hilbert space. However, Stinespring's theorem does not tell you that this collapse or unitary evolution is probabilistic, yet in fact it is: it occurs with a probability given by the Born rule with collapse.
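As a side note on what "improper versus proper mixture" means operationally, here is a minimal numpy sketch of my own (a Bell pair, nothing more): the reduced density matrix of one half of the entangled pair is numerically identical to the density matrix of a coin-flip ensemble of |0> and |1>, so no local measurement statistics can distinguish them. Treating the former as if it were the latter is exactly the extra assumption being discussed.

[code]
import numpy as np

# Bell state (|00> + |11>)/sqrt(2)
bell = np.zeros(4, dtype=complex)
bell[0] = bell[3] = 1 / np.sqrt(2)
rho_AB = np.outer(bell, bell.conj())

# Improper mixture: reduce the entangled state to subsystem A (partial trace over B)
rho_A = np.trace(rho_AB.reshape(2, 2, 2, 2), axis1=1, axis2=3)

# Proper mixture: a genuine statistical ensemble, |0> half the time and |1> half the time
ket0 = np.array([1, 0], dtype=complex)
ket1 = np.array([0, 1], dtype=complex)
rho_ensemble = 0.5 * np.outer(ket0, ket0.conj()) + 0.5 * np.outer(ket1, ket1.conj())

print(np.allclose(rho_A, rho_ensemble))   # True: the same operator, with a different provenance
[/code]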
 
  • #114
gill1109 said:
Does Ballentine tell you how to prepare a quantum system in the pure state corresponding to the eigenstate of some observable? If so, he has a collapse postulate but he gives it another name, so as not to upset people's feelings.

He doesn't, nor does he have to.

He simply associates a state with a preparation procedure.

If you have a filtering-type observation, all you have done is prepare the system differently, so of course it is in a different state.

However that's Ballentine.

I go further and base QM on one axiom:
An observation/measurement with possible outcomes i = 1, 2, 3 ... is described by a POVM Ei such that the probability of outcome i is determined by Ei, and only by Ei, in particular it does not depend on what POVM it is part of.

A state is not even mentioned. Gleason's theorem is then applied to show that a positive operator of unit trace P exists such that the probability is Trace (P Ei). The state is simply an aid to calculation in predicting the probabilities of outcomes. Why should it be surprising that such an object changes when a system is prepared differently?
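To illustrate what "determined by Ei, and only by Ei" buys you, here is a minimal numpy sketch of my own (arbitrary numbers, not Ballentine's): once the probability is Trace(P Ei), the same effect E automatically receives the same probability no matter which POVM it is embedded in - which is precisely the non-contextuality in the axiom.

[code]
import numpy as np

# A fixed state: a positive operator of unit trace (arbitrary for illustration)
P = np.array([[0.7, 0.2], [0.2, 0.3]], dtype=complex)

# One effect E, embedded in two different POVMs
E = np.array([[0.5, 0.0], [0.0, 0.2]], dtype=complex)
povm1 = [E, np.eye(2) - E]                                   # {E, I - E}
povm2 = [E, 0.5 * (np.eye(2) - E), 0.5 * (np.eye(2) - E)]    # {E, (I-E)/2, (I-E)/2}

prob1 = np.trace(P @ povm1[0]).real
prob2 = np.trace(P @ povm2[0]).real
print(prob1, prob2)                                  # identical: the probability depends on E only
print(sum(np.trace(P @ Ei).real for Ei in povm1),    # each POVM's probabilities sum to 1
      sum(np.trace(P @ Ei).real for Ei in povm2))
[/code]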

Thanks
Bill
 
  • #115
atyy said:
Yes, I am saying that the minimal ensemble/statistical/Copenhagen interpretation must explicitly or implicitly include collapse or an equivalent axiom in order to be considered correct quantum mechanics. Actually, Copenhagen explicitly includes collapse, which is the Born rule in the form that the probability to observe a state ##|a\rangle## given that the system is in state ##|\psi\rangle## is ##|\langle a|\psi\rangle|^2##.

The Ensemble interpretation without collapse usually says that the probability to observe the eigenvalue corresponding to state ##|a\rangle## given that the system is in state ##|\psi\rangle## is ##|\langle a|\psi\rangle|^2##. Thus this form of the Born rule without collapse doesn't give you the probability of the sub-ensembles that are formed. Yet we know that the probability of obtaining sub-ensemble ##|a\rangle## after a measurement is ##|\langle a|\psi\rangle|^2##. The Born rule without collapse is unable to make this prediction.
Without a way to map preparation procedures to the mathematical things that are supposed to represent them in the theory, we don't have a theory. So yes, we need something like the rule that says that after a non-destructive measurement of an observable ##A## with non-degenerate result ##a##, the state is ##|a\rangle##.

I just don't approve of the term "collapse". That's a term invented by people who were thinking about pure states as representing what you called the system's "real state". It strongly suggests that we're talking about some kind of physical process that isn't part of standard QM, and that changes the "real state" from an arbitrary state vector to an eigenvector.

People who don't think about pure states this way find this pretty absurd. But I think they too sometimes use terminology that's too suggestive. For example, someone said that collapse is just a selection of a subensemble on which to perform measurements. (For example, split a beam in two with a Stern-Gerlach magnet, and use only one beam in the experiment). This makes even less sense to me, unless we assume that every particle has a well-defined position at all times, regardless of its wavefunction.

I prefer to be more neutral about what's actually happening, and just point out that we're only talking about a mapping between preparation procedures and states. If we view the rule as only approximate, then it's perfectly consistent with the rest of QM (because of decoherence).
 
  • #116
Fredrik said:
Without a way to map preparation procedures to the mathematical things that are supposed to represent them in the theory, we don't have a theory. So yes, we need something like the rule that says that after a non-destructive measurement of an observable ##A## with non-degenerate result ##a##, the state is ##|a\rangle##.

It follows from continuity and the Born rule - it's not a separate axiom.

All a change in it means is the system was prepared differently.

Thanks
Bill
 
  • #117
bhobba said:
It follows from continuity and the Born rule - it's not a separate axiom.

All a change in it means is the system was prepared differently.

Thanks
Bill
Please tell me your two rules, Bill. I am not familiar with Ballentine. Obviously I should be ... but sorry. I learned some kind of QM from many sources including Nielsen and Chuang. If your two Ballentine axioms allow me to build a quantum computer, then they are enough, and they include a "hidden" version of "collapse". It's just not named as such. As a mathematician, I couldn't care less what you name and what you don't name and what names you use. Presumably the theory is the same and the mapping between theory and the lab is the same ... unless Ballentine is actually incomplete and actually needs another axiom, but didn't realize it.

You can obviously make "collapse" redundant as long as you have "Born law for measurement" and you allow "preparation of a particle in an eigenstate". "Collapse" is just those two operations strung together, not taking any notice of the measurement outcome. If we don't care about what is actually going on (that's for philosophers, or even, according to some philosophies - e.g. Bohr's - a total waste of time), then it doesn't matter what names we give to things and what basic operations we put into the mathematical formalism, as long as we can generate all the things we actually meet in the lab by putting those operations together in any way we like.
 
  • #118
bhobba said:
It follows from continuity and the Born rule - it's not a separate axiom.

All a change in it means is the system was prepared differently.

Thanks
Bill

The point is that you need an additional assumption. Here by continuity you mean "for projective measurements, immediate repetition of the measurement yields the same result". So the basic point is that Ballentine is missing a postulate, whether it is collapse, or the equivalence of proper and improper mixtures, or that immediate repetition of a projective measurement gives the same result. It's not that I want to nitpick Ballentine, except that he clearly ridicules collapse in his book and makes it seem as if he can do away with it in his Ensemble Interpretation without adding an equivalent postulate.
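A minimal numerical sketch of that point, for what it's worth (my own toy qubit example, with an arbitrary state): with the projection postulate, an immediately repeated projective measurement reproduces its first outcome with probability 1; with no update rule at all, the formalism would assign the second measurement the original Born distribution, which contradicts repeatability.

[code]
import numpy as np

psi = np.array([0.6, 0.8], dtype=complex)      # initial state (arbitrary)
P0 = np.diag([1.0, 0.0]).astype(complex)       # projector onto outcome "0"

p_first = np.vdot(psi, P0 @ psi).real          # Born rule: P(0) = 0.36

# With collapse: update the state after obtaining outcome "0", then measure again
psi_after = P0 @ psi
psi_after = psi_after / np.linalg.norm(psi_after)
p_repeat_with_collapse = np.vdot(psi_after, P0 @ psi_after).real    # = 1.0

# With no update rule: the second measurement is assigned the same old distribution
p_repeat_without_collapse = np.vdot(psi, P0 @ psi).real             # = 0.36, i.e. not repeatable

print(p_first, p_repeat_with_collapse, p_repeat_without_collapse)
[/code]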
 
  • #119
Fredrik said:
Without a way to map preparation procedures to the mathematical things that are supposed to represent them in the theory, we don't have a theory. So yes, we need something like the rule that says that after a non-destructive measurement of an observable ##A## with non-degenerate result ##a##, the state is ##|a\rangle##.

I just don't approve of the term "collapse". That's a term invented by people who were thinking about pure states as representing what you called the system's "real state". It strongly suggests that we're talking about some kind of physical process that isn't part of standard QM, and that changes the "real state" from an arbitrary state vector to an eigenvector.

People who don't think about pure states this way find this pretty absurd. But I think they too sometimes use terminology that's too suggestive. For example, someone said that collapse is just a selection of a subensemble on which to perform measurements. (For example, split a beam in two with a Stern-Gerlach magnet, and use only one beam in the experiment). This makes even less sense to me, unless we assume that every particle has a well-defined position at all times, regardless of its wavefunction.

I prefer to be more neutral about what's actually happening, and just point out that we're only talking about a mapping between preparation procedures and states. If we view the rule as only approximate, then it's perfectly consistent with the rest of QM (because of decoherence).

OK, I think we basically agree. I don't really care what it's named, some people prefer "state reduction". And yes, once we take this Heisenberg cut in the Copenhagen interpretation, we are agnostic as to whether the quantum state is a "real state".

The only quibble is that I think the collapse postulate is as exact as the Born rule itself. But this is quibbling, since the ingredients that go into the Born rule - like the Heisenberg cut, the notion of a measuring device, and the notion of a definite outcome - are fuzzy.
 
  • #120
atyy said:
Collapse is only needed for successive measurements (successive irreversible marks on the detector on the macroscopic side of the cut).
If we can put the cut wherever we want to, why not include the detectors on the microscopic side? This would mean that the need for collapse depends on the description, so collapse would not be a fundamental notion but a tool we use in the calculations.

atyy said:
So if we can push the time that we see the irreversible mark back so that there is only one measurement, there is no need to have collapse.
Now this sounds more like the crucial difference is the observer. If the observer sets up two spin measurement apparatuses and just looks at the final result, no collapse is needed. If he checks the intermediate result, collapse is needed. Do I get you right here?
 
  • #122
Fredrik said:
Without a way to map preparation procedures to the mathematical things that are supposed to represent them in the theory, we don't have a theory. So yes, we need something like the rule that says that after a non-destructive measurement of an observable ##A## with non-degenerate result ##a##, the state is ##|a\rangle##.
What I really can't tell from reading your posts: Do you think this is something weird i.e. weirder than the situation in classical mechanics?
 
  • #123
kith said:
What I really can't tell from reading your posts: Do you think this is something weird i.e. weirder than the situation in classical mechanics?
I do. The reason I'm a bit against "interpretations" is that I don't think an explanation for that weirdness is hidden inside QM. The best-case scenario is that it can be explained by a new theory. The worst-case scenario is that it's beyond the reach of scientific methods, and therefore fundamentally unknowable.

Edit: I think I didn't read your question carefully enough. I read it as "do you think there's something weird in QM?", but you're quoting me and using the word "this", so you were asking specifically if I think that it's weird that we can prepare pure states through non-destructive measurements. That's a bit harder to answer, because now I have the option to focus on the decoherence argument that sort of explains it. But this argument relies on QM, so it takes us back to the fundamental mystery of "what's really happening to the system?".
 
  • #124
Fredrik said:
I read it as "do you think there's something weird in QM?", but you're quoting me and using the word "this", so you were asking specifically if I think that it's weird that we can prepare pure states through non-destructive measurements. That's a bit harder to answer, because now I have the option to focus on the decoherence argument that sort of explains it. But this argument relies on QM, so it takes us back to the fundamental mystery of "what's really happening to the system?".
This was kind of my point. You said that it is essential for a theory to relate preparation procedures with elements of the theory. So this can't really be the weird thing. Another candidate for the weirdness is that the relation of preparation procedures and elements of the theory seems to contradict the time evolution law of the theory. But if we include all relevant parts of the experiment in the quantum description we aren't led to such a contradiction.

So I am inclined to agree with your last sentence that the basic weirdness is "what's really happening to the system?". But does a scientific theory have foundational issues if it doesn't answer this question? After all, I wouldn't call this a scientific question. And QM even answers the question what we get if we apply a simple realistic interpretation: Many Worlds. (I am a bit unsure about whether a full-blown MWI has fundamental issues or not, however)
 
  • #125
kith said:
If we can put the cut wherever we want to, why not include the detectors on the microscopic side? This would mean that the need for collapse depends on the description, so collapse would not be a fundamental notion but a tool we use in the calculations.

What I'm saying is that the state is just a tool we use in our calculations, and collapse is as fundamental as the state as a tool.

kith said:
Now this sounds more like the crucial difference is the observer. If the observer sets up two spin measurement apparatuses and just looks at the final result, no collapse is needed. If he checks the intermediate result, collapse is needed. Do I get you right here?

Yes.
 
  • #126
atyy said:
What I'm saying is that the state is just a tool we use in our calculations, and collapse is as fundamental as the state as a tool.
The state is present in all descriptions, collapse not.

atyy said:
Yes.
Ballentine comments on this in chapter 9.3, where he criticizes different notions of collapse. He writes:
"(ii) The observer causes the reduction process (9.9) when he reads the result of the measurement from the apparatus.
This is really just a variant of (i) with the observer, rather than the apparatus, causing the disturbance, and it is refuted simply by redefining (II) [the measurement apparatus] to include both the apparatus and the observer."

/edit: just to state this clearly - does your notion of collapse imply that consciousness causes collapse?
 
  • #127
vanhees71 said:
Again, I'm too naive to understand the necessity of a collapse at all! Take the "classical" situation of throwing a die. Without further knowledge about its properties, I use the maximum-entropy method to assign a probability distribution. Of course, the least-prejudiced distribution in the Shannon-Jaynes information-theoretical sense is that the probability of any particular value is [itex]p_k=1/6[/itex], [itex]k \in \{1,2,3,4,5,6 \}[/itex].

Now I throw the die once and get "3". Has my probability distribution now somehow collapsed to [itex]p_3=1[/itex] with all other [itex]p_k=0[/itex]? I don't think that anybody would argue that way.

Of course, my probability distribution was based on incomplete knowledge about the die. In this case I assumed I knew nothing about it and used the maximum-entropy principle to predict a probability distribution based on the objective principle of "least prejudice" (which of course has a well-defined meaning). Now, how do I check this prediction? Of course, I have to throw the die many times, independently and always under the same conditions, and compile statistics of how often each outcome occurs. I may then come to the conclusion that the die is somehow biased towards a different distribution, and henceforth I use a new probability distribution based on the gained (statistical) knowledge about the die. Would you now say that the probability distribution has somehow collapsed to the new one, and that this is a physical process involving the die? I don't think anybody would argue that way.

Sure, it has long been considered attractive to try to interpret collapse in this way. Cohen-Tannoudji, Diu and Laloë introduce collapse partly in this way.

vanhees71 said:
This kind of argument is made only in quantum theory. There you also have a very precise definition of how to associate probability distributions with the outcomes of measurements, given a certain (equivalence class of) preparation procedures for the (quantum) system under investigation. E.g., if you prepare a complete set of observables to have determined values, you have prepared the system in a unique pure state (represented by a ray in Hilbert space). Now I perform a measurement of some other observable. In the following, let's assume that we perform an ideal von Neumann filter measurement.

Is some collapse now occurring, in the sense of some flavors of the Copenhagen interpretation? If so, when does it occur?

Note that there are two possible scenarios: (a) I measure the quantity and filter out all systems having a certain given value of this observable. According to standard quantum theory, I have to describe such a system by a pure state, corresponding to the eigenvector given by the projection of the original state onto the eigenspace of the measured eigenvalue. Has the state now collapsed to the new pure state, and if so, when and how does this collapse occur?

Collapse occurs when you make a measurement. The problem does not enter with collapse; it already enters with the notion of measurement, because we can ask: when does a measurement occur?

vanhees71 said:
(b) I measure the quantity in the sense of an ideal filter measurement but do not take notice of the measured outcome. Then, again according to standard quantum theory in the minimal interpretation, I associate a mixed state with the newly prepared system. Has the state now collapsed to the new mixed state?

I'd answer "no" to all these collapse questions. For me, the association of states is an objective association of mathematical objects, like state kets or statistical operators, with a known preparation procedure for the quantum system. This gives me certain probabilities for finding a certain value of any observable of the system, implying that the value of this observable is indeterminate if the system is not in an eigenstate of the self-adjoint operator associated with this observable. The measurement is simply an interaction with a measurement apparatus which somehow stores the value of the measured observable so that it can be read off. Nothing happens other than these physical interactions, and nothing collapses, in the Copenhagen sense of a physical process in nature.

It may be that the system can be followed further after the measurement (e.g., when an ideal filter measurement has been performed), and the interaction with the measurement apparatus (together with any associated filtering process) can be seen as a new preparation of the system, which I can continue to describe quantum mechanically in terms of a (pure or mixed) state. In this latter case there is no collapse either, just the application of well-defined rules for describing our knowledge about the system, which can be checked experimentally. That's all physics is about: a description of our possible objective knowledge about physical systems. No more and no less. Any further questions some philosophers pursue, such as an "ontology of quantum systems", are metaphysics and not my business as a physicist. This I leave to the philosophers!

Yes, for a non-selective measurement in which you do not take note of the outcome, the Born rule with collapse is not required. It can be modeled by deterministic unitary time-evolution and decoherence. For selective measurements, the time evolution is probabilistic, and collapse must be invoked if a sequence of measurements is described.

vanhees71 said:
More often than not you even destroy the system you measure; e.g., when photons are used as quantum systems, they usually end up absorbed by the detector. Are you then saying that a collapse must occur, even though there is no photon left to be described in any way? I don't think so. I also leave it to the philosophers to determine what the "ontology of photons" is beyond the description we physicists use in terms of QED.

The generalization of projective measurements, including collapse, is the measurement-operator (Kraus) rule associated with POVMs. This rule, which includes collapse, correctly gives the lack of a photon after the measurement, because the system is left in the ground state (the vacuum) of the photon field of QED.
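A toy version of such a destructive measurement (my own sketch in numpy, truncating the mode to at most one photon): the measurement operator for "a click" maps the one-photon state to the vacuum, the outcome probability comes from the associated POVM element, and the post-measurement state correctly contains no photon.

[code]
import numpy as np

# Truncated photon-number basis {|0>, |1>} for a single mode
vac = np.array([1, 0], dtype=complex)   # vacuum
one = np.array([0, 1], dtype=complex)   # one photon

# Measurement (Kraus) operators of an ideal destructive detector:
#   "click":    K1 = |0><1|  (photon absorbed, field left in the vacuum)
#   "no click": K0 = |0><0|
K1 = np.outer(vac, one.conj())
K0 = np.outer(vac, vac.conj())
assert np.allclose(K0.conj().T @ K0 + K1.conj().T @ K1, np.eye(2))   # completeness

psi = (vac + one) / np.sqrt(2)          # superposition of 0 and 1 photons (arbitrary)

p_click = np.vdot(K1 @ psi, K1 @ psi).real     # POVM element E1 = K1^dag K1 gives 0.5
post_click = K1 @ psi
post_click = post_click / np.linalg.norm(post_click)

print(p_click)       # 0.5
print(post_click)    # [1.+0.j 0.+0.j]: the vacuum -- no photon is left after the detection
[/code]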
 
  • #128
kith said:
The state is present in all descriptions, collapse not.

Yes, in the sense that if successive measurements are not performed, collapse is not needed. So I agree collapse is not needed if we never describe successive measurements, and we do not predict that a projective measurement can be used to prepare a state. Ballentine, unfortunately, says that filtering measurements can be used to prepare a state, so he is wrong to reject collapse without replacing it with an equivalent postulate.

kith said:
Ballentine comments on this in chapter 9.3, where he criticizes different notions of collapse. He writes:
"(ii) The observer causes the reduction process (9.9) when he reads the result of the measurement from the apparatus.
This is really just a variant of (i) with the observer, rather than the apparatus, causing the disturbance, and it is refuted simply by redefining (II) [the measurement apparatus] to include both the apparatus and the observer."

/edit: just to state this clearly - does your notion of collapse imply that consciousness causes collapse?

I wouldn't object to that. If we follow, say, Landau and Lifshitz, then measurement - the extraction of a definite outcome by the macroscopic apparatus on one side of the Heisenberg cut from the quantum system on the other side of the cut - is what causes collapse. Since I don't know whether a non-conscious observer would draw a Heisenberg cut, it can be argued that consciousness is indirectly responsible for collapse.

To test this, one would have to build a robot that can do quantum mechanics, and then decide whether that robot is conscious:)
 
  • #129
kith said:
So I am inclined to agree with your last sentence that the basic weirdness is "what's really happening to the system?". But does a scientific theory have foundational issues if it doesn't answer this question?
In my opinion, no. A set of statements about the real world has to be falsifiable in order to be considered a theory, but I think that's it. It doesn't have to describe what's happening between state preparation and measurement. It's great if it does, but OK if it doesn't.
 
  • #130
atyy said:
Yes, in the sense that if successive measurements are not performed, collapse is not needed.
But whether successive measurements are performed or not depends on the description.

Let's assume we have an observer A who puts the Heisenberg cut between a spin system and two SG apparatuses. He performs successive measurements and needs collapse. We, as observers B, use a different cut, which includes the system, the apparatuses and observer A in the quantum domain. Our description doesn't need collapse because we don't perform successive measurements. Both descriptions refer to the same physical situation. One description needs collapse and one doesn't. In this case, collapse wouldn't be a fundamental notion.
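To see that the two descriptions really do agree on everything observable, here is a minimal numpy sketch of my own (two successive qubit "spin" measurements modelled in the simplest possible way; the bases and the initial state are arbitrary choices): observer A's account (collapse after the first measurement, Born rule again for the second) and observer B's account (couple each apparatus as an ancilla unitarily and read everything off only at the end) give identical joint statistics.

[code]
import numpy as np
from itertools import product

psi = np.array([0.6, 0.8], dtype=complex)   # initial spin state (arbitrary)
z = [np.array([1, 0], dtype=complex), np.array([0, 1], dtype=complex)]   # first measured basis
x = [np.array([1, 1], dtype=complex) / np.sqrt(2),
     np.array([1, -1], dtype=complex) / np.sqrt(2)]                      # second measured basis

# Observer A: collapse after the first measurement, then the Born rule for the second
probA = {(i, j): abs(np.vdot(z[i], psi))**2 * abs(np.vdot(x[j], z[i]))**2
         for i, j in product(range(2), repeat=2)}

# Observer B: both "apparatuses" are ancilla qubits coupled by isometries;
# nothing is read out until the very end
V1 = sum(np.kron(np.outer(z[i], z[i].conj()), z[i][:, None]) for i in range(2))    # sys -> sys*A1
V2 = sum(np.kron(np.kron(np.outer(x[j], x[j].conj()), np.eye(2)), z[j][:, None])   # -> sys*A1*A2
         for j in range(2))
final = (V2 @ (V1 @ psi)).reshape(2, 2, 2)          # axes: (system, apparatus A1, apparatus A2)
probB = {(i, j): float(np.sum(np.abs(final[:, i, j])**2))
         for i, j in product(range(2), repeat=2)}

for key in sorted(probA):
    print(key, round(probA[key], 6), round(probB[key], 6))   # identical joint statistics
[/code]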
 
  • #131
kith said:
But whether successive measurements are performed or not depends on the description.

Let's assume we have an observer A who puts the Heisenberg cut between a spin system and two SG apparatuses. He performs successive measurements and needs collapse. We, as observers B, use a different cut, which includes the system, the apparatuses and observer A in the quantum domain. Our description doesn't need collapse because we don't perform successive measurements. Both descriptions refer to the same physical situation. One description needs collapse and one doesn't. In this case, collapse wouldn't be a fundamental notion.

Sure, if one agrees never to describe successive measurements, and if one agrees never to use measurement as a means of state preparation, then collapse is not needed as a fundamental notion.
 
  • #132
atyy said:
Sure, if one agrees never to describe successive measurements, and if one agrees never to use measurement as a means of state preparation, then collapse is not needed as a fundamental notion.
My point was that all successive measurements can be described in terms of a single measurement. I don't actually have to use such a description.
 
  • #133
kith said:
My point was that all successive measurements can be described in terms of a single measurement. I don't actually have to use such a description.

Sure. So I should say that if one wants to describe successive measurements, and use measurement as a means of state preparation, as Ballentine does, then collapse is needed as a fundamental postulate. Of course this is subjective, since quantum mechanics is subjective. It's just a tool for the observer to use.

An analogy is equilibrium statistical mechanics. The notion of equilibrium is subjective, since we believe the universe is expanding. If one does not observe equilibrium, then there is no need for equilibrium statistical mechanics.

To be a little speculative, let me suggest that the analogy with equilibrium statistical mechanics can be seen in both Bohmian mechanics and Many-Worlds. Bohmian mechanics depends on the concept of a quantum equilibrium, which presumably is subjective in the same way that equilibrium in classical statistical mechanics is. In Many-Worlds (let's assume it works just for discussion), collapse is transformed into branching which occurs when decoherence occurs. Since decoherence is never perfect, there must be an element of subjectivity as to when branching occurs.
 
  • #134
atyy said:
Sure. So I should say that if one wants to describe successive measurements, and use measurement as a means of state preparation, as Ballentine does, then collapse is needed as a fundamental postulate.
I still don't agree with the fundamental postulate part. If I can get rid of something by using an equally valid description, I don't consider this something fundamental to the theory. We probably won't reach an agreement here.
 
  • #135
atyy said:
Since decoherence is never perfect, there must be an element of subjectivity as to when branching occurs.
I think the very notion of a branch depends on the decomposition of the universal Hilbert space which in turn depends on the observer's definition of systems. Maybe the only way to avoid this subjectivity is to take every one dimensional subspace as a world (as Fredrik suggested in an older thread).
 
  • #136
kith said:
I still don't agree with the fundamental postulate part. If I can get rid of something by using an equally valid description, I don't consider this something fundamental to the theory. We probably won't reach an agreement here.

That's fine. I don't think this is a technical disagreement. There's a similar problem with the Bell tests. When does Bob consider Alice's measurement to be complete? Common sense says that Bob considers Alice's measurement complete when it was done at spacelike separation. However, the result was not definite for Bob at spacelike separation. So can Bob say that Alice, and her record of the result, became definite for him only when she showed him the results at non-spacelike separation? It seems he could say that, in which case there are no Bell tests. Basically, Bob can choose his Heisenberg cut so that Alice is quantum until she shows him the result.

Apparently this loophole was known to Einstein. In Wiseman's words http://arxiv.org/abs/quant-ph/0509061:
"None of this appears much different from Einstein’s 1935 arguments. But here for the first time he also stated: [3] (p. 85)

One can escape from this conclusion [that statistical quantum theory is incomplete] only by either assuming that the measurement of S1 (telepathically) changes the real situation of S2 or by denying independent real situations as such to things which are spatially separated from each other. Both alternatives appear to me equally unacceptable.

Omitting the opinion (clearly stated as such) about what was acceptable in a physical theory, the logical deduction to which Einstein came in 1946 was that one of the following is false:
(i) the completeness of statistical QM
(ii) locality (that is, the postulates of relativity)
(iii) the independent reality of distant things."
 
  • #137
gill1109 said:
Please tell me your two rules, Bill. I am not familiar with Ballentine. Obviously I should be ... but sorry.

No need to be sorry :thumbs::thumbs::thumbs::thumbs:

It is however the BEST book on QM I know, fixing up many issues and misconceptions, and it is very well thought of by many who post here - not atyy though; he has issues with it, but it's best if he explains them himself.

Since your background is math, and mine is as well, I will build up to the two axioms in a slightly different way than Ballentine does.

First we need to define a Positive Operator Valued Measure (POVM). A POVM is a set of positive operators Ei with ∑ Ei = I, defined, for the purposes of QM, on an assumed complex Hilbert space.

Elements of POVMs are called effects, and it's easy to see that a positive operator E with Trace(E) <= 1 is an effect.

Now I can state the single foundational axiom QM is based on, in the way I look at it, which is a bit different from Ballentine, who simply states the axioms without a discussion of why they are true - it's interesting that it can be reduced to basically just one. Of course there is more to QM than just one axiom - but the rest follows in a natural way.

An observation/measurement with possible outcomes i = 1, 2, 3 ... is described by a POVM Ei such that the probability of outcome i is determined by Ei, and only by Ei, in particular it does not depend on what POVM it is part of.

Now I will invoke a very beautiful theorem, which is a modern version of a famous theorem you may have heard of, called Gleason's, and I will in fact prove it.

"Only by Ei" means that, regardless of which POVM the Ei belongs to, the probability is the same. This is the assumption of non-contextuality, and it is the well-known rock-bottom essence of Born's rule via Gleason. The other assumption, not explicitly stated but used, is the strong law of superposition, i.e. that in principle any POVM corresponds to an observation/measurement.

I will let f(Ei) be the probability of Ei. Obviously f(I) = 1, since {I} is a POVM containing only one element. Since I + 0 = I, f(0) = 0.

First, additivity of the measure for effects.

Let E1 + E2 = E3, where E1, E2 and E3 are all effects. Then there exists an effect E with E1 + E2 + E = E3 + E = I. Hence f(E1) + f(E2) = f(E3).

Next, linearity with respect to the rationals - it's the standard argument from additivity, but I will repeat it anyway.

f(E) = f(n E/n) = f(E/n + ... + E/n) = n f(E/n), so (1/n) f(E) = f(E/n). Similarly f((m/n) E) = f(E/n + ... + E/n) = m f(E/n), so (m/n) f(E) = f((m/n) E), taking m <= n to ensure we are dealing with effects.

Now I will extend the definition to any positive operator E. If E is a positive operator, an integer n and an effect E1 exist with E = n E1, as is easily seen from the fact that a positive operator with trace <= 1 is an effect. f(E) is then defined as n f(E1). To show this is well defined, suppose n E1 = m E2. Then (n/(n+m)) E1 = (m/(n+m)) E2, so f((n/(n+m)) E1) = f((m/(n+m)) E2), i.e. (n/(n+m)) f(E1) = (m/(n+m)) f(E2), hence n f(E1) = m f(E2).

From the definition it's easy to see that for any positive operators E1, E2 we have f(E1 + E2) = f(E1) + f(E2). Then, just as for effects, one shows that for any rational m/n, f((m/n) E) = (m/n) f(E).

Now we use continuity to extend this to the reals.

If E1 and E2 are positive operators, define E2 < E1 to mean that a positive operator E exists with E1 = E2 + E. This implies f(E2) <= f(E1). Let r1n be an increasing sequence of rationals whose limit is the irrational number c, and let r2n be a decreasing sequence of rationals whose limit is also c. If E is any positive operator, then r1n E < c E < r2n E, so r1n f(E) <= f(cE) <= r2n f(E). Thus, by the squeeze (pinching) theorem, f(cE) = c f(E).

Extending it to any Hermitian operator H.

H can be decomposed as H = E1 - E2, where E1 and E2 are positive operators, for example by separating the positive and negative eigenvalues of H. Define f(H) = f(E1) - f(E2). To show this is well defined: if E1 - E2 = E3 - E4, then E1 + E4 = E3 + E2, so f(E1) + f(E4) = f(E3) + f(E2), i.e. f(E1) - f(E2) = f(E3) - f(E4). Actually there was no need to show uniqueness, because I could have defined E1 and E2 to be the positive operators obtained by separating the eigenvalues, but what the heck - it's not hard to show uniqueness.

It's easy to show linearity with respect to the reals under this extended definition.

It's pretty easy to see the pattern here, but just to complete it I will extend the definition to any operator O. O can be uniquely decomposed as O = H1 + i H2, where H1 and H2 are Hermitian. Define f(O) = f(H1) + i f(H2). Again it's easy to show linearity with respect to the reals under this new definition, and then to extend it to linearity with respect to complex numbers.

Now the final bit. The hard part - namely linearity with respect to any operator - has been done by extending the f defined on effects. The well-known von Neumann argument can now be used to derive Born's rule, but for completeness I will spell out the details.

First, it's easy to check that, for any orthonormal basis |bi>, <bi|O|bj> = Trace (O |bj><bi|).

O = ∑ <bi|O|bj> |bi><bj| = ∑ Trace (O |bj><bi|) |bi><bj|

Now we use the linearity that the foregoing extensions of f have given us.

f(O) = ∑ Trace (O |bj><bi|) f(|bi><bj|) = Trace (O ∑ f(|bi><bj|)|bj><bi|)

Define P as ∑ f(|bi><bj|)|bj><bi| and we have f(O) = Trace (OP).

P, by definition, is called the state of the quantum system. The following are easily seen. Since f(I) = 1, Trace (P) = 1, so P has unit trace. f(|u><u|) is a number >= 0 since |u><u| is an effect, so Trace (|u><u| P) = <u|P|u> >= 0, and hence P is positive.

Hence a positive operator of unit trace P, the state of the system, exists such that the probability of Ei occurring in the POVM E1, E2 ... is Trace (Ei P).

To derive Ballentine's two axioms we need to define what is called a resolution of the identity, which is a POVM whose elements are mutually orthogonal projectors. Such observations are called von Neumann observations. We know from the spectral theorem that a Hermitian operator H can be uniquely decomposed in terms of a resolution of the identity, H = ∑ yi Ei. So, given any observation based on a resolution of the identity Ei, we can associate a real number yi with each outcome and uniquely define a Hermitian operator O = ∑ yi Ei, called the observable of the observation.

This gives the first axiom found in Ballentine - but the wording I will use is slightly different because of the way I have presented it, which differs from Ballentine's, e.g. he doesn't point out that he is talking about von Neumann measurements. Measurements in general are wider than that, although all measurements can be reduced to von Neumann measurements by considering a probe interacting with the system - but that is another story.

Axiom 1
Associated with each von Neumann measurement we can find a Hermitian operator O, called the observation's observable, such that the possible outcomes of the observation are its eigenvalues yi.

Axiom 2 - called the Born Rule
Associated with any system is a positive operator of unit trace, P, called the state of the system, such that the expected value of the outcomes of the observation is Trace (PO).

Axiom 2 is easy to see from what I wrote previously: E(O) = ∑ yi probability(Ei) = ∑ yi Trace (P Ei) = Trace (PO).

Now using these two axioms Ballentine develops all of QM.

A word of caution, however. Other axioms are introduced as you go - but they occur in a natural way. Schrödinger's equation is developed from the requirement that probabilities be invariant between frames, i.e. the principle of relativity. That the state after a filtering-type observation is the corresponding eigenvector of the observable is a consequence of continuity.

Obviously a lot more can be said, but I will leave it there for now - it's a lot to digest already.
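If anyone wants to see the end result in numbers rather than symbols, here is a quick numpy sketch of my own (random state, random resolution of the identity - not part of Ballentine's presentation): Trace(P Ei) really does give a probability distribution over the outcomes, and summing yi Trace(P Ei) reproduces Trace(P O) for the observable O = ∑ yi Ei, which is Axiom 2.

[code]
import numpy as np

rng = np.random.default_rng(0)

def random_state(d):
    """A random positive operator of unit trace (a density matrix)."""
    A = rng.normal(size=(d, d)) + 1j * rng.normal(size=(d, d))
    P = A @ A.conj().T
    return P / np.trace(P)

def random_resolution_of_identity(d):
    """Rank-1 projectors onto a random orthonormal basis; they sum to I."""
    A = rng.normal(size=(d, d)) + 1j * rng.normal(size=(d, d))
    Q, _ = np.linalg.qr(A)
    return [np.outer(Q[:, i], Q[:, i].conj()) for i in range(d)]

d = 4
P = random_state(d)
E = random_resolution_of_identity(d)
y = rng.normal(size=d)                   # the eigenvalues assigned to the outcomes
O = sum(y[i] * E[i] for i in range(d))   # the observable O = sum_i y_i E_i

probs = np.array([np.trace(P @ Ei).real for Ei in E])
print(bool(np.all(probs >= -1e-12)), round(float(probs.sum()), 10))   # a genuine probability distribution
print(round(float(np.dot(y, probs)), 10),                             # sum_i y_i Trace(P E_i) ...
      round(float(np.trace(P @ O).real), 10))                         # ... equals Trace(P O)
[/code]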

Thanks
Bill
 
  • #138
@kith, I'm not sure if one can escape successive measurements in a relativistic context. If a pair of measurements is done simultaneously at spacelike separation in one frame, then the measurements will be non-simultaneous in a different frame.

Some examples regarding collapse and the relativity of simultaneity:
http://arxiv.org/abs/0706.1232
http://arxiv.org/abs/1007.3977
 
  • #139
kith said:
The state is present in all descriptions, collapse not.


Ballentine comments on this in chapter 9.3, where he criticizes different notions of collapse. He writes:
"(ii) The observer causes the reduction process (9.9) when he reads the result of the measurement from the apparatus.
This is really just a variant of (i) with the observer, rather than the apparatus, causing the disturbance, and it is refuted simply by redefining (II) [the measurement apparatus] to include both the apparatus and the observer."

/edit: just to state this clearly - does your notion of collapse imply that consciousness causes collapse?
So Ballentine does effectively have "collapse"; he just prefers not to name it, and to "hide" it in another postulate. He also has a cut, because he has observers.

/edit Sorry: just saw two more later posts. Important ones!
 
  • #140
gill1109 said:
So Ballentine does effectively have "collapse"; he just prefers not to name it, and to "hide" it in another postulate. He also has a cut, because he has observers.

Just to be sure: kith's quote is Ballentine's rejection of the claim that consciousness causes collapse, which he specifically denies - as he should, since it leads to all sorts of unnecessary complications.

Ballentine specifically associates a state with preparation procedures right from the outset (specifically, with an ensemble of similarly prepared systems) - see section 2.1 - but he goes into more detail in chapter 9, which is where kith got his quote. There he critiques other views, such as collapse and the objective reality of the state. I will also state from the outset that I do not agree with all his criticisms, e.g. he doesn't really understand Copenhagen - but that is a thread in its own right.

His view is that the state divides preparation procedures into equivalence classes, and the state is identified with those equivalence classes. For filtering-type observations (observations in which the object is not destroyed, and which are in fact state preparation procedures), the observation simply changes the equivalence class the system belongs to. Of course the state changes, so in that sense collapse has occurred - but it doesn't mean anything; all you have done is change the preparation of the system, so obviously the state changes. It's simply because you have prepared it differently.

Thanks
Bill
 
