Is quantum collapse an interpretation?

In summary: This is correct. The minimal interpretation does not require the assumption of instantaneous collapse.
  • #71
stevendaryl said:
In the case of multiple filters, I don't see how von Neumann filters help explain intuitively what's going on.

You have a sequence of filters, the first oriented at angle 0°, the next at 45°, the next at 90°. Some fraction of the photons, 12.5%, will make it through all 3 filters. You can't understand that in terms of removing photons, because after the first filter you've removed all the photons that were polarized at 90°, but at the end you still have some photons that are polarized at 90°.

I know that you know how to calculate this case, but the words don't actually match up with the facts. At least, not in my mind. The way that you describe von Neumann filters, in terms of throwing out unwanted stuff, sounds like you're selecting based on pre-existing properties, but that's not the case.
Well, isn't this very simple? In my opinion you should even introduce the Hilbert-space structure with this example (although I'd prefer the spin of a massive particle since photons shouldn't be treated in QM 1).

Take a single-photon source (e.g., a down-converter crystal and a laser, absorbing one of the two photons; this gives you perfectly unpolarized single photons). Then run the corresponding beam through a polarizer oriented at the angle ##\alpha=0##. This prepares single photons in the state
$$\hat{\rho}_1=|\alpha=0 \rangle \langle \alpha=0|.$$
According to Born's rule you get on average half the photons through.

Now put another polarizer at angle ##\alpha \neq 0## into the remaining beam of photons. Then the probability for each photon to get through is
$$\langle \alpha|\hat{\rho}_1|\alpha \rangle=|\langle \alpha|\alpha=0 \rangle|^2=\cos^2 \alpha.$$
That's it according to QT. You can't know more about the polarization of the photons. Of course the remaining photons are then in the state
$$\hat{\rho}_2=|\alpha \rangle \langle \alpha|,$$
because by construction you've made a preparation through a von Neumann filter measurement. Of course in each step you get fewer photons, and then you normalize the state again. So in the experiment with the two polarizers in series you get only about ##(1/2) \cos^2 \alpha## of the original intensity (of the original number of single photons).

The same holds true for a third filter. As long as the relative angle between two successive filters is not ##\pi/2##, you have some probability to get photons through, namely ##\cos^2(\alpha_1-\alpha_2)##, where ##\alpha_1## and ##\alpha_2## are the angles of the polarizer orientations (relative to some fixed axis, of course).
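For concreteness, these numbers can be checked in a few lines of numpy; this is just an illustrative sketch of the bookkeeping described above (the helper names are arbitrary):

[code]
import numpy as np

def ket(theta):
    # Linear-polarization state |alpha = theta> in the H/V basis
    return np.array([np.cos(theta), np.sin(theta)])

def survival(angles):
    # Probability that an initially unpolarized photon passes a
    # sequence of ideal polarizers at the given angles (radians):
    # 1/2 for the first filter, then cos^2 of each relative angle.
    p = 0.5
    for a1, a2 in zip(angles, angles[1:]):
        p *= abs(ket(a1) @ ket(a2))**2  # Born rule: cos^2(a2 - a1)
    return p

print(survival([0.0, np.pi/6]))           # (1/2) cos^2(30 deg) = 0.375
print(survival([0.0, np.pi/4, np.pi/2]))  # 0.125, the three-filter case
print(survival([0.0, np.pi/2]))           # crossed polarizers: 0.0
[/code]

The sequence at ##0##, ##\pi/4##, ##\pi/2## reproduces the 12.5% from the quoted example, while the crossed pair transmits nothing.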

What's a total enigma to me is how you can claim that these simple facts are not matched by the theory. Such experiments are done frequently in any quantum-optics lab, and one would have heard about the failure of QED for such a simple and fundamental experiment with single-photon polarization states.

You seem to have fallen into the trap of considering photons as little (massless) classical billiard-ball-like particles. But that's not so. They are quanta, and thus if they are polarized in direction ##\alpha##, this does not imply that they cannot be found to be polarized in another direction ##\alpha'##, as long as ##\alpha-\alpha' \neq \pm \pi/2##.

By the way, all this of course holds true for classical electromagnetic fields. If you want a classical analogue for photons, it's far better to think of them as ultra-faint electromagnetic radiation than as classical particles. You get almost always the correct results, although there are of course the well-known exceptions, which truly prove the necessity to quantize the em. field, like spontaneous emission, quantum beats, etc.
 
  • #72
zonde said:
Projection does not only change the length of the vector but its direction as well. Blocking the unwanted beam only gives you an intensity drop; it does not change the spin states of the particles in the remaining beam. You cannot split the beam into two beams where in one beam you have half the particles in the spin-up state and in the other beam half the particles in the spin-down state.
I once did the Stern-Gerlach experiment successfully in the compulsory lab course as an undergraduate student. So I proved able to do this ;-)).

I take a Stern-Gerlach apparatus and prepare a beam of particles with ##\sigma_z=+1/2##. Then I consider ##\sigma_x##. As it turns out, in this partial beam I have 1/2 of the particles with ##\sigma_x=+1/2## and the other 1/2 with ##\sigma_x=-1/2##. Of course, this is to be taken in a statistical sense: due to the preparation in a definite ##\sigma_z=+1/2## state, ##\sigma_x## is indeterminate, and the probability for either of the possible outcomes is ##1/2##. So I cannot say for each individual particle what I will find when I measure ##\sigma_x##, because ##\sigma_x## is indeterminate. The minimal interpretation makes all this very simple. You only have to get used to it!
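For what it's worth, this arithmetic takes only a few lines of numpy; a sketch using the standard Pauli matrices (spin in units of ##\hbar/2##):

[code]
import numpy as np

sigma_z = np.array([[1, 0], [0, -1]], dtype=complex)
sigma_x = np.array([[0, 1], [1, 0]], dtype=complex)

psi = np.array([1, 0], dtype=complex)  # prepared beam: |sigma_z = +1/2>

# Born rule for the two possible sigma_x outcomes
evals, evecs = np.linalg.eigh(sigma_x)
for val, vec in zip(evals, evecs.T):
    prob = abs(np.vdot(vec, psi))**2
    print(f"sigma_x outcome {val:+.0f}: probability {prob:.2f}")
# Both probabilities come out 0.50: sigma_x is indeterminate here.
[/code]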
 
  • #73
tom.stoer said:
I mentioned Everett in my first post :-)

Ah, I see.

tom.stoer said:
For a realist it's not the question whether it makes sense but whether it describes reality. Of course this is problematic because you can never (experimentally) falsify a theory which talks about "what is really happening between observations". Nevertheless there are realists who believe our theories describe essential aspects of reality.

By making sense, I mean whether the Born rule can be derived from unitary evolution, as required by the Everett interpretation.
 
  • #74
atyy said:
By making sense, I mean whether the Born rule can be derived from unitary evolution, as required by the Everett interpretation.
Why? It would be nice, but why does it have to be derived? Why couldn't it be a fundamental law of nature on its own?
 
  • #75
vanhees71 said:
I take a Stern-Gerlach apparatus and prepare a beam of particles with ##\sigma_z=+1/2##. Then I consider ##\sigma_x##. As it turns out, in this partial beam I have 1/2 of the particles with ##\sigma_x=+1/2## and the other 1/2 with ##\sigma_x=-1/2##. Of course, this is to be taken in a statistical sense: due to the preparation in a definite ##\sigma_z=+1/2## state, ##\sigma_x## is indeterminate, and the probability for either of the possible outcomes is ##1/2##. So I cannot say for each individual particle what I will find when I measure ##\sigma_x##, because ##\sigma_x## is indeterminate. The minimal interpretation makes all this very simple. You only have to get used to it!

With all due respect, you are using wrong classical notions and conceptions. If anything, you are only allowed to say something like this: “So I cannot say for each individual 'particle' what I will find when I measure ##\sigma_x##, because there is nothing like 'a ##\sigma_x## value' before I perform a measurement.” A superposition means “quantum ignorance”, and within the framework of our classical notions and conceptions there is no language to describe what “quantum ignorance” is.
 
  • #76
No. If I have a quantum system prepared in a pure state, described by a statistical operator which is a projector, ##\hat{\rho}=|\psi \rangle \langle \psi|##, I have complete knowledge about the state of the system. There are some observables that are determined, i.e., have definite values. These are precisely the observables described by self-adjoint operators for which ##|\psi \rangle## is an eigenvector. The corresponding eigenvalue is the definite value this observable takes. Any other observable does not have a determined value, and the preparation in the pure state implies, via Born's rule, the probabilities for finding one of the possible values of the observable when measuring it.

Also it's clear that your idea about "quantum ignorance" doesn't make sense, because which state are you defining to be a superposition and which not? Any vector in Hilbert space can be written as some superposition of other vectors!
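The eigenvector criterion is easy to state operationally. A small numpy sketch (the helper name is hypothetical) that tests whether a pure state assigns a definite value to an observable:

[code]
import numpy as np

def is_determined(psi, A, tol=1e-12):
    # A has a definite value in |psi> iff |psi> is an eigenvector of A,
    # i.e. A|psi> is parallel to |psi>; the candidate eigenvalue is <psi|A|psi>.
    psi = psi / np.linalg.norm(psi)
    a = np.vdot(psi, A @ psi).real
    return np.linalg.norm(A @ psi - a * psi) < tol

sigma_z = np.array([[1, 0], [0, -1]], dtype=complex)
sigma_x = np.array([[0, 1], [1, 0]], dtype=complex)
psi = np.array([1, 0], dtype=complex)  # |sigma_z = +1/2>

print(is_determined(psi, sigma_z))  # True:  definite value +1
print(is_determined(psi, sigma_x))  # False: only Born-rule probabilities
[/code]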
 
  • #77
martinbn said:
Why? It would be nice, but why does it have to be derived? Why couldn't it be a fundamental law of nature on its own?
It can be a law of nature, but not in Everett's context. As I explained, the non-unitary and stochastic collapse contradicts the unitary and deterministic time evolution; they cannot both be real "in the same sense".
 
  • #78
vanhees71 said:
No. If I have a quantum system prepared in a pure state, described by a statistical operator which is a projector, ##\hat{\rho}=|\psi \rangle \langle \psi|##, I have complete knowledge about the state of the system. There are some observables that are determined, i.e., have definite values. These are precisely the observables described by self-adjoint operators for which ##|\psi \rangle## is an eigenvector. The corresponding eigenvalue is the definite value this observable takes. Any other observable does not have a determined value, and the preparation in the pure state implies, via Born's rule, the probabilities for finding one of the possible values of the observable when measuring it.

Also it's clear that your idea about "quantum ignorance" doesn't make sense, because which state are you defining to be a superposition and which not? Any vector in Hilbert space can be written as some superposition of other vectors!

With all due respect, in the end all this says nothing about what “quantum theory” tries to tell us. Using some mathematics doesn't mean one "understands" the semantics. As long as you are “bogged down” in Einstein’s world view, there is no way out.
 
  • #79
tom.stoer said:
It can be a law of nature, but not in Everett's context. As I explained, the non-unitary and stochastic collapse contradicts the unitary and deterministic time evolution; they cannot both be real "in the same sense".
But in Everett's context you don't have a collapse.

Also, you stated this claim a few times but you didn't explain why you cannot have a theory that uses both.
 
  • #80
vanhees71 said:
Well, isn't this very simple?

Yes, the mathematics is simple enough, but it is not consistent with the words you use in describing the minimal interpretation. You have on the one hand, the mathematical description of what's going on, and on the other hand, you have your description of measurements as "selection". The words contradict the mathematics.
 
  • #81
stevendaryl said:
Yes, the mathematics is simple enough, but it is not consistent with the words you use in describing the minimal interpretation. You have on the one hand, the mathematical description of what's going on, and on the other hand, you have your description of measurements as "selection". The words contradict the mathematics.

You can say that the minimal interpretation is just this: if you perform such-and-such an experiment, you will get such-and-such an outcome with such-and-such probability. That's a truly minimal interpretation. But if you go on to make a claim such as:
  • A quantum measurement is just a matter of updating our information.
you've gone beyond the minimal interpretation, and in my opinion, that claim is not justified by anything we know about quantum mechanics.
 
  • #82
martinbn said:
But in Everett's context you don't have a collapse.
And that's the reason why in Everett's Interpretation the Born rule is not a fundamental law of nature but must be derived.

The postulates of Everett's interpretation are (quoted from Sean Carroll)
  1. Quantum states are represented by wave functions, which are vectors in a mathematical space called Hilbert space.
  2. Wave functions evolve in time according to the Schrödinger equation.
That's it.

Collapse to some state is not part of the postulates; therefore the probability for this state isn't, either.

martinbn said:
Also, you stated this claim a few times but you didn't explain why you cannot have a theory that uses both.
I tried to explain this a couple of times. And I said that you can have such a theory: "standard textbook quantum mechanics".

But Everett et al. observed that there are mutually incompatible postulates in "standard textbook quantum mechanics". Therefore the set of postulates has to be reduced to a consistent subset; everything else has to be derived as a theorem.
 
  • #83
The notion that a measurement (of, say, polarization) is a matter of selection is contradicted by the EPR experiment. You produce a pair of correlated photons. You measure the polarization of one photon, photon A, by passing it through a polarizing filter. Suppose you find that it is vertically polarized after the measurement. We can understand what happened to photon A in terms of an interaction between the photon and the polarizing filter. But afterward, we know that the twin photon, B, is also vertically polarized, even though photon B didn't interact with any polarizing filter.

To me the minimal interpretation as described by @vanhees71 is simply self-contradictory when it comes to explaining what's going on. You understand what's happening to the polarization of photon A through local physical interactions. So the fact that photon A is vertically polarized after passing through the filter doesn't mean that it was vertically polarized beforehand. But for photon B, there has been no interaction with anything. So the way to understand photon B is that the measurement is simply a matter of updating information.

That's to me a contradiction. If it's a matter of updating of information, then to me, that means that B's polarization was not changed by the measurement. If B is vertically polarized afterward, and the measurement didn't change its polarization, then it must have been vertically polarized all along. But when it comes to photon A, we're not free to assume that it was vertically polarized all along.

To me, there is something very fishy going on. One resolution is "shut up and calculate", where you don't speculate about what goes on between macroscopic measurements, you only calculate probabilities for measurement outcomes. That's not completely satisfying, but it's consistent. But once you describe what's going on with a measurement as "updating of information", you've gone beyond shut up and calculate, and into interpretation. And I think it's an inconsistent interpretation.
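For the record, the quantum bookkeeping behind this is elementary; a minimal numpy sketch, assuming the usual polarization-entangled state [itex](|HH\rangle + |VV\rangle)/\sqrt{2}[/itex]:

[code]
import numpy as np

H = np.array([1, 0], dtype=complex)  # |H>, horizontal
V = np.array([0, 1], dtype=complex)  # |V>, vertical

bell = (np.kron(H, H) + np.kron(V, V)) / np.sqrt(2)

# Photon A is found vertically polarized: project A's factor onto |V>
P_AV = np.kron(np.outer(V, V.conj()), np.eye(2))
post = P_AV @ bell
p = np.vdot(post, post).real   # probability of that outcome
post = post / np.sqrt(p)       # renormalized post-measurement state

print(p)                                 # 0.5
print(np.allclose(post, np.kron(V, V)))  # True: B's factor is now |V>
[/code]

The disagreement is of course not about this calculation, but about what the projection of B's factor means physically.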
 
  • #84
martinbn said:
Why? It would be nice, but why does it have to be derived? Why couldn't it be a fundamental law of nature on its own?

Pretty much what tom.stoer tried to explain.

The unitary evolution is deterministic, the Born rule is probabilistic, so we cannot apply both at the same time (you can try to apply both at the same time, but you will find that you don't even know which "measurement" to use in applying the Born rule).

So in the standard interpretation, we have either unitary evolution or the Born rule (and collapse), but not both at the same time. Since we have two rules of time evolution, who decides which to use when? The "observer" decides.

But what is an "observer"? Shouldn't the universe and the laws of physics hold even when there were no observers? It is not necessary to answer "yes" - physics is not about what nature is, but what we can say about it. Nonetheless, if we try to answer "yes" to observer-independent physics, then we seem to need something like the Everett interpretation, Bohmian Mechanics, GRW, or retrocausality ...
 
  • #85
atyy said:
So in the standard interpretation, we have either unitary evolution or the Born rule (and collapse), but not both at the same time. Since we have two rules of time evolution, who decides which to use when? The "observer" decides.
Thanks ...

... but let me add the remark that according to Everett et al. the observer is nothing but a quantum system, so i) the law of unitary time evolution should apply to the whole system including the observer state, and ii) it seems strange that the observer system could decide which rules apply to itself.
 
  • #86
stevendaryl said:
The notion that a measurement (of, say, polarization) is a matter of selection is contradicted by the EPR experiment. You produce a pair of correlated photons. You measure the polarization of one photon, photon A, by passing it through a polarizing filter. Suppose you find that it is vertically polarized after the measurement. We can understand what happened to photon A in terms of an interaction between the photon and the polarizing filter. But afterward, we know that the twin photon, B, is also vertically polarized, even though photon B didn't interact with any polarizing filter.
It does not imply that the twin photon is vertically polarized. It means it will be measured as vertically polarized if it is measured in the same basis.
So there is no contradiction that I can see.

I wish you could make your arguments without using photons, which do not have a wave function the way a silver atom (say) does.
 
  • #87
Maybe I misunderstand what your claim is, but to me it seems that you are claiming that such a theory cannot exist, and I don't see anything even remotely resembling a proof. Given that there are attempts such as GRW, your claim seems in need of some justification.

As to Everett, if Born's rule is not part of the postulates, it doesn't follow (although it may happen) that it can be derived from them; it could be logically independent. No one demands that Euclid's fifth postulate be derived from the rest. So couldn't that be the case with Born's rule in Everett?
 
  • #88
atyy said:
Pretty much what tom.stoer tried to explain.

The unitary evolution is deterministic, the Born rule is probabilistic, so we cannot apply both at the same time (you can try to apply both at the same time, but you will find that you don't even know which "measurement" to use in applying the Born rule).

So in the standard interpretation, we have either unitary evolution or the Born rule (and collapse), but not both at the same time. Since we have two rules of time evolution, who decides which to use when? The "observer" decides.

In a different thread, I sketched out a way to formalize the Copenhagen interpretation that made explicit the macro/micro cut.

Because it's easier, and I'm more familiar with it, I'm going to describe this using nonrelativistic quantum mechanics, although there might be an analogous interpretation that is relativistic.

Let's suppose that there is such a thing as the "classical configuration" of the universe. For example, it might be specified by splitting up the universe into cells, and giving the mean particle content, energy, angular momentum, field values, etc., in each cell. Given a classical configuration [itex]C_j[/itex], we can determine everything else that is macroscopic: All measurement results and preparation procedures, etc.

Let's suppose that there is a "wave function of the universe", [itex]|\psi\rangle[/itex]. Then corresponding to each classical configuration is a projection operator on the Hilbert space of the universe: [itex]P_j[/itex]. If [itex]|\psi\rangle[/itex] is such that it corresponds to a universe with a definite classical configuration [itex]j[/itex], then [itex]P_j |\psi\rangle = |\psi\rangle[/itex].

Now, we can describe an interpretation where there is both deterministic Hamiltonian evolution and nondeterministic quantum jumps as follows:

We say that at any time [itex]t[/itex], there is an associated wave function [itex]|\psi(t)\rangle[/itex] and there is an associated classical configuration [itex]C_j(t)[/itex].

The wave function evolves smoothly with time: [itex]|\psi(t)\rangle = e^{-iHt} |\psi(0)\rangle[/itex], where [itex]H[/itex] is the Hamiltonian of the whole universe.

The classical configuration, though, evolves nondeterministically, according to this rule:

If the classical configuration initially is [itex]C_j[/itex], then the classical configuration at time [itex]t[/itex] will be [itex]C_k[/itex] with a probability given by:

[itex]P(j,k) = \langle \psi(0)| e^{i H t} P_k e^{-i H t} P_j|\psi(0)\rangle[/itex]

This interpretation doesn't have collapse, in that the classical configuration has no effect on the wave function [itex]|\psi\rangle[/itex].
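A toy numerical sketch of this two-layer rule, assuming a single qubit as the "universe", the two basis projectors as the classical configurations, and a definite initial configuration so that [itex]P_j |\psi(0)\rangle = |\psi(0)\rangle[/itex]:

[code]
import numpy as np

# "Classical configurations" = the two basis projectors
P = [np.diag([1.0, 0.0]).astype(complex), np.diag([0.0, 1.0]).astype(complex)]
Hham = np.array([[0, 1], [1, 0]], dtype=complex)   # Hamiltonian (hbar = 1)

psi0 = np.array([1, 0], dtype=complex)             # P[0] psi0 = psi0: config 0
t = 0.3
U = np.cos(t) * np.eye(2) - 1j * np.sin(t) * Hham  # exp(-i H t), exact since H^2 = 1
psi_t = U @ psi0                                   # wave function: always unitary

# Stochastic layer: configuration k at time t with probability ||P_k psi(t)||^2
probs = [np.vdot(Pk @ psi_t, Pk @ psi_t).real for Pk in P]
k = np.random.choice(2, p=probs)

print(probs)  # [cos^2 t, sin^2 t], summing to 1
print(k)      # sampled configuration; psi_t itself is never collapsed
[/code]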
 
  • #89
Mentz114 said:
It does not imply that the twin photon is vertically polarized. It means it will be measured as vertically polarized if it is measured in the same basis.

I think that's a difference without a difference. Photon B is vertically polarized in exactly the same sense as a photon that has passed through a vertical polarizing filter. All subsequent measurements performed on Photon B will be the same in both cases.
 
  • #90
Mentz114 said:
I wish you could make your arguments without using photons, which do not have a wave function the way a silver atom (say) does.

We can use spin-1/2 twin pairs.
 
  • #91
stevendaryl said:
I think that's a difference without a difference. Photon B is vertically polarized in exactly the same sense as a photon that has passed through a vertical polarizing filter. All subsequent measurements performed on Photon B will be the same in both cases.

The point is that in a twin-pair experiment, we perform a measurement on twin A and we learn something about twin B. The question then is: Whatever we now know about twin B, was it true before the measurement, and our measurement is just updating our knowledge, or did it become true through the process of measurement?

It seems that it is inconsistent (given Bell's theorem, if we reject superdeterminism and many-worlds) to claim that we are just updating our knowledge about B.
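For reference, the CHSH arithmetic behind that appeal to Bell's theorem fits in a few lines; this uses the textbook singlet correlation [itex]E(a,b) = -\cos(a-b)[/itex], nothing specific to this thread:

[code]
import numpy as np

# Any local model with pre-existing values must satisfy |S| <= 2 (CHSH).
E = lambda a, b: -np.cos(a - b)

a1, a2, b1, b2 = 0.0, np.pi/2, np.pi/4, 3*np.pi/4
S = E(a1, b1) - E(a1, b2) + E(a2, b1) + E(a2, b2)
print(abs(S))  # 2*sqrt(2) ~ 2.83 > 2: pre-existing values cannot account for it
[/code]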
 
  • #92
martinbn said:
Maybe I misunderstand what your claim is, but to me it seems that you are claiming that such a theory cannot exist ...
I never said that!

I don't deny the possibility of different interpretations or theories. I am talking about (some aspects of) Everett's Interpretation. Your ideas are not forbidden, but they are not compatible with Everett's.

martinbn said:
Given that there are attempts such as GRW, your claim seems in need of some justification.
There is no such claim.

And GRW tries to solve the problem via a modification of the mathematical formalism of quantum mechanics, whereas Everett doesn't.

martinbn said:
As to Everett, if Born's rule is not part of the postulates, it doesn't follow (although it may happen) that it can be derived from them; it could be logically independent. No one demands that Euclid's fifth postulate be derived from the rest. So couldn't that be the case with Born's rule in Everett?
The whole starting point was that one is looking for a consistent set of axioms describing aspects of "reality out there". Logically contradictory axioms would mean that this reality is logically inconsistent. You may add - as the zeroth axiom - that reality is logically consistent.

As far as I can see, Everett's interpretation is the only interpretation which is both realistic, with an "ontic interpretation" of the state vector, and mathematically equivalent to orthodox or "textbook" quantum mechanics - provided that the apparent collapse and Born's rule follow as theorems.
 
  • #93
tom.stoer said:
The whole starting point was that one is looking for a consistent set of axioms describing aspects of "reality out there". Logically contradictory axioms would mean that this reality is logically inconsistent.

I don't think that the Born rule contradicts smooth evolution. As I said in a reply to @atyy, you could have an ontology in which there are two different types of object:
  1. Classical configurations
  2. Wave function
The wave function determines the probability of various classical configurations via the Born rule, but the classical configuration has no effect on the wave function, which always evolves according to Schrödinger's equation.
 
  • #94
stevendaryl said:
I don't think that the Born rule contradicts smooth evolution. As I said in a reply to @atyy, you could have an ontology in which there are two different types of object:
  1. Classical configurations
  2. Wave function
The wave function determines the probability of various classical configurations via the Born rule, but the classical configuration has no effect on the wave function, which always evolves according to Schrödinger's equation.
That goes in the direction of de Broglie–Bohm theory. I haven't seen any attempt to incorporate spin, isospin, etc., and to introduce particle creation and annihilation that is really satisfactory. The ontology and the axioms are quite complicated compared to Everett's.
 
  • #95
stevendaryl said:
The point is that in a twin-pair experiment, we perform a measurement on twin A and we learn something about twin B. The question then is: Whatever we now know about twin B, was it true before the measurement, and our measurement is just updating our knowledge, or did it become true through the process of measurement?

It seems that it is inconsistent (given Bell's theorem, if we reject superdeterminism and many-worlds) to claim that we are just updating our knowledge about B.
In a projective measurement, work is done on the object and the state goes into an eigenstate of the projector. Before that we actually don't know (or care?) what the state was.
As I understand it, updating information is what we do, but projecting the state of the atom is a well-understood physical process independent of 'the state of our knowledge'.
 
  • #96
tom.stoer said:
That goes in the direction of de Broglie–Bohm theory. I haven't seen any attempt to incorporate spin, isospin, etc., and to introduce particle creation and annihilation that is really satisfactory. The ontology and the axioms are quite complicated compared to Everett's.

Well, I think that Bohmian mechanics is attempting something more ambitious, which is to make the classical configurations deterministic. What I'm suggesting is really nothing more than Copenhagen reinterpreted. It's nondeterministic. But the classical configurations are macroscopic, rather than microscopic. Spin, particles, etc., are not part of the classical configuration, but are part of the microscopic state, which evolves according to Schrödinger's equation, or quantum field theory.
 
  • #97
Mentz114 said:
In a projective measurement, work is done on the object and the state goes into an eigenstate of the projector. Before that we actually don't know (or care?) what the state was.

As I said, it's fine if you are just claiming to be able to predict the probabilities for measurement outcomes, and are silent about ontology. That's a perfectly respectable approach (the "shut up and calculate" interpretation). But if you claim that a measurement is a matter of updating our knowledge, that goes beyond "shut up and calculate", and I don't think it's consistent. If a measurement is only a matter of updating of knowledge, then the polarization of photon B cannot be changed by measurement of photon A, and if you find out that B is vertically polarized, then it must have been vertically polarized immediately before the measurement. So saying it is a matter of updating of knowledge is not consistent with saying "we don't know and don't care what the state was before the measurement". It implies something definite about the state before the measurement.
 
  • #98
stevendaryl said:
As I said, it's fine if you are just claiming to be able to predict the probabilities for measurement outcomes, and are silent about ontology. That's a perfectly respectable approach (the "shut up and calculate" interpretation). But if you claim that a measurement is a matter of updating our knowledge, that goes beyond "shut up and calculate", and I don't think it's consistent. If a measurement is only a matter of updating of knowledge, then the polarization of photon B cannot be changed by measurement of photon A, and if you find out that B is vertically polarized, then it must have been vertically polarized immediately before the measurement. So saying it is a matter of updating of knowledge is not consistent with saying "we don't know and don't care what the state was before the measurement". It implies something definite about the state before the measurement.
I don't follow. You just repeated your original argument in a more convoluted way. Anyway - the only thing I believe is the actual physics of the spin and the field. So your arguments are irrelevant to me seeing as they don't even mention the physics of the experiment.
 
  • #99
Mentz114 said:
I don't follow.

I think that's a choice on your part.

What does it MEAN to say that a measurement is "simply a matter of updating our knowledge"? It seems that people don't intend for that statement to have any implications. So why say it?
 
  • #100
stevendaryl said:
Well, what does it MEAN to say that a measurement is "simply a matter of updating our knowledge"? It seems that people don't intend for that statement to have any implications. So why say it?
I think they mean that
1) the WF does not exist in the way a field exists but is a calculational aid.
2) therefore it cannot collapse, and all that happens to it is that we change it.

The problem is the association of 'collapse' with measurement. Projective measurements don't necessarily 'collapse' anything, so in the sense you intend for 'measurement' they are not measurements.

In fact, far from collapsing the state, the result is a coherent superposition which can be recombined into the original state! So nothing has changed except the basis.

[I was wrong about the work, I think]
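A quick numerical check of that recombination claim for a spin-1/2 state (the vectors are just the standard ##\sigma_z## and ##\sigma_x## eigenstates):

[code]
import numpy as np

zp = np.array([1, 0], dtype=complex)                # |sigma_z = +1/2>
xp = np.array([1, 1], dtype=complex) / np.sqrt(2)   # |sigma_x = +1/2>
xm = np.array([1, -1], dtype=complex) / np.sqrt(2)  # |sigma_x = -1/2>

# "Split" into the two sigma_x branches (amplitudes, not a collapse) ...
cp, cm = np.vdot(xp, zp), np.vdot(xm, zp)
# ... and coherently recombine the branches
recombined = cp * xp + cm * xm

print(np.allclose(recombined, zp))  # True: only the basis changed
[/code]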
 
  • #101
Lord Jestocost said:
With all due respect, in the end all this says nothing about what “quantum theory” tries to tell us. Using some mathematics doesn't mean one "understands" the semantics. As long as you are “bogged down” in Einstein’s world view, there is no way out.
There is no other way to talk about quantum physics than quantum theory, and the only adequate language for it is mathematics, as for all of physics.
 
  • #102
stevendaryl said:
What does it MEAN to say that a measurement is "simply a matter of updating our knowledge"? It seems that people don't intend for that statement to have any implications. So why say it?

What happens in discussions about QM is that there is a core that everyone agrees with. The core is the rules for calculating the answers to questions of the form:
  • If I set up a system in such-and-such a way, and later perform such-and-such a measurement, then what is the probability that I get such-and-such a result?
That's what I consider to be the true minimal interpretation, but it's actually the "shut up and calculate" interpretation. It leaves completely unanswered such questions as:
  1. Is there something special about measurements, compared with other types of interactions?
  2. Does a measurement of one particle of a twin pair affect its twin?
  3. Is measurement revealing a pre-existing property, or does the property come into existence in the process of measurement?
  4. Do parts of an entangled system affect each other nonlocally?
  5. Do particles have properties even when they aren't being measured?
  6. Etc.
Those questions aren't answered by the shut up and calculate interpretation. Maybe some people think that they don't need to be answered, which is a perfectly respectable attitude to take. But if you claim to have an answer to one of the questions I have listed, then you are going beyond the shut up and calculate interpretation.
 
  • #103
stevendaryl said:
Yes, the mathematics is simple enough, but it is not consistent with the words you use in describing the minimal interpretation. You have on the one hand, the mathematical description of what's going on, and on the other hand, you have your description of measurements as "selection". The words contradict the mathematics.
This I don't understand. If I use an idealized polarizer, it's a paradigmatic example of a von Neumann filter measurement. What is supposed to contradict the mathematics here?

Again: if a system is prepared in a pure state, described by the statistical operator ##\hat{\rho}=|\psi \rangle \langle \psi|##, then an observable described by a self-adjoint operator has a determined value if and only if ##|\psi \rangle## is an eigenvector of this self-adjoint operator. If it is not an eigenstate of the operator, the corresponding observable does not have a determined value, and the probabilities for measuring the possible values of this observable are given by Born's rule.

This means that if the ##|a,\beta \rangle## form a complete orthonormal set of eigenvectors of the self-adjoint operator, with ##\beta## labeling the degenerate eigenvectors belonging to each eigenvalue ##a##, then the probability to find the possible value ##a## when measuring the corresponding observable, given the system is prepared in the above pure state ##\hat{\rho}##, is
$$P(a|\hat{\rho})=\sum_{\beta} \langle a,\beta|\hat{\rho}|a,\beta \rangle=\sum_{\beta} |\langle a,\beta|\psi \rangle|^2.$$
This is the standard formulation. Where is, in your opinion, a contradiction?
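As a numerical cross-check of this formula in the degenerate case, a small numpy sketch (the operator and the state are arbitrary illustrative choices):

[code]
import numpy as np

# Observable on C^3 with a two-fold degenerate eigenvalue a = +1
A = np.diag([1.0, 1.0, -1.0]).astype(complex)

psi = np.array([1, 1, 1], dtype=complex) / np.sqrt(3)
rho = np.outer(psi, psi.conj())       # rho = |psi><psi|

evals, evecs = np.linalg.eigh(A)
for a in np.unique(np.round(evals, 12)):
    idx = np.where(np.isclose(evals, a))[0]  # degeneracy labels beta
    P_a = sum(np.vdot(evecs[:, i], rho @ evecs[:, i]).real for i in idx)
    print(f"P(a = {a:+.0f}) = {P_a:.4f}")    # P(-1) = 1/3, P(+1) = 2/3
[/code]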
 
  • #104
Mentz114 said:
I think they mean that
1) the WF does not exist in the way a field exists but is a calculational aid.

That's going beyond the minimal interpretation. The minimal interpretation doesn't actually say what the nature of the wave function is. To state that it's not real is to make an ontological claim, and I think that it's difficult to make sense of that claim. You can ignore the question about the nature of the wave function, and just say "I don't have a clue". But if you're going to venture into making ontological claims, it seems that you need to be more precise about what you're claiming.
 
  • #105
The wave function is a representation of a vector in Hilbert space. The corresponding ray (or projector) represents a pure state. It's a mathematical description of a preparation procedure. In the lab, I don't find Hilbert-space vectors but, e.g., an accelerator (to prepare particles I want to collide) and a bunch of detectors to measure the outcome of collisions between particles, which are other particles, which I sort with respect to species ("particle ID"), energy and momentum and, sometimes, polarization/spin.
 