"Filming" a quantum measurement

In summary, the thread discusses the process of quantum measurement and whether its outcomes arise smoothly. The participants also discuss the role of the Born rule postulate and the Schrödinger equation in understanding this process. The paper referenced in the thread presents an experiment that maintains quantum coherence; the step it labels a "measurement" is a unitary interaction rather than a traditional measurement producing an irreversible record. The participants also touch on open quantum systems and the collapse of the wave function. The paper itself, however, does not specifically address the smoothness of the measurement process.
  • #1
nomadreid
Gold Member
TL;DR Summary
In an article (link in main text), one "films" a strontium ion in an electric field during the microsecond of its wave function "collapse"; a GIF shows the superposition as it changes. But I thought that such a distribution could only be measured by a lot of atoms/trials, not from a single atom. Predicted, perhaps, if the original state is measured, but that does not seem to be the case here.
The link is https://www.su.se/english/research/scientists-film-a-quantum-measurement-1.487234
How do they arrive at this distribution over time? It does not appear to be saying that this is the prediction (if it were, then why the "filming"?), and measurement of a single atom's superposition is supposed to be impossible, so ...? Obviously there is a basic point I am missing.
 
  • Like
Likes atyy
  • #2
They repeat the experimental process (preparing the system in state ##|\psi_i \rangle \langle \psi_i|## and then measuring ##|\psi_j \rangle \langle \psi_j|##) 1000 times for each setup ##(i,j)##. The paper the website is referring to is much easier to understand than the webpage ;-)).
 
  • Like
Likes nomadreid and atyy
  • #3
Thank you, vanhees71. That makes more sense than what I thought it said.
 
  • Like
Likes vanhees71
  • #4
I'm often surprised by how often I don't understand a popular-science news article but am then able to understand at least the concepts from the original paper, even when I'm not an expert in the topic presented. I think popular-science writing is among the most difficult tasks for a researcher in the field, but it's even more difficult for a science journalist, who is usually not an expert in what he is trying to describe.
 
  • Like
Likes nomadreid
  • #5
I have looked at the original paper, but it is largely beyond me. However, would I be right in thinking that, since the measurement takes a finite time, there is at least the potential for it to have dynamics beyond those modeled by a projection operator?

Regards Andrew
 
  • #6
Sure, the state never jumps as a function of time. The time evolution is given by the Schrödinger equation for the wave function (or the von Neumann equation for the statistical operator), which is a differential equation in time, leading to a smooth time dependence of its solutions. There are no quantum jumps in modern quantum mechanics, only more or less rapid smooth transitions from one state to another.
 
  • Skeptical
  • Like
Likes Demystifier and andrew s 1905
  • #7
vanhees71 said:
Sure, the state never jumps as a function of time. The time evolution is given by the Schrödinger equation for the wave function (or the von Neumann equation for the statistical operator), which is a differential equation in time, leading to a smooth time dependence of its solutions. There are no quantum jumps in modern quantum mechanics, only more or less rapid smooth transitions from one state to another.

Thank you. So would a reasonable image be that the measuring system when included in say the Schrodinger equation / Hamiltonian adds terms that lead to the transition from one state to the other and entangles them in the process?

Regards Andrew
 
  • Like
Likes vanhees71
  • #8
Yes, that's the right picture!
 
  • Like
Likes andrew s 1905
  • #9
vanhees71 said:
Sure, the state never jumps as a function of time. The time evolution is given by the Schrödinger equation for the wave function (or the von Neumann equation for the statistical operator), which is a differential equation in time, leading to a smooth time dependence of its solutions. There are no quantum jumps in modern quantum mechanics, only more or less rapid smooth transitions from one state to another.
The Schrödinger evolution is smooth and deterministic. But the evolution also has a random non-deterministic component. Is the non-deterministic component of evolution also smooth? If yes, how do we know that?
 
  • #10
Where is a random non-deterministic component in the Schrödinger equation,
$$\mathrm{i} \partial_t \psi(t,\vec{x})=-\frac{1}{2m} \Delta \psi(t,\vec{x}) + V(\vec{x}) \psi(t,\vec{x})?$$
I'm not talking about open quantum systems in some Schrödinger-Langevin approach!
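To make the smoothness explicit, here is a minimal numerical sketch (my own illustration with a hypothetical two-level Hamiltonian, not tied to the experiment): exact unitary evolution, where the norm is conserved and the state changes by an arbitrarily small amount over a small time step.

```python
import numpy as np

def evolve(psi0, H, t_grid):
    """Exact Schroedinger evolution psi(t) = exp(-i H t) psi0 (hbar = 1),
    via the eigendecomposition of the (time-independent) Hamiltonian H."""
    evals, evecs = np.linalg.eigh(H)
    coeffs = evecs.conj().T @ psi0
    return np.array([evecs @ (np.exp(-1j * evals * t) * coeffs) for t in t_grid])

# Hypothetical two-level Hamiltonian: H = (Delta * sigma_z + Omega * sigma_x) / 2
Delta, Omega = 1.0, 0.5
H = 0.5 * np.array([[Delta, Omega], [Omega, -Delta]])

t = np.linspace(0.0, 10.0, 2001)
psi = evolve(np.array([1.0, 0.0], dtype=complex), H, t)

norms = np.linalg.norm(psi, axis=1)                    # unitarity: all equal 1
steps = np.linalg.norm(np.diff(psi, axis=0), axis=1)   # state change per time step
```

Here `steps` shrinks with the grid spacing, which is just the numerical face of the smooth time dependence: no finite jump occurs at any instant.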
 
  • #11
vanhees71 said:
Where is a random non-deterministic component in the Schrödinger equation
In the Born rule postulate, i.e., in the probabilistic interpretation of the solution of the Schrödinger equation. In measurements, observables acquire definite values. This acquisition of a definite value is a non-deterministic process. My question is: is this process smooth or not?
 
  • Like
Likes nomadreid
  • #12
The state is given by the wave function, and this is evolving smoothly with time. What's "random" are the outcomes of measurements, not the states!
 
  • Like
Likes nomadreid
  • #13
vanhees71 said:
What's "random" are the outcomes of measurements
Right! And my question is: Do measurement outcomes arise smoothly or not?
 
  • #14
How do you define "measurement outcomes arise smoothly"? You have to let the system you want to measure interact with some measurement device, where some macroscopic observable (i.e., some average) delivers the "outcome". There may be some fluctuations around this average value, but they should be negligible on the macroscopic scale of resolution to provide a "unique" measurement outcome. This macroscopic "pointer observable" also doesn't jump.
 
  • #15
Demystifier said:
In measurements, observables acquire definite values.

It does not look like the experiment under discussion in this thread does this. The process described in the paper maintains quantum coherence, so it is not a "measurement" in the usual sense, and the term "measurement" in the title is really a misnomer.
 
  • Like
Likes nomadreid and vanhees71
  • #16
vanhees71 said:
I'm not talking about open quantum systems

A quantum system that is not open does not interact with anything, which means it can't be measured, since measurement requires interaction.

vanhees71 said:
What's "random" are the outcome of measurements, not the states!

Once you know a measurement outcome, you have to discontinuously change the state. (This process is usually referred to by the "c" word.)
 
  • Like
Likes vanhees71
  • #17
Now we get into a discussion of interpretation again. Mathematically, nothing jumps. The effective equations for open quantum systems (quantum master equations) are also differential equations (e.g., the Kadanoff-Baym or Lindblad equations). The "collapse" may look instantaneous, but in reality it's just a rapid smooth change.
 
  • #18
vanhees71 said:
Now we get into a discussion of interpretation again.

No, what I'm referring to is not what story any interpretation tells about the mathematical process referred to by the "c" word. I'm referring to the mathematical process itself, which is part of the 7 Basic Rules (rule 7, the projection postulate).

vanhees71 said:
the effective equations for open quantum systems (quantum master equations) are differential equations (like, e.g., Kadanoff-Baym or Lindblad equations). The "collapse" may look instantaneous, but in reality it's just a rapid smooth change.

Can you point out where in the paper on the experiment under discussion this "smooth process" is described?
 
  • #19
Eq. (5) and (6) of the Supplement. It's a pretty simple, exactly solvable differential equation (in the rotating-wave approximation). Nothing jumps!
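Not the paper's actual Eqs. (5) and (6), but a generic resonant Rabi oscillation in the rotating-wave approximation makes the same point: the population of the driven state varies continuously with time, with no jump anywhere.

```python
import numpy as np

def p0(t, omega):
    """Ground-state population under resonant Rabi driving in the RWA:
    P0(t) = cos^2(omega * t / 2), a smooth function of time."""
    return np.cos(0.5 * omega * t) ** 2

t = np.linspace(0.0, 2.0 * np.pi, 1001)   # one full Rabi period for omega = 1
pop = p0(t, omega=1.0)                     # sweeps continuously between 1 and 0
```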
 
  • #20
I am trying to follow the discussion. Given
PeterDonis said:
The process described in the paper maintains quantum coherence, so it is not a "measurement" in the usual sense, and the term "measurement" in the title is really a misnomer.

and the fact that the paper defines an "ideal quantum measurement" in the synopsis, what is the difference? (I counted the term "measurement" 55 times in the paper.)

From what has been said above, is a classical QM measurement one where the measuring device is treated classically (modeled by a projection operator), and an "ideal QM" measurement one where the measurement device is also modeled quantum mechanically, or am I totally mistaken?

Regards Andrew
 
  • Like
Likes nomadreid
  • #21
vanhees71 said:
Eq. (5) and (6) of the Supplement. It's a pretty simple exactly solvable differential equation (known as the rotating-wave approximation). Nothing jumps!

Nor is anything measured, in the usual sense of "measurement" where an irreversible record is made, or, equivalently, where decoherence takes place. All that is being described by those equations is a unitary interaction between a laser and a trapped ion. There is no decoherence and no irreversible record being made. That is why I said that the term "measurement" in this context is a misnomer.

The actual "measurement" in this case is the detection of fluorescence photons. The state described by the master equations you refer to only predicts the probability for such detection, just as with any quantum state. The actual detection process itself is not even described in the paper or the supplemental material; the authors, I take it, assume, as is normal in such papers, that everyone already understands how the detection of fluorescence photons works, that it is an irreversible, decoherent process, and that the quantum states they describe in their paper only predict the probabilities for such detections.
 
  • Like
Likes nomadreid, meekerdb and vanhees71
  • #22
andrew s 1905 said:
From what has been said above, is a classical QM measurement one where the measuring device is treated classically (modeled by a projection operator), and an "ideal QM" measurement one where the measurement device is also modeled quantum mechanically

No. See my response to @vanhees71 in post #21.
 
  • #23
nomadreid said:
one "films" a strontium ion in an electric field during the microsecond of its wave function "collapse"

This is a misleading description of what the experiment described in the paper is actually testing. A much better explanation of the point of the experiment is given in the Introduction to the paper:

What is an ideal measurement in quantum mechanics? What are its inner workings? How does the quantum state change because of it? These have been central questions in the development of quantum mechanics [1]. Notably, today's accepted answer to the latter question is conceptually different from the one given in the first formalization of quantum mechanics by von Neumann [2]. Then, it was thought that an ideal measurement on a quantum system would inevitably destroy all quantum superpositions. Later, Lüders pointed out [3] that certain superpositions should survive, so that a sequence of ideal measurements would preserve quantum coherence. Lüders's version is the one accepted today.

In other words, what this experiment is demonstrating is simply that a measurement on a quantum system only destroys superpositions (a better term would be "destroys quantum coherence") for the degrees of freedom involved in the measurement. In this case, only one of the three possible states of the trapped ion is involved in the measurement (i.e., has a nonzero interaction with whatever is being used to make the measurement), so the measurement only destroys quantum coherence that involves that one state--quantum coherence between the other two states is preserved. Or, to put it another way, since only one of the three states is "being measured", only quantum coherence involving that state is lost.

The "filming" part comes in because the measurement of the one state that is "being measured" (the state called ##|0\rangle## in the paper) only actually counts as a "measurement" if a fluorescence photon is detected; if no fluorescence photon is detected, then no "measurement" has taken place. Since the probability of detection of a fluorescence photon varies according to the interaction strength that is used, one can run the experiment multiple times with different interaction strengths to take what amount to "snapshots" of the process of loss of quantum coherence involving the state ##|0\rangle##. Very high interaction strength corresponds to the limit in which the probability of detection of a fluorescence photon approaches 1, and quantum coherence involving the state ##|0\rangle## is completely lost; very low interaction strength is the limit in which the probability of detection approaches 0, and quantum coherence is not affected at all. Intermediate interaction strengths correspond to partial loss of quantum coherence.

Note, however, that this "partial" loss of quantum coherence is not a property of a single run of the experiment; it is only a property of an ensemble of runs all made with the same (intermediate) interaction strength. In any single run, either a fluorescence photon is detected or it is not. If it is, quantum coherence is lost; if it is not, quantum coherence is not lost. So there is no "partial" loss of coherence in any single run of the experiment, and there is also no "partial measurement" in any single run; no single run involves the measurement process being "caught" part way through. In any single run, either the measurement happens all the way, or it doesn't happen at all.
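The all-or-nothing character of single runs versus the interpolating ensemble average can be sketched in a few lines (a toy model with made-up numbers, not the paper's actual data): each run either destroys the coherence involving ##|0\rangle## entirely or leaves it intact, and only the average over many runs takes intermediate values.

```python
import numpy as np

rng = np.random.default_rng(0)

def ensemble_coherence(p_detect, n_runs, c0=1.0):
    """Each run is all-or-nothing: a detected photon destroys the coherence
    involving |0> (value 0), an undetected one leaves it intact (value c0).
    Only the ensemble average interpolates between c0 and 0."""
    detected = rng.random(n_runs) < p_detect
    per_run = np.where(detected, 0.0, c0)
    return per_run, per_run.mean()

# Made-up detection probability standing in for an intermediate interaction strength
per_run, avg = ensemble_coherence(p_detect=0.4, n_runs=100_000)
```

No individual entry of `per_run` is "partial"; the apparent partial loss of coherence lives entirely in `avg`.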
 
  • Like
Likes nomadreid and meekerdb
  • #24
Of course, the measurement process itself, in this case the detection of the fluorescence photons, is taken for granted. What I don't understand is why you don't consider the detection of these photons a measurement. If I understand the paper correctly, Fig. 2 shows actual measurement results, which demonstrate that in case (d) you have indeed made a FAPP ideal "Lüders measurement" as described in Fig. 1 (b). I don't see why you think it's misleading to call it a measurement (are you implying the authors are cheating?).
 
  • #25
andrew s 1905 said:
So would a reasonable image be that the measuring system when included in say the Schrodinger equation / Hamiltonian adds terms that lead to the transition from one state to the other and entangles them in the process?

The paper's use of the term "measurement" in this context is very misleading. In Fig. 1, for example, the "measurement process" is shown as a brief pulse of the 422 nm laser that couples the ##|0\rangle## state of the atom to the photon states in the cavity, giving a nonzero probability of emission of a photon by the atom. But then, later on, we have "fluorescence detection", and in the paper's Conclusion, this step is described as "a measurement of the photon environment":

The system that is subject to the measurement is brought into contact with a macroscopic pointer state by facilitating a strong interaction between state ##|0\rangle## of the system and the photon environment. A measurement of the photon environment then reveals the state of the system.

So we measure "the system" by letting it interact with "the photon environment", and then...measuring the photon environment. In other words, the label "measurement process" for the interaction between the system (the atom) and the photon environment is a misnomer; that's not where the actual "measurement" occurs. The actual "measurement" is the "measurement of the photon environment", which, as I noted in an earlier post, is not described mathematically in the paper at all.
 
  • Like
Likes nomadreid
  • #26
vanhees71 said:
Of course, the measurement process itself, in this case, the detection of the fluorescence photons, is taken for granted. What I don't understand is, why you don't consider the detection of these photons not a measurement.

I never said the detection of the fluorescence photons is not a measurement. I said the exact opposite: that is the measurement. But the 422 nm laser pulse that is labeled "measurement process" in Fig. 1 of the paper is not the detection of the fluorescence photons (the figure makes that obvious, as I described in post #25 just now). So the 422 nm laser pulse is not a measurement. It's just a unitary interaction that enables a (possible--whether or not a fluorescence photon is actually detected is probabilistic) measurement later on in the experiment.

vanhees71 said:
If I understand the paper right Fig. 2 are actually real measurement results,

No, they're not, they're "reconstructed from experimental data", just like it says in the caption of the figure. The "experimental data" is just the record of fluorescence photon detections or non-detections for each run of the experiment. That is the "measurement results". What is shown in Fig. 2 is a model-dependent calculation, just like the figure's caption says. A model-dependent calculation is not the same as "real measurement results".

vanhees71 said:
I don't see, why you think it's misleading to call it a measurement

See my post #25 and my comments above about the 422 nm laser pulse.

vanhees71 said:
(are you implying the authors are cheating?).

No. Such misleading use of the term "measurement" is, unfortunately, common in the literature. But that doesn't make it any less misleading.
 
  • Like
Likes nomadreid
  • #27
vanhees71 said:
How do you define "measurement outcomes arise smoothly"? ... This macroscopic "pointer observable" also doesn't jump.
So the macroscopic pointer evolves with time smoothly but non-deterministically, would you agree?
 
  • Like
Likes vanhees71
  • #28
PeterDonis said:
So we measure "the system" by letting it interact with "the photon environment", and then...measuring the photon environment. In other words, the label "measurement process" for the interaction between the system (the atom) and the photon environment is a misnomer; that's not where the actual "measurement" occurs. The actual "measurement" is the "measurement of the photon environment", which, as I noted in an earlier post, is not described mathematically in the paper at all.
What's described is a FAPP ideal filter measurement. The measurement is through coupling of the measured system to the "photon environment" (a bit strange an expression for coupling to the electromagnetic quantum field). For strong enough coupling you realize an (almost) ideal filter measurement. The subsequent verification that you realized this filter through interaction with the "photon environment" is just a standard measurement, which is well established. It's not the aim of the paper to also describe this measurement mathematically. I'm pretty sure that this is done in the standard literature of the AMO community.
 
  • #29
PeterDonis said:
I never said the detection of the fluorescence photons is not a measurement. I said the exact opposite: that is the measurement. But the 422 nm laser pulse that is labeled "measurement process" in Fig. 1 of the paper is not the detection of the fluorescence photons (the figure makes that obvious, as I described in post #25 just now). So the 422 nm laser pulse is not a measurement. It's just a unitary interaction that enables a (possible--whether or not a fluoresecence photon is actually detected is probabilistic) measurement later on in the experiment.
The 422 nm laser pulse is a measurement. In the case of strong enough intensities it's even an ideal filter measurement, as is demonstrated in the supplement. For not so high intensities it's not an ideal filter measurement but one with some uncertainty, which is quantified by the corresponding transition probabilities. Formally, it's indeed a "measurement", where you don't take note of the result first. That's done with the detection of the fluorescence photons.
PeterDonis said:
No, they're not, they're "reconstructed from experimental data", just like it says in the caption of the figure. The "experimental data" is just the record of fluorescence photon detections or non-detections for each run of the experiment. That is the "measurement results". What is shown in Fig. 2 is a model-dependent calculation, just like the figure's caption says. A model-dependent calculation is not the same as "real measurement results".
If you want to verify that you really have prepared a certain quantum state you have to measure something (here the fluorescence photons) and reconstruct the state from the (statistical) data. What else should a "state-determination measurement" be?
 
  • #30
Demystifier said:
So the macroscopic pointer evolves with time smoothly but non-deterministically, would you agree?
If you mean by "macroscopic pointer" some macroscopic observable (like a pointer position), I agree, because it is indeed an average (over a space-time interval that is macroscopically small but microscopically large), and there are (quantum as well as thermal) fluctuations around the mean value.
 
  • #31
vanhees71 said:
If you mean by "macroscopic pointer" some macroscopic observable (like a pointer position), I agree, because it is indeed an average (over a space-time interval that is macroscopically small but microscopically large), and there are (quantum as well as thermal) fluctuations around the mean value.
But the average evolves deterministically, due to the Ehrenfest theorem. And thermal fluctuations can in principle be eliminated, by performing the experiment at zero temperature. So basically, the randomness of measurement outcomes arises from quantum fluctuations, I think you would agree with that. My problem is this: How do we know that the effect of these quantum fluctuations is smooth?
 
  • Like
Likes vanhees71
  • #32
I agree. What's smooth are the probability distributions, whose time evolution on the fundamental level is defined by differential equations (the von Neumann equation for the statistical operator).

Of course, if you go to effective descriptions of fluctuations à la Langevin, you describe them by stochastic differential equations.
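For what it's worth, the smoothness of Lindblad-type evolution is easy to see numerically. Here is a minimal sketch (my own toy example, not the equations from the paper) of pure dephasing of a qubit, ##\dot{\rho} = (\gamma/2)(\sigma_z \rho \sigma_z - \rho)##: the off-diagonal coherence decays smoothly and exponentially, with no jump at any time.

```python
import numpy as np

sz = np.array([[1.0, 0.0], [0.0, -1.0]])   # Pauli sigma_z

def dephase(rho0, gamma, t_grid):
    """Euler-integrate the pure-dephasing Lindblad equation
    d(rho)/dt = (gamma/2) * (sz @ rho @ sz - rho)."""
    rho = rho0.astype(complex)
    out = [rho.copy()]
    for dt in np.diff(t_grid):
        rho = rho + dt * (gamma / 2.0) * (sz @ rho @ sz - rho)
        out.append(rho.copy())
    return np.array(out)

rho0 = 0.5 * np.ones((2, 2))          # the pure state (|0> + |1>)/sqrt(2)
t = np.linspace(0.0, 5.0, 5001)
rhos = dephase(rho0, 1.0, t)

coherence = rhos[:, 0, 1].real        # decays smoothly, ~exp(-gamma * t)
```

The populations stay constant while `coherence` shrinks monotonically: the "collapse" of the off-diagonal terms is a rapid but perfectly smooth change.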
 
  • #33
vanhees71 said:
What's smooth are the probability distributions, whose time evolution on the fundamental level is defined by differential equations (von Neumann equation for the stat. op.).
Yes, but this evolution is also deterministic. I am interested in the random non-deterministic evolution.

vanhees71 said:
Of course, if you go to effective descriptions of fluctuations a la Langevin you describe them by stochastic differential equations.
Fine, this effective description suggests that the random time evolution is smooth. But another effective description based on quantum jumps suggests that it isn't smooth. So it seems that effective descriptions alone cannot decide. Is there a more fundamental argument for the thesis that the random time evolution is smooth?
 
  • #34
vanhees71 said:
The state is given by the wave function, and this is evolving smoothly with time. What's "random" are the outcome of measurements, not the states!
Do those outcomes arise smoothly or instantaneously? How would they arise smoothly if measuring the position of a particle means that the particle is found at a classically impossible location, e.g., on the other side of a barrier (quantum tunneling)?
Does the term "smoothly", as you are using it, include instantaneous jumps?
 
  • #35
Demystifier said:
But the average evolves deterministically, due to the Ehrenfest theorem. And thermal fluctuations can in principle be eliminated, by performing the experiment at zero temperature. So basically, the randomness of measurement outcomes arises from quantum fluctuations, I think you would agree with that. My problem is this: How do we know that the effect of these quantum fluctuations is smooth?
It seems to me that, by definition, a macroscopic pointer obeys classical physics and hence would have to respond smoothly to any fluctuations.

Regards Andrew
 
