Quantum Physics via Quantum Tomography: A New Approach to Quantum Mechanics
This Insight article presents the main features of a conceptual foundation of quantum physics with the same characteristic features as classical physics – except that the density operator takes the place of the classical phase space coordinates position and momentum. Since everything follows from the well-established techniques of quantum tomography (the art and science of determining the state of a quantum system from measurements), the new approach may in time lead to a consensus on the foundations of quantum mechanics. Full details can be found in my paper:
- A. Neumaier, Quantum mechanics via quantum tomography, Manuscript (2022). arXiv:2110.05294v5
This paper gives for the first time a formally precise definition of quantum measurement that
- is applicable without idealization to complex, realistic experiments;
- allows one to derive the standard quantum mechanical machinery from a single, well-motivated postulate;
- leads to objective (i.e., observer-independent, operational, and reproducible) quantum state assignments to all sufficiently stationary quantum systems.
The new approach shows that the amount of objectivity in quantum physics is no less than that in classical physics.
A modified version of the above manuscript appeared as Part II of my new book
- A. Neumaier and D. Westra, Algebraic Quantum Physics, Vol. 1: Quantum mechanics via Lie algebras, de Gruyter, Berlin 2024.
The following is an extensive overview of the most important developments in this new approach.
$$
\def\<{\langle}
\def\>{\rangle}
\def\tr{{\mathop{\rm tr}\,}}
\def\E{{\bf E}}
$$
Quantum states
The (Hermitian and positive semidefinite) density operator ##\rho## is taken to be the formal counterpart of the state of an arbitrary quantum source. This notion generalizes the polarization properties of light: In the case of the polarization of a source of light, the density operator represents a qubit and is given by a ##2\times 2## matrix whose trace is the intensity of the light beam. If expressed as a linear combination of Pauli matrices, the coefficients define the so-called Stokes vector. Its properties (encoded in the mathematical properties of the density operator) were first described by George Stokes (best known from the Navier-Stokes equations for fluid mechanics) who gave in 1852 (well before the birth of Maxwell’s electrodynamics and long before quantum theory) a complete description of the polarization phenomenon, reviewed in my Insight article ‘A Classical View of the Qubit’. For a stationary source, the density operator is independent of time.
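The qubit case can be sketched numerically. A minimal sketch, assuming one common convention for the ordering of the Stokes components and the factor 1/2 (conventions vary):

```python
# Qubit density matrix from a Stokes vector (S0, S1, S2, S3).
# Assumed convention: rho = (S0*I + S1*sx + S2*sy + S3*sz)/2, tr(rho) = S0.

I2 = [[1, 0], [0, 1]]
sx = [[0, 1], [1, 0]]
sy = [[0, -1j], [1j, 0]]
sz = [[1, 0], [0, -1]]

def trace(A):
    return A[0][0] + A[1][1]

def rho_from_stokes(S0, S1, S2, S3):
    return [[(S0*I2[i][j] + S1*sx[i][j] + S2*sy[i][j] + S3*sz[i][j]) / 2
             for j in range(2)] for i in range(2)]

rho = rho_from_stokes(1.0, 0.3, 0.2, 0.1)   # partially polarized light
assert abs(trace(rho) - 1.0) < 1e-12        # trace = intensity
assert abs(rho[0][1] - rho[1][0].conjugate()) < 1e-12   # Hermitian
# positive semidefinite for a 2x2 matrix: nonnegative diagonal and determinant
det = (rho[0][0]*rho[1][1] - rho[0][1]*rho[1][0]).real
assert det >= 0 and rho[0][0].real >= 0
```

The positivity check encodes the classical constraint that the polarization part of the Stokes vector is not longer than the intensity ##S_0##.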
The detector response principle
A quantum measurement device is characterized by a finite collection of detection elements, labeled by ##k##, that respond statistically to the quantum source according to the following detector response principle (DRP):
- A detection element ##k## responds to an incident stationary source with density operator ##\rho## at a nonnegative mean rate ##p_k## depending linearly on ##\rho##. The mean rates sum to the intensity of the source. Each ##p_k## is positive for at least one density operator ##\rho##.
If the density operator is normalized to intensity one (which we shall do in this exposition) the response rates form a discrete probability measure, a collection of nonnegative numbers ##p_k## (the response probabilities) that sum to 1.
The DRP, abstracted from the polarization properties of light, relates theory to measurement. By its formulation it allows one to discuss quantum measurements without the need for quantum mechanical models for the measurement process itself. The latter would involve the detailed dynamics of the microscopic degrees of freedom of the measurement device – clearly out of the scope of a conceptual foundation on which to erect the edifice of quantum physics.
The main consequence of the DRP is the detector response theorem. It asserts that for every measurement device, there are unique operators ##P_k## which determine the rates of response to every source with density operator ##\rho## according to the formula
$$
p_k=\langle P_k\rangle:=\tr\rho P_k.
$$
The ##P_k## form a discrete quantum measure; i.e., they are Hermitian, positive semidefinite and sum to the identity operator ##1##. This is the natural quantum generalization of a discrete probability measure. (In more abstract terms, a discrete quantum measure is a simple instance of a so-called POVM, but the latter notion is not needed for understanding the main message of the paper.)
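A minimal numerical sketch of a discrete quantum measure; the three matrices are made-up illustration values, chosen only to satisfy the stated properties:

```python
# A toy discrete quantum measure for a qubit: three detection elements k
# with Hermitian, positive semidefinite P_k summing to the identity.

def mul(A, B):  # 2x2 matrix product
    return [[sum(A[i][k]*B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def trace(A):
    return A[0][0] + A[1][1]

P = [[[0.5, 0.2], [0.2, 0.1]],
     [[0.3, -0.2], [-0.2, 0.5]],
     [[0.2, 0.0], [0.0, 0.4]]]

# the P_k sum to the identity operator
for i in range(2):
    for j in range(2):
        assert abs(sum(Pk[i][j] for Pk in P) - (1.0 if i == j else 0.0)) < 1e-12

# each P_k is positive semidefinite (2x2: nonnegative diagonal and determinant)
for Pk in P:
    assert Pk[0][0] >= 0 and Pk[1][1] >= 0
    assert Pk[0][0]*Pk[1][1] - Pk[0][1]*Pk[1][0] >= 0

# detector response: p_k = tr(rho P_k) is a discrete probability measure
rho = [[0.7, 0.1], [0.1, 0.3]]   # a normalized density matrix
p = [trace(mul(rho, Pk)) for Pk in P]
assert all(pk >= 0 for pk in p) and abs(sum(p) - 1.0) < 1e-12
```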
Statistical expectations and quantum expectations
Thus a quantum measurement device is characterized formally by means of a discrete quantum measure. To go from detection events to measured numbers one needs to provide a scale that assigns to each detection element ##k## a real or complex number (or vector) ##a_k##. We call the combination of a measurement device with a scale a quantum detector. The statistical responses of a quantum detector define the statistical expectation
$$
\E(f(a_k)):=\sum_{k\in K} p_kf(a_k)
$$
of any function ##f(a_k)## of the scale values. As always in statistics, this statistical expectation is operationally approximated by finite sample means of ##f(a)##, where ##a## ranges over a sequence of actually measured values. However, the exact statistical expectation is an abstraction of this; it works with a nonoperational probabilistic limit of infinitely many measured values so that the replacement of relative sample frequencies by probabilities is justified. If we introduce the quantum expectation
$$
\langle A\rangle:=\tr\rho A
$$
of an operator ##A## and say that the detector measures the quantity
$$
A:=\sum_{k\in K} a_kP_k,
$$
it is easy to deduce from the main result the following version of Born’s rule (BR):
- The statistical expectation of the measurement results equals the quantum expectation of the measured quantity.
- The quantum expectations of the quantum measure constitute the probability measure characterizing the response.
This version of Born’s rule applies without idealization to results of arbitrary quantum measurements.
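This general form of Born's rule is easy to check numerically. A sketch with made-up illustration values (quantum measure, scale, and state are not from the paper):

```python
# Born's rule in the general form: the statistical expectation of the
# measurement results a_k equals the quantum expectation <A> of the
# measured quantity A = sum_k a_k P_k.
import random

def mul(A, B):  # 2x2 matrix product
    return [[sum(A[i][k]*B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def trace(A):
    return A[0][0] + A[1][1]

P = [[[0.5, 0.2], [0.2, 0.1]],
     [[0.3, -0.2], [-0.2, 0.5]],
     [[0.2, 0.0], [0.0, 0.4]]]      # discrete quantum measure (sums to 1)
a = [-1.0, 0.0, 2.0]                # scale values of the detector
rho = [[0.7, 0.1], [0.1, 0.3]]      # normalized density matrix

p = [trace(mul(rho, Pk)) for Pk in P]          # response probabilities
E_stat = sum(pk*ak for pk, ak in zip(p, a))    # statistical expectation

A = [[sum(a[k]*P[k][i][j] for k in range(3)) for j in range(2)]
     for i in range(2)]                        # measured quantity
E_quantum = trace(mul(rho, A))                 # quantum expectation <A>
assert abs(E_stat - E_quantum) < 1e-12

# finite sample means approximate E_stat (relative frequencies -> probabilities)
random.seed(0)
N = 100000
total = 0.0
for _ in range(N):
    r = random.random()
    k = 0 if r < p[0] else (1 if r < p[0] + p[1] else 2)
    total += a[k]
assert abs(total/N - E_stat) < 0.02
```

The simulation at the end illustrates the operational side: the sample mean converges to the statistical, hence quantum, expectation as the number of detection events grows.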
(In general, the density operator is not necessarily normalized to intensity ##1##; without this normalization, we call ##\langle A\rangle## the quantum value of ##A## since it does not satisfy all properties of an expectation.)
Projective measurements
The conventional version of Born’s rule – the traditional starting point relating quantum theory to measurement in terms of eigenvalues, found in all textbooks on quantum mechanics – is obtained by specializing the general result to the case of exact projective measurements. The spectral notions do not appear as postulated input as in traditional expositions, but as consequences of the derivation in a special case – the case where ##A## is a self-adjoint operator, hence has a spectral resolution with real eigenvalues ##a_k##, and the ##P_k## are the projection operators to the eigenspaces of ##A##. In this special case, we recover the traditional setting with all its ramifications together with its domain of validity. This sheds new light on the understanding of Born’s rule and eliminates the most problematic features of its uncritical use.
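For instance, a minimal sketch with the Pauli matrix ##\sigma_x## as the measured quantity (the state is a made-up illustration value):

```python
# Projective special case: the quantum measure consists of the spectral
# projectors of a self-adjoint operator, here A = sx with eigenvalues +1, -1
# and projectors P_plus = (I + sx)/2, P_minus = (I - sx)/2.

def mul(A, B):  # 2x2 matrix product
    return [[sum(A[i][k]*B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def trace(A):
    return A[0][0] + A[1][1]

sx = [[0.0, 1.0], [1.0, 0.0]]
P_plus = [[0.5, 0.5], [0.5, 0.5]]      # (I + sx)/2, eigenvalue +1
P_minus = [[0.5, -0.5], [-0.5, 0.5]]   # (I - sx)/2, eigenvalue -1

# projective: idempotent and mutually orthogonal (unlike general P_k)
assert mul(P_plus, P_plus) == P_plus and mul(P_minus, P_minus) == P_minus
assert all(abs(mul(P_plus, P_minus)[i][j]) < 1e-12
           for i in range(2) for j in range(2))

rho = [[0.7, 0.1], [0.1, 0.3]]
p_plus = trace(mul(rho, P_plus))       # traditional Born probabilities
p_minus = trace(mul(rho, P_minus))
assert abs(p_plus + p_minus - 1.0) < 1e-12
# expectation from eigenvalues = quantum expectation tr(rho sx)
assert abs((p_plus - p_minus) - trace(mul(rho, sx))) < 1e-12
```

The idempotence and orthogonality checks are exactly what distinguishes this special case from a general quantum measure.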
Many examples of realistic measurements are shown to be measurements according to the DRP but have no interpretation in terms of eigenvalues. For example, joint measurements of position and momentum with limited accuracy, essential for recording particle tracks in modern particle colliders, cannot be described in terms of projective measurements; Born’s rule in its pre-1970 forms (i.e., before POVMs were introduced to quantum mechanics) does not even have an idealized terminology for them. Thus the scope of the DRP is far broader than that of the traditional approach based on highly idealized projective measurements. The new setting also accounts for the fact that in many realistic experiments, the final measurement results are computed from raw observations, rather than being directly observed.
Operational definitions of quantum concepts
Based on the detector response theorem, one gets an operational meaning for quantum states, quantum detectors, quantum processes, and quantum instruments, using the corresponding versions of quantum tomography.
In quantum state tomography, one determines the state of a quantum system with a ##d##-dimensional Hilbert space by measuring sufficiently many quantum expectations and solving a subsequent least squares problem (or a more sophisticated optimization problem) for the ##d^2-1## unknowns of the state. Quantum tomography for quantum detectors, quantum processes, and quantum instruments proceeds in a similar way.
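For a qubit (##d=2##) the reconstruction can be sketched as follows; here the "measured" expectations are computed exactly from a known state, so plain linear inversion replaces the least squares step (with noisy data one would fit instead):

```python
# Qubit state tomography by linear inversion: the d^2 - 1 = 3 real
# unknowns of a normalized qubit state are the expectations of the
# Pauli matrices. rho_true is a made-up illustration value.

def mul(A, B):  # 2x2 matrix product
    return [[sum(A[i][k]*B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def trace(A):
    return A[0][0] + A[1][1]

I2 = [[1, 0], [0, 1]]
sx = [[0, 1], [1, 0]]
sy = [[0, -1j], [1j, 0]]
sz = [[1, 0], [0, -1]]

rho_true = [[0.8, 0.1 - 0.2j], [0.1 + 0.2j, 0.2]]

vx = trace(mul(rho_true, sx)).real   # "measured" quantum expectations
vy = trace(mul(rho_true, sy)).real
vz = trace(mul(rho_true, sz)).real

# reconstruction: rho = (I + vx*sx + vy*sy + vz*sz)/2
rho_rec = [[(I2[i][j] + vx*sx[i][j] + vy*sy[i][j] + vz*sz[i][j]) / 2
            for j in range(2)] for i in range(2)]

for i in range(2):
    for j in range(2):
        assert abs(rho_rec[i][j] - rho_true[i][j]) < 1e-12
```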
These techniques serve as foundations for far-reaching derived principles; for quantum systems with a low-dimensional density matrix, they are also practically relevant for the characterization of sources, detectors, and filters. A quantum process, also called a linear quantum filter, is formally described by a completely positive map. The operator sum expansion of completely positive maps forms the basis for the derivation of the dynamical laws of quantum mechanics – the quantum Liouville equation for density operators, the conservative time-dependent Schrödinger equation for pure states in a nonmixing medium, and the dissipative Lindblad equation for states in mixing media – by a continuum limit of a sequence of quantum filters. This derivation also reveals the conditions under which these laws are valid.
An analysis of the oscillations of quantum values of states satisfying the Schrödinger equation produces the Rydberg-Ritz combination principle underlying spectroscopy, which marked the onset of modern quantum mechanics. It is shown that in quantum physics, normalized density operators play the role of phase space variables, in complete analogy to the classical phase space variables position and momentum. Observations with highly localized detectors naturally lead to the notion of quantum fields whose quantum values encode the local properties of the universe.
Thus the DRP leads naturally to all basic concepts and properties of modern quantum mechanics. It is also shown that quantum physics has a natural phase space structure where normalized density operators play the role of quantum phase space variables. The resulting quantum phase space carries a natural Poisson structure. Like the dynamical equations of conservative classical mechanics, the quantum Liouville equation has the form of Hamiltonian dynamics in a Poisson manifold; only the manifold is different.
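A small numerical sketch of the quantum Liouville equation ##\dot\rho = -i[H,\rho]## (with ##\hbar=1##; Hamiltonian and state are made-up illustration values): the right-hand side is trace-free and Hermitian-preserving, as Hamiltonian dynamics requires.

```python
# Quantum Liouville equation d rho/dt = -i[H, rho] (hbar = 1):
# one explicit Euler step. The exact flow would be rho(t) = U(t) rho U(t)^*.

def mul(A, B):  # 2x2 matrix product
    return [[sum(A[i][k]*B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def trace(A):
    return A[0][0] + A[1][1]

def commutator(A, B):
    AB, BA = mul(A, B), mul(B, A)
    return [[AB[i][j] - BA[i][j] for j in range(2)] for i in range(2)]

H = [[0.5, 0.0], [0.0, -0.5]]                 # toy Hamiltonian H = sz/2
rho = [[0.7, 0.1 + 0.2j], [0.1 - 0.2j, 0.3]]  # normalized density matrix

dt = 1e-3
C = commutator(H, rho)
rho_next = [[rho[i][j] - 1j*dt*C[i][j] for j in range(2)] for i in range(2)]

# trace is conserved exactly (tr[H, rho] = 0); hermiticity is preserved
# because (-i[H, rho]) is Hermitian for Hermitian H and rho
assert abs(trace(rho_next) - 1) < 1e-12
assert abs(rho_next[0][1] - rho_next[1][0].conjugate()) < 1e-12
```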
Philosophical consequences
The new approach has significant philosophical consequences. When a source is stationary, response rates, probabilities, and hence quantum values can in principle be measured with arbitrary accuracy, in a reproducible way. Thus they are operationally quantifiable, independent of an observer. This makes them objective properties, in the same sense in which, in classical mechanics, positions and momenta are objective properties. Thus quantum values are seen to be objective, reproducible elements of reality in the sense of the famous paper
- A. Einstein, B. Podolsky, and N. Rosen, Can quantum-mechanical description of physical reality be considered complete? Phys. Rev. 47 (1935), 777-781.
The assignment of states to stationary sources is as objective as any assignment of classical properties to macroscopic objects. In particular, probabilities appear – as in classical mechanics – only in the context of statistical measurements. Moreover, all probabilities are objective frequentist probabilities in the sense employed everywhere in experimental physics, classical and quantum. Like all measurements, probability measurements are of limited accuracy, probabilities being approximately measurable as observed relative frequencies.
Among all quantum systems, classical systems are characterized as those whose observable features can be correctly described by local equilibrium thermodynamics, as predicted by nonequilibrium statistical mechanics. This leads to a new perspective on the quantum measurement problem and connects to the thermal interpretation of quantum physics, discussed in detail in my 2019 book ‘Coherent Quantum Physics‘ (de Gruyter, Berlin 2019).
Conclusion
To summarize, the new approach gives an elementary and self-contained deductive approach to quantum mechanics. A suggestive notion for what constitutes a quantum detector and for the behavior of its responses leads to a definition of measurement from which the modern apparatus of quantum mechanics can be derived in full generality. The statistical interpretation of quantum mechanics is not assumed, but the version of it that emerges is discussed in detail. The standard dynamical and spectral rules of introductory quantum mechanics are derived with little effort. At the same time, we find the conditions under which these standard rules are valid. A thorough, precise discussion is given of various quantitative aspects of uncertainty in quantum measurements. Normalized density operators play the role of quantum phase space variables, in complete analogy to the classical phase space variables position and momentum.
There are implications of the new approach for the foundations of quantum physics. By shifting the attention from the microscopic structure to the experimentally accessible macroscopic equipment (sources, detectors, filters, and instruments) we get rid of all potentially subjective elements of quantum theory. There are natural links to the thermal interpretation of quantum physics as defined in my book.
The new picture is simpler and more general than the traditional foundations, and closer to actual practice. This makes it suitable for introductory courses on quantum mechanics. Complex matrices are motivated from the start as a simplification of the mathematical description. Both conceptually and in terms of motivation, introducing the statistical interpretation of quantum mechanics through quantum measures is simpler than introducing it in terms of eigenvalues. To derive the most general form of Born’s rule from quantum measures one just needs simple linear algebra, whereas even to write down Born’s rule in the traditional eigenvalue form, unfamiliar stuff about wave functions, probability amplitudes, and spectral representations must be swallowed by the beginner – not to speak of the difficult notion of self-adjointness and associated proper boundary conditions, which is traditionally simply suppressed in introductory treatments.
Thus there is no longer an incentive for basing quantum physics on measurements in terms of eigenvalues – a special, highly idealized case – in place of the real thing.
Postscript
In the meantime I revised the paper. The new version is better structured and contains a new section on high precision quantum measurements, where the 12-digit accuracy determination of the gyromagnetic ratio through the observation and analysis of a single electron in a Penning trap is discussed in some detail. The standard analysis assumes that the single electron is described by a time-dependent density operator following a differential equation. While in the original papers this involved arguments beyond the traditional (ensemble-based and knowledge-based) interpretations of quantum mechanics, the new tomography-based approach applies without difficulties.
Full Professor (Chair for Computational Mathematics) at the University of Vienna, Austria
"
Of course the eigenvalues of the spin component (magnetic moment, which is proportional to it) in direction of the magnetic field are involved, leading to the prediction of two strips on the screen.
"
But they are involved in the dynamics of the particles, not in their measurement. Thus their appearance is independent of the measurement itself (which only involves the screen) and involves no eigenvalues.
"
Of course the eigenvalues of the spin component (magnetic moment, which is proportional to it) in direction of the magnetic field are involved, leading to the prediction of two strips on the screen. In the standard description for an electron moving through an inhomogeneous magnetic field of the right kind it's simply that the magnetic field leads to an entanglement between position and this spin component, i.e., an Ag-atom beam splits into two pieces which are pretty well separated; one beam contains with almost 100% probability spin-up Ag atoms and the other spin-down Ag atoms. Blocking one beam is an almost perfect preparation for Ag-atoms with determined spin components.
The "eigenvalue stuff" however gives a straightforward relation between the abstract mathematical object (self-adjoint operator on a Hilbert space) and physical quantities/observables: values you find when measuring the observable accurately.
"
Not really.
What is measured in a Stern-Gerlach experiment is simply the silver intensity on the screen. To interpret the latter as a spin measurement you need to invoke the quantum mechanical model of the whole setting – and this in the shut up and calculate version only. Eigenvalues or Born's rule are not involved here at all! So one doesn't expect a use for POVMs either.
"
I don't know how you describe even a Stern-Gerlach experiment with a POVM
"
The experiment is described and explained just using shut up and (semiclassical) calculate! Probabilities are not needed, only beam intensities. Using either Born's rule or POVMs would be theoretical overkill.
Stern and Gerlach had neither POVMs nor Born's rule, and their experiment (including variations) can be interpreted without any probability arguments; the only information needed beyond semiclassical models is the fact that with low intensity beams one needs a longer exposure to produce the detailed pattern.
"
The math doesn't seem so much more difficult than the standard QT description.
"
In fact the math is both simpler and more general, and hence to be preferred over Born's rule. No eigenvalue stuff is needed.
"
The problem is that it's not clear how to make the connection with equipment in the lab, which is not a problem in the standard description at all.
"
One makes the connection with equipment without any reference to either POVM or Born's rule. Instead one refers to known properties of the equipment and the established relations to theory.
The physical idea behind POVMs is just the detector response principle discussed in Section 2.2 of my paper, together with the statement of Theorem 2.1. The proof is relevant only for those who want to understand the mathematical concept behind.
"
The math doesn't seem so much more difficult than the standard QT description. The problem is that it's not clear how to make the connection with equipment in the lab, which is not a problem in the standard description at all.
In the standard description you simply predict the probability for finding the silver atoms on the screen after having gone through the magnet and compare it to what's measured. The agreement is not great but satisfactory.
The high-accuracy version of measuring magnetic moments with a Penning trap, which we also have discussed above, is also described in such a way, but also in this case you didn't derive the POVM from this setup but just described something with words. It's not better than the standard descriptions of experiments about quantum objects, and indeed you are right, mostly the measurement device is treated with classical physics, but that's not so surprising since measurement devices are macroscopic objects which are well described by classical physics. But that still doesn't answer my question of how to get a concrete POVM. Of course also the other direction would be interesting, i.e., how to design an experiment for a given POVM. But this too I've not seen anywhere yet.
I simply would like to understand what's behind this POVM idea in a physical context, rather than as an abstract mathematical concept.
"
The physical idea behind POVMs is just the detector response principle discussed in Section 2.2 of my paper, together with the statement of Theorem 2.1. The proof is relevant only for those who want to understand the mathematical concept behind.
"
We obviously have a different understanding of what "concrete" means. I don't know how experimental particle physicists program their computers, but I'm pretty sure it's not based on the POVM paradigm of quantum measurement theory.
"
That's not necessary to apply the POVM principles. They also don't base it on the projective paradigm of quantum measurement theory that you favor.
Instead they rely on the claims of manufacturers or peers about how the equipment works. Almost all experimental reasoning is completely classical, together with a little semiclassical quantum mechanics for some crucial points.
"
I don't blame you, but I simply would like to understand what's behind this POVM idea in a physical context, rather than as an abstract mathematical concept.
It seems to be very difficult to construct it for even a much simpler setup, such as a measurement of "particle tracks" with a cloud chamber…
"
This is because your question is not adapted to how POVMs are actually used.
In practice, people either (i) have a given setup and want to calibrate it, so they do quantum tomography to find the POVM, or (ii) want to realize a given POVM with a suitable experiment. The latter is particularly relevant for potential applications in quantum computing.
In my paper, (i) is described in full detail and full generality in Section 2.2, and (ii) is described for a multiphoton setting in Section 4.1, and in more detail in the papers by Leonhardt cited there.
The exact POVM of concrete experiments is as complex as the experimental setting itself. But when one measures something one is not interested in these details unless one wants to construct a more accurate detector. Thus one usually idealizes the description to the bare minimum.
One can do the same for POVMs. For a joint measurement of position and momentum this is done in Section 4.2. The formula there is physically motivated and as simple as one can want it; the experimental realization is given in the paper cited there. For the partition of unity one can take any collection of hat functions describing the smearing, divided by their sum.
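The hat-function construction mentioned here can be sketched numerically; grid spacing and width are made-up illustration values:

```python
# Hat functions h_k centered on a grid, divided by their sum, give a
# partition of unity: f_k(x) >= 0 and sum_k f_k(x) = 1. Multiplication
# by f_k(x) then defines one element of a smeared-position quantum measure.

centers = [-2.0, -1.0, 0.0, 1.0, 2.0]
width = 1.5                      # smearing width of each hat

def hat(x, c):
    # triangular hat centered at c, support [c - width, c + width]
    return max(0.0, 1.0 - abs(x - c)/width)

def f(k, x):
    s = sum(hat(x, c) for c in centers)
    return hat(x, centers[k])/s if s > 0 else 0.0

# partition of unity on sample points inside the covered region
for x in [-1.9, -0.7, 0.0, 0.3, 1.8]:
    total = sum(f(k, x) for k in range(len(centers)))
    assert abs(total - 1.0) < 1e-12
    assert all(f(k, x) >= 0.0 for k in range(len(centers)))
```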
"
It seems to be very difficult to construct it for even a much simpler setup, such as a measurement of "particle tracks" with a cloud chamber…
There are many words, no concrete construction of the POVM.
"
The words define in the first sentence a unique quantum measure. The relabeling needed to get the POVM is described in the remainder, and can be exactly described by the computer programs used to analyze the video (in your version of the experiment), summing the contributions that lead to the same label. Since you described the analysis of the video in words only, I cannot do better.
Thus the construction of the POVM is as concrete as your gedanken experiment.
"
You never gave a concrete answer. That's the problem.
Where is the concrete POVM given in this paper?
"
The very concrete answer is in the last paragraph of Section 4.4 on p.32; I have referred to it several times. The POVM consists of the quantum measure together with a reindexing of the matrices by the measurement results.
That's not true. You always got the answer that it can be done only if you specify the full measuring process from the interaction of the measured system till the measurement results. And then you lost patience or interest, and didn't follow up on my answers.
"
You never gave a concrete answer. That's the problem.
"
Mott does not perform a single measurement in his analysis. So how can one extract a POVM from his discussion if nothing is measured? The POVM would depend on details about how the cloud chamber track is observed to actually get the results of the measurement.
Now that you defined a recipe for getting a position and momentum that can be carried out experimentally, this is indeed possible. Your recipe leads to a POVM in essentially the same way as my analysis in Section 4.4 of v4 of my quantum tomography paper, except that the grid of wires is replaced by a grid of pixels encoding the video. That this POVM is complicated comes from the fact that extracting a position and a momentum from a video is complicated.
"
Where is the concrete POVM given in this paper?
I am assuming ##|D_H\rangle,|D_V\rangle## are members of ##\mathcal{H}_\mathrm{phonon}\otimes\mathcal{H}_\mathrm{detector}## but not ##\mathcal{H}_\mathrm{detector}##, such that e.g. ##U^\dagger|D_H\rangle = |H\rangle|D_0\rangle## but maybe this is too naive
"
This is not too naive but too sloppy to be correct. In fact you define
##|D_H\rangle :=U(|H\rangle|D_0\rangle)## and similarly ##|D_V\rangle##, and your formula follows.
"
I always get the answer that one cannot do that.
"
That's not true. You always got the answer that it can be done only if you specify the full measuring process from the interaction of the measured system till the measurement results. And then you lost patience or interest, and didn't follow up on my answers.
"
"
The simplest example of an unsharp joint measurement of position and momentum of a particle seems to be the example that is described (imho fully satisfactorily) by Mott in the famous paper about the tracks of a charged particle in a cloud chamber.
"
Mott does not perform a single measurement in his analysis. So how can one extract a POVM from his discussion if nothing is measured? The POVM would depend on details about how the cloud chamber track is observed to actually get the results of the measurement.
"
One can extend this indeed to an observation of approximate positions and momenta by simply taking a movie and measuring the position of the end of the track as a function of time and then deducing both a position and the momentum of the particle along this track. Shouldn't it be possible to describe this (gedanken) experiment in terms of a POVM?
"
Now that you defined a recipe for getting a position and momentum that can be carried out experimentally, this is indeed possible. Your recipe leads to a POVM in essentially the same way as my analysis in Section 4.4 of v4 of my quantum tomography paper, except that the grid of wires is replaced by a grid of pixels encoding the video. That this POVM is complicated comes from the fact that extracting a position and a momentum from a video is complicated.
My challenge stands: The simplest example of an unsharp joint measurement of position and momentum of a particle seems to be the example that is described (imho fully satisfactorily) by Mott in the famous paper about the tracks of a charged particle in a cloud chamber. One can extend this indeed to an observation of approximate positions and momenta by simply taking a movie and measuring the position of the end of the track as a function of time and then deducing both a position and the momentum of the particle along this track. Shouldn't it be possible to describe this (gedanken) experiment in terms of a POVM?
"
…or each measurement on the system measures a different state of the system but never the projected one.
"
The typical example is a faint temporary interaction which changes the system state only slightly (not by projection) but leaves an irreversible record, hence counts as measurement. Particle track detectors are based on this.
"
Repeated, nondestructive measurements of the same quantity of the same system should yield the same result each time, I think. This isn't guaranteed for POVMs, right? Is there a distinction between asserting that a detector imperfectly measures a standard observable built from a projective decomposition, and a detector exactly measuring a quantity built from a POVM like the one above?
"
The distinction is the word 'projective'. Dirac and von Neumann considered only measurements whose repetition yields the same result each time, and were thus led to the class of projective measurements.
But most measurements in practice are not of this kind, as either each realization of the system can typically be measured only once, or each measurement on the system measures a different state of the system, never the projected one.
"
…the measurement of a "trajectory" of a single particle in a cloud chamber. You can put some radioactive material in there and make a movie of the tracks forming, i.e., you can measure directly how the track forms, i.e., a position-momentum joint measurement. The standard quantum description a la Mott is very clear and for me describes the appearance of "trajectories" in this setup satisfactorily, but maybe it's interesting to discuss it within the POVM framework too?
"
The POVM description of this is similar to that of the joint position-momentum measurement of particle tracks in my Section 4.4, except that the arrangement of wires is replaced by the pixels of the video taken.
"
So my challenge stands: How can the abstract POVM formalism be made applicable to describe a real-world experiment in the lab?
"
The whole of Section 4 of my paper is devoted to real-world experiments that use POVMs rather than projective measurements, with reference to other people's work.
"
I've never heard anybody using it to describe a real-world experiment yet.
"
This can only mean that you never bothered to read the associated literature. For example, the quantum information textbook by Nielsen and Chuang is full of POVMs.
"
Most introductions to quantum mechanics describe only projective measurements, and consequently the general description of measurements given in Postulate 3 may be unfamiliar to many physicists, as may the POVM formalism described in Section 2.2.6. The reason most physicists don’t learn the general measurement formalism is because most physical systems can only be measured in a very coarse manner. In quantum computation and quantum information we aim for an exquisite level of control over the measurements that may be done, and consequently it helps to use a more comprehensive formalism for the description of measurements. […]
A physicist trained in the use of projective measurements might ask to what end we start with the general formalism, Postulate 3? There are several reasons for doing so. First, mathematically general measurements are in some sense simpler than projective measurements, since they involve fewer restrictions on the measurement operators; there is, for example, no requirement for general measurements analogous to the condition ##P_iP_j = \delta_{ij}P_i## for projective measurements. This simpler structure also gives rise to many useful properties for general measurements that are not possessed by projective measurements. Second, it turns out that there are important problems in quantum computation and quantum information – such as the optimal way to distinguish a set of quantum states – the answer to which involves a general measurement, rather than a projective measurement. A third reason [… is …] the fact that many important measurements in quantum mechanics are not projective measurements.
"
"
Another (gedanken) experiment is the measurement of a "trajectory" of a single particle in a cloud chamber. You can put some radioactive material in there and make a movie of the tracks forming, i.e., you can measure directly how the track forms, i.e., a position-momentum joint measurement. The standard quantum description a la Mott is very clear and for me describes the appearance of "trajectories" in this setup satisfactorily, but maybe it's interesting to discuss it within the POVM framework too?
…what's problematic with the standard treatment in Brown and Gabrielse's RMP article.
"
What's problematic, especially for the ensemble interpretation (and Rigo et al. explicitly acknowledge that it goes beyond the standard treatment), is that they use a density operator to describe a single electron rather than an ensemble of electrons. This ensemble is purely imagined (as Brown and Gabrielse state explicitly) and has no physical reality. It is needed to derive the formula by which the gyromagnetic ratio is measured.
"
In the Penning-trap case, aren't the currents of the "mirror charges" observed in the trap electrodes?
"
Their paper says that they measured two particular frequencies (how doesn't really matter for my paper, but you can find more details by reading their paper yourself), whose quotient gives the gyromagnetic ratio to 12 decimal places.
"
In Sect. 9.4 you also don't construct a POVM explicitly for this standard Penning-trap setup.
"
This is because parameter determination such as that of the gyromagnetic ratio, and in fact most of spectroscopy, is not a quantum measurement in the sense of Born's rule, nor is it one in the sense of POVMs. But it uses the objective existence of the density operator and its dissipative dynamics, which are consequences of the detector response principle DRP on which the whole paper is based. That's why I added the material to the paper.
Note that my paper is not primarily about POVMs but about how quantum tomography explains quantum mechanics. Deriving the POVM formalism, including Born's rule where it applies is only a small part of the whole story.
I also don't understand what's problematic with the standard treatment in Brown and Gabrielse's RMP article. It's just 1st-order perturbation theory in RPA approximation (in Sect. V.A).
The evaluation of experimental data never uses the quantum formalism. The interpretation of these data of an electron in a Penning trap, however, and thus the mapping of the measured "beat frequencies" to the value of ##(g-2)##, is based on the QT formalism.
"
In the meantime I revised my tomography paper. The new version is better structured and contains a new section on high precision quantum measurements, where the 12 digit accuracy determination of the gyromagnetic ratio through the observation and analysis of a single electron in a Penning trap is discussed in some detail.
The standard analysis assumes that the single electron is described by a time-dependent density operator following a differential equation. While in the original papers this involved arguments beyond the traditional (ensemble-based and knowledge-based) interpretations of quantum mechanics, the new tomography-based approach applies without difficulties.
Perhaps you touched upon this elsewhere.
From the perspective of inference: learning the Hamiltonian is as much of a challenge as learning the initial state, and in a real situation the two problems must interfere with each other. I think determining the state is referred to as state tomography, while determining the Hamiltonian is process tomography?
"
Yes, in the special case where the system is conservative; otherwise no Hamiltonian exists, only a system of Lindblad generators.
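A minimal numerical sketch of the distinction (with a made-up Hamiltonian and damping rate, not tied to any particular trap): in the dissipative case the dynamics is generated by a Lindblad operator in addition to the Hamiltonian, ##\dot\rho = -i[H,\rho] + L\rho L^\dagger - \tfrac12\{L^\dagger L,\rho\}##.

```python
import numpy as np

# Hypothetical illustration values: a qubit with level splitting and decay.
H = np.array([[0.5, 0.0], [0.0, -0.5]])                # Hamiltonian part
L = np.sqrt(0.1) * np.array([[0.0, 1.0], [0.0, 0.0]])  # Lindblad decay generator

def lindblad_rhs(rho):
    # d(rho)/dt = -i[H, rho] + L rho L† - (1/2){L†L, rho}
    comm = -1j * (H @ rho - rho @ H)
    diss = (L @ rho @ L.conj().T
            - 0.5 * (L.conj().T @ L @ rho + rho @ L.conj().T @ L))
    return comm + diss

rho = np.array([[0.5, 0.5], [0.5, 0.5]], dtype=complex)  # initial pure state
dt = 0.001
for _ in range(5000):
    rho = rho + dt * lindblad_rhs(rho)  # simple Euler integration

assert np.isclose(np.trace(rho).real, 1.0)  # trace is preserved exactly
assert rho[1, 1].real < 0.5                 # excited-state population decays
```

No Hamiltonian alone could produce the decay of the excited-state population; it is the Lindblad generator that encodes the dissipation.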
"
If one insists that knowledge of the state and knowledge of the unitary evolution applied to the state both qualify as "information", how would one realize process and state tomography simultaneously?
"
This is called self-calibrating tomography. See p.38 of my paper, where I also give some references where further details can be found.
Am I missing something?
"
Probably not. Less formally, a von Neumann observation is represented by mutually orthogonal positive operators ##E_i## such that ##\sum_i E_i = 1##. A POVM is simply a generalization that removes the orthogonality requirement. It turns out Gleason's theorem is much easier to prove for POVMs. In practice, they occur when, for example, you observe a system with a probe and then observe the probe. See for example:
http://www.quantum.umb.edu/Jacobs/QMT/QMT_Chapter1.pdf
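The probe construction mentioned above can be sketched numerically. This is a generic textbook-style illustration (a random joint unitary, not any specific experiment): a projective measurement on a probe qubit induces a POVM on the system.

```python
import numpy as np

# Observing a system indirectly: couple a system qubit to a probe qubit
# (prepared in |0>) by a joint unitary U, then measure the probe
# projectively. The induced measurement on the system alone is a POVM.
rng = np.random.default_rng(0)

# A random joint unitary on system x probe, obtained via QR decomposition
A = rng.normal(size=(4, 4)) + 1j * rng.normal(size=(4, 4))
U, _ = np.linalg.qr(A)

ket0 = np.array([1.0, 0.0])
ket1 = np.array([0.0, 1.0])

# Embed the probe in |0>: V = U (I_sys ⊗ |0>_probe), a 4x2 matrix
V = U @ np.kron(np.eye(2), ket0.reshape(2, 1))

# Kraus operators M_i = (I_sys ⊗ <i|_probe) U (I_sys ⊗ |0>_probe)
M = [np.kron(np.eye(2), bra.reshape(1, 2)) @ V for bra in (ket0, ket1)]

# POVM elements on the system: E_i = M_i† M_i, Hermitian and complete
E = [Mi.conj().T @ Mi for Mi in M]
assert np.allclose(sum(E), np.eye(2))
```

For a generic coupling unitary the resulting elements ##E_i## are not projectors, even though the measurement actually performed on the probe is projective.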
Thanks
Bill
Of course, the frequencies in spectroscopy are differences of energy levels and not directly related to Born's rule but rather to the quantum dynamics, usually derived by 1st-order time-dependent PT in the dipole approximation for (spontaneous and induced) photon emission, i.e., based on the dynamical laws and the meaning of the Hamiltonian.
"
I agree. No interpretation is needed for this; it predates Born's rule by at least a year.
But it shows that Born's rule doesn't explain quantum measurements of a spectroscopic nature. The latter includes all high precision determinations of constants of Nature such as gyrofactors, mass ratios, etc.
"
The transition probabilities, also obtained in this same calculation, are of course based on Born's rule.
"
I agree. But this is independent of the value of the frequencies, and only the latter are measured in the experiment under discussion.
I stop this discussion at this point.
The evaluation of experimental data never uses the quantum formalism. The interpretation of these data of an electron in a Penning trap, however, and thus the mapping of the measured "beat frequencies" to the value of ##(g-2)##, is based on the QT formalism.
"
More precisely, on the quantum formalism without Born's rule.
"
This is of course not directly related to Born's rule
"
It is of course not related at all to it.
"
but to the evaluation of the energy eigenvalues of the electron in the trap (the "geonium"). Even the noisy signal should be predictable by QT
"
It is, again by the quantum formalism without Born's rule.
"
and that's then again based on Born's rule.
"
Only because you apply again your magic wand that turns every quantum calculation into an instance of Born's rule.
You measure it, e.g., by burning hydrogen and measuring the wavelengths of the light using a grating.
"
But nothing is burnt in a Penning trap, which is the example under discussion.
"
For that you accumulate a lot of photons. The intensity of the different lines is predicted using Born's rule.
"
But the intensity of the lines gives no information at all about the energy differences.
The frequency of the photons emitted by the trapped electron, and hence the determination of the position of the resonance peaks from which the high precision gyrofactor is computed is independent of Born's rule.
Sure, but the Born rule is one of the basic postulates that's behind all these pragmatic approaches.
"
behind all these??
How is Born's rule behind the measurement of the energy difference of two levels of a quantum system?
So you finally agree that standard quantum theory involves more than Born's rule in order to relate the mathematical formalism to experiment!
Indeed, standard quantum physics has a most pragmatic approach to the interpretation of the formalism: Anything goes that gives agreement with experiment, and Born's rule is just a tool applicable in some situations, whereas other tools (such as resonance observations or POVMs) apply in other situations.
Can we agree on that?
"
Sure, but the Born rule is one of the basic postulates behind all these pragmatic approaches. In Newtonian mechanics there's also much more than the "three laws" needed to apply them to the analysis of the phenomena, but they are behind all the corresponding methods.
This is similar to other "inverse problems". For example, in neuroscience, how can you get statistics from a single neuron when you put an electrode into neural tissue? One basic method is clustering: while the collected signal is influenced by many nearby neurons, each single neuron (like each quantum state) has its own "signature", so by clustering one can classify spikes that originate from the same neuron. This is not an exact mathematical inverse, though; it's always theoretically possible that you fail to resolve two neurons that just happen to have very similar signatures. But once the spikes are clustered, one collects statistics for single neurons.
"
Yes. The estimation of constants in mathematical models of reality (whether a growth parameter in a biological model or a gyrofactor in a model of a Penning trap) from noisy measurements is always an inverse problem.
It's on you to support your bold claim that standard QT cannot be used to understand these experimental results, which are usually explained by standard QT. Spectroscopy, i.e. the measurement of energy differences, has been the topic since day 1 of modern QT, described by standard QT (prediction of the frequencies and intensities of the em. radiation through transitions between atomic energy levels).
"
So you finally agree that standard quantum theory involves more than Born's rule in order to relate the mathematical formalism to experiment!
Indeed, standard quantum physics has a most pragmatic approach to the interpretation of the formalism: Anything goes that gives agreement with experiment, and Born's rule is just a tool applicable in some situations, whereas other tools (such as resonance observations or POVMs) apply in other situations.
Can we agree on that?
All the experiments, leading to several Nobel prizes (Dehmelt, Wineland, Haroche,…), with single electrons, atoms, ions, etc. in traps have been analyzed in the standard way of quantum theory. The trace formula to calculate expectation values is a direct consequence of the probabilities predicted in the formalism of QT using Born's rule.
"
"
I still also don't see why you think that collecting statistics by coupling a single quantum in a trap to an electrical circuit, or repeated excitation-de-excitation events via the emitted photons, etc., cannot be understood with standard quantum theory, although that's been done for decades. Indeed, the many excitation-relaxation processes via an external laser field define the ensemble in this example.
"
"
Another example with a single electron in a Penning trap is to measure the currents of the "mirror charges" in response to the motion of the electron in the trap
"
"
Which quantum observables of the electrons are measured by these currents? If Born's rule were involved, you should be able to point to the operators to which Born's rule is applied in this case.
"
"
You measure the energy differences via repeated spin flips of the electron in the trap via repeated two-photon excitations through a coherent rf field plus thermal excitation (Figs. 5 and 6 of Dehmelt's paper).
"
"
This measurement recipe is not covered by Born's rule since there is no operator on the electron Hilbert space whose eigenvalues are the energy differences.
So how do you think Born's rule applies in this case?
"
"
Well, perhaps you should read the paper more carefully (or the relevant original papers quoted in that review). Here it's Ref. [18]:
https://journals.aps.org/prl/abstract/10.1103/PhysRevLett.38.310
"
I have my own interpretation of what is going on, and it does not involve Born's rule.
But you claimed that the experiment is (like all experiments with ion traps) explained by Born's rule. For your convenience and that of the other readers I collected the whole train of your arguments.
I am challenging you to provide a proof of your claim in this particular instance. If you can't do it in the simple case of measuring energy differences, your claim is without any substance!
https://journals.aps.org/prl/abstract/10.1103/PhysRevLett.38.310
You measure the energy differences via repeated spin flips of the electron in the trap via repeated two-photon excitations through a coherent rf field plus thermal excitation (Figs. 5 and 6 of Dehmelt's paper).
"
This measurement recipe is not covered by Born's rule since there is no operator on the electron Hilbert space whose eigenvalues are the energy differences.
So how do you think Born's rule applies in this case?
Then you use one photon ("idler") to "herald" the other photon ("signal"), which you then use for experiments. This gives an ensemble of identically prepared single photons.
"
I agree. In this version nothing needs to be explained.
I was thinking of potential applications in quantum information processing, where the situation is different.
"
Another example with a single electron in a Penning trap is to measure the currents of the "mirror charges" in response to the motion of the electron in the trap
"
Which quantum observables of the electrons are measured by these currents? If Born's rule were involved, you should be able to point to the operators to which Born's rule is applied in this case.
Usually one uses heralded photons to prepare single-photon states, which are in fact not so easy to produce (in contradistinction to "dimmed down coherent states", which however are not equivalent to true single-photon states but consist largely of the vacuum state). One way, nowadays kind of standard, is to shine a laser on a BBO crystal and use the entangled photon pairs from parametric down conversion. Then you use one photon ("idler") to "herald" the other photon ("signal"), which you then use for experiments. This gives an ensemble of identically prepared single photons.
In the experiments with single atoms in a trap you usually use an external em. field to excite these atoms many times and measure the emitted photons. Another example with a single electron in a Penning trap is to measure the currents of the "mirror charges" in response to the motion of the electron in the trap, which also provides the statistics you need (see Dehmelt's or Brown's review papers quoted above).
All the experiments, leading to several Nobel prizes (Dehmelt, Wineland, Haroche,…), with single electrons, atoms, ions, etc. in traps have been analyzed in the standard way of quantum theory.
"
There is no standard way beyond pragmatism (anything successful goes) to do the matching of formalism to complex experiments.
From
"
My 'orthodoxy' is not identical to that of Bohr, nor to that of Peierls, to mention two especially eminent examples. Hence I must state my definition of 'orthodoxy'.
"
From
"
Orthodox QM, I am suggesting, consists of shifting between two different ways of understanding the quantum state according to context: interpreting quantum mechanics realistically in contexts where interference matters, and probabilistically in contexts where it does not. Obviously this is conceptually unsatisfactory (at least on any remotely realist construal of QM) — it is more a description of a practice than it is a stable interpretation. […] The ad hoc, opportunistic approach that physics takes to the interpretation of the quantum state, and the lack, in physical practice, of a clear and unequivocal understanding of the state — this is the quantum measurement problem.
"
"
The realization of "weak measurements" and the description with the more general concept of POVMs is pretty recent, and as far as I can see, it's not something contradicting the fundamental Born postulate
"
I never claimed a contradiction, just a non-applicability. One cannot derive from a postulate that only applies to large ensembles of independent and identically prepared systems any statement about a single system!
"
the many excitation-relaxation processes via an external laser field define the ensemble in this example.
"
If the processes are carried out identically, this indeed gives an ensemble of identically prepared photons. But if one only sends a handful of photons on demand to transmit a message (the primary reason why one would want to produce them on demand), one only gets an ensemble of non-identically prepared photons!
"
How else should you get statistics with a single quantum?
"
Through repeated measurements, with stochasticity induced by the unmodelled interaction with the environment. Just like in classical stochastic processes!
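The classical analogue can be sketched in a few lines. The Ornstein-Uhlenbeck process below (with arbitrary illustration parameters) shows how a single system observed repeatedly in time yields reliable statistics, with no ensemble of copies involved:

```python
import numpy as np

# Statistics from a SINGLE classical system via repeated observation in
# time: one trajectory of an Ornstein-Uhlenbeck process, whose time
# average converges to the stationary mean mu.
rng = np.random.default_rng(2)
theta, mu, sigma, dt = 1.0, 3.0, 0.5, 0.01  # made-up illustration values

x = 0.0
samples = []
for _ in range(100_000):
    # Euler-Maruyama step: dx = theta*(mu - x) dt + sigma dW
    x += theta * (mu - x) * dt + sigma * np.sqrt(dt) * rng.normal()
    samples.append(x)

time_average = np.mean(samples[10_000:])  # discard the initial transient
assert abs(time_average - mu) < 0.1       # single-trajectory statistics
```

The stochasticity here comes from the unmodelled noise term, just as in the quantum case it comes from the unmodelled interaction with the environment.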
I forgot to give the link:
"
Yes, and I don't see anything contradicting the standard way to relate the formalism of QED to observations. I also think that the idea that ##|\psi(t,\vec{x})|^2## refers to some kind of intensity in Schrödinger's first interpretation of the wave function was in analogy to the intensity of light, where it was known to be measured in terms of the energy density. This was however very soon realized not to be in accordance with the detection of particles (particularly electrons), which indeed leave a single point on a photo plate and not a smeared distribution, and this brought Born to his probabilistic interpretation (in a footnote of his paper on scattering theory of 1926). Today we can use QED to derive that for the em. field the detection probability is indeed proportional to the expectation value of the energy density: it just follows from first-order perturbation theory and the dipole approximation used to describe the photo effect. The formula to evaluate these expectation values is of course based on Born's rule (or postulate). That's all in the standard textbooks about quantum optics and used also in the papers referring to experiments with single photons and/or entangled photon pairs, including all kinds of Bell tests, entanglement swapping, teleportation, and all that.
I still also don't see why you think that collecting statistics by coupling a single quantum in a trap to an electrical circuit, or repeated excitation-de-excitation events via the emitted photons, etc., cannot be understood with standard quantum theory, although that's been done for decades. Indeed, the many excitation-relaxation processes via an external laser field define the ensemble in this example. How else should you get statistics with a single quantum?
The realization of "weak measurements" and the description with the more general concept of POVMs is pretty recent, and as far as I can see, it's not something contradicting the fundamental Born postulate, how QT probabilities and expectation values are related to the formalism (statistical operators to represent the state and self-adjoint (or unitary) operators for observables).
It's simply not true! As shown in the book by Peres in a very clear way the Born rule is underlying also the more general cases of POVMs.
"
Yes, but he assumes everywhere stationary sources, i.e., ensembles of identically prepared systems. Moreover, he assumes unphysical mathematical constructs called ancillas to reduce POVM measurements on these ensembles to Born's rule.
"
All the experiments, leading to several Nobel prizes (Dehmelt, Wineland, Haroche,…), with single electrons, atoms, ions, etc. in traps have been analyzed in the standard way of quantum theory.
"
They use for their analysis a pragmatic approach (i.e., whatever gives agreement with experiments serves as interpretation), not one strictly based on Born's rule. The latter has essential restrictions to apply!
"
Once more the citation of Peres's book:
I don't know whether he uses the phrase "weak measurement", but he discusses POVMs and gives a very concise description of what's predicted by QT. It seems to be very much along the lines you propose in your paper (as far as I think I understand it).
"
Yes, he discusses POVM in the usual, very abstract terms. But everywhere he assumes stationary sources, i.e., ensembles of identically prepared systems. Under this condition he gets the same as what I propose (with different assumptions, not assuming Born's rule).
Peres never discusses single quantum systems and does not use the term "weak measurement". In the Wikipedia reference I cited, the (standard) derivation of the quantum trajectories describing weak measurements only tells what the state is after a sequence of POVM measurements and what is the probability distribution for getting the whole sequence of results. To give meaning to this probability distribution via Born's rule one needs an ensemble of identically prepared systems giving an ensemble of sequences of measurement results! Otherwise one has only a single sequence of measurement results and the probability of getting this single one is 100%!
As we had discussed some years ago, Peres noticed (and does not resolve) this conflict when he enters philosophical discussions in the last chapter of his book (if I recall correctly, don't have the book at hand).
I discussed a different single photon scenario, that of "photons on demand", in a lecture given some time ago.
"
I forgot to give the link:
Once more the citation of Peres's book:
A. Peres, Quantum Theory: Concepts and Methods, Kluwer
Academic Publishers, New York, Boston, Dordrecht, London,
Moscow (2002).
I don't know whether he uses the phrase "weak measurement", but he discusses POVMs and gives a very concise description of what's predicted by QT. It seems to be very much along the lines you propose in your paper (as far as I think I understand it).
I was talking about the textbook by Peres, quoted in the posting you quote.
"
Please give a page number. If I remember correctly, Peres never mentions the notion of weak measurement. A search in scholar.google.com for
gives no hits at all.
I do not conflate the two. I'm talking about the meaning of the formalism, and that's probabilistic via Born's rule. All concepts related with the statistical meaning of the formalism are derived from Born's rule, including the trace rule for expectation values of observables.
"
I am also talking about the meaning of the formalism, but using more careful language. I do this without invoking Born's rule, which you take to be a blanket phrase covering everything probabilistic, independent of its origin. This blurs the conceptual distinctions and makes it impossible to discuss details with you.
"
Of course in measurements there is no Hilbert space, no operators, no trace rule, no Born's rule.
"
In the mathematical formalism there is also no Born's rule, but only the trace rule defining quantum expectations. Born's rule only relates the trace rule to measurements, and it does so only in special cases – namely when measurements are made on independent and identically prepared ensembles.
As long as there are no measurements – and this includes everything in books on quantum mechanics or quantum field theory when they derive formulas for scattering amplitudes or N-point functions -, everything is independent of Born's rule. The formula ##\langle A\rangle:=##Tr##\rho A## is just a definition of the meaning of the string on the left in terms of that on the right. It has a priori nothing to do with measurement, and hence with Born's rule.
But it seems to me that you simply equate Born's rule with the trace rule, independent of its relation to measurement. Equating these trivially makes everything dependent on Born's rule. But this makes Born's rule vacuous, and its application to measurements invalid in contexts where no ensemble of independent and identically prepared systems exists.
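The distinction argued for here (the trace rule as a mathematical definition, Born's rule as its reading in terms of measurements on i.i.d. ensembles) can be made concrete in a small numerical sketch with made-up matrices:

```python
import numpy as np

# The trace formula <A> := tr(rho A) is a purely mathematical definition.
# Only for a Hermitian A does it agree with the Born-rule average of
# eigenvalues weighted by the spectral probabilities p_k = tr(rho P_k).
rho = np.array([[0.7, 0.2], [0.2, 0.3]])  # a qubit density matrix
A = np.array([[1.0, 0.5], [0.5, -1.0]])   # a Hermitian observable

expectation = np.trace(rho @ A).real      # the trace rule

# Born's rule reading via the spectral decomposition of A
eigvals, eigvecs = np.linalg.eigh(A)
probs = [np.real(np.trace(rho @ np.outer(v, v.conj()))) for v in eigvecs.T]
born_average = sum(p * a for p, a in zip(probs, eigvals))
assert np.isclose(expectation, born_average)

# For a non-normal A (as often in QFT), tr(rho A) is still well-defined,
# but there is no spectral decomposition, hence no Born-rule reading.
A_nonnormal = np.array([[0.0, 1.0], [0.0, 0.0]])  # annihilation-type operator
val = np.trace(rho @ A_nonnormal)
print(val)
```

The last two lines illustrate the question posed below about non-normal operators: the left-hand side of ##\langle A\rangle = \tr \rho A## remains meaningful as a definition even when no measurement interpretation via eigenvalues is available.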
What's new is the order of presentation, i.e., it is starting from the most general case of "weak measurements" (described by POVMs)
"
Could you please point out to which paper (and which page) you refer here? I found no mention of weak measurements or POVMs in the geonium paper by Brown and Gabrielse that you mentioned earlier. The latter is quite interesting but very long, so it takes a lot of time to digest the details. I'll comment on it in due time in a new thread.
"
Maybe it would help if a concrete measurement were discussed, e.g., the nowadays standard experiment with single ("heralded") photons (e.g., produced with parametric down conversion using a laser and a BBO crystal, using the idler photon as the "herald" and then doing experiments with the signal photon).
"
I discussed a different single photon scenario, that of "photons on demand", in a lecture given some time ago:
Of course in measurements there is no Hilbert space, no operators, no trace rule, no Born's rule. You just measure observables and evaluate the statistics of their outcomes, take into account the specifics of the apparatus etc. There is no generally valid formalism for this but it has to be analyzed for any experimental setup. That's not what I'm discussing and it's not related to the interpretation of QT.
I give up; obviously I'm unable to understand your point of view.
"
Is it so difficult to understand that
Once you can accept that one can make this difference, you'll be able to understand everything I said. And you'll benefit a lot from this understanding!
To get expectation values you need the probabilities/probability distributions, which are given by Born's rule in the formalism.
"
No.
Point 3 is a mathematically precise version of your statement that a state is given by an equivalence class of identically prepared systems.
"
That interpretation of the state, ##\hat{\rho}##, leads immediately to ##\langle A \rangle=\mathrm{Tr}(\hat{\rho} \hat{A})##. For me all that is subsumed under "Born's rule". Instead of saying "Born's rule" I also could say "the probabilistic interpretation of ##\hat{\rho}##", but that's very unusual among physicists.
"
These are your magic wand and your magic spell, with which everything done in quantum mechanics looks as being based on Born's rule.
But your magic ignores the assumptions in Born's rule, hence is like concluding ##1=2## from ##x=2x## by division through ##x## without checking the assumption ##x\ne 0##.
"
If Born's rule were not applicable here, the experimental results couldn't be understood with standard QT, but they obviously are!
"
They are understandable with the pragmatic use of the quantum formalism that uses whatever interpretation explains an experiment. They are not understandable in terms of only Born's rule, since in experiments with single quantum systems the assumption in Born's rule cannot be satisfied.
"
How then can it be that these results are very accurately described by Q(F)T, which uses Born's rule to predict this value of (g-2)?
"
These results are very accurately described by Q(F)T, which uses only mathematics (and not Born's rule) to predict this value of g-2. QED predicts the correct value of g-2 from the QED action purely by mathematical calculations, without any reference to measurement. Hence one has nowhere an opportunity to use Born's rule, since the latter only says something about quantum observables measured by means of averaging over measurement results obtained from independent and identically prepared systems.
Born's rule would however be needed to interpret probabilities measured from scattering experiments, for which Weinberg correctly invokes Born's rule. This is a typical case where the assumption present in Born's rule is satisfied.
Though not interpretable in terms of Born's rule or POVMs, such processes are able to describe single time-dependent quantum systems, just as classical stochastic processes are able to describe single time-dependent classical systems.
"
How then can it be that these results are very accurately described by Q(F)T, which uses Born's rule to predict this value of (g-2)?
There I discuss the case of nonstationary quantum systems.
Please do not confuse contradictions and non-applicability! These are two very different things!
"
If Born's rule were not applicable here, the experimental results couldn't be understood with standard QT, but they obviously are!
Born's rule is not just taking averages of anything!
I use quantum expectations all the time, but Born's rule only when I interpret a quantum expectation in terms of measuring independent and identical prepared systems – which is a necessary requirement for Born's rule to hold.
How do you define the experimental meaning of ##\langle A\rangle## when ##A## is not normal, which is often the case in QFT?
"
To get expectation values you need the probabilities/probability distributions, which are given by Born's rule in the formalism. That interpretation of the state, ##\hat{\rho}##, leads immediately to ##\langle A \rangle=\mathrm{Tr}(\hat{\rho} \hat{A})##. For me all that is subsumed under "Born's rule". Instead of saying "Born's rule" I also could say "the probabilistic interpretation of ##\hat{\rho}##", but that's very unusual among physicists.
New Measurement of the Electron Magnetic Moment and the Fine Structure Constant
"A measurement using a one-electron quantum cyclotron gives the electron magnetic moment in Bohr magnetons, g/2 = 1.001 159 652 180 73 (28) [0.28 ppt], with an uncertainty 2.7 and 15 times smaller than for previous measurements in 2006 and 1987."
— https://arxiv.org/abs/0801.1134
"
"
How then can it be that, e.g., the measurement of the gyrofactor of the electron using a Penning trap is as precise as it is?
"
The measurement of the gyrofactor of the electron using a Penning trap is as precise as it is
because certain experimental situations happen to have very accurate descriptions in terms of a few-parameter quantum stochastic process, and the gyrofactor is one of these parameters.
Though not interpretable in terms of Born's rule or POVMs, such processes are able to describe single time-dependent quantum systems, just as classical stochastic processes are able to describe single time-dependent classical systems.
The facts that there are only very few parameters and that one can measure arbitrarily long time series imply that one can use statistical parameter estimation techniques to find the parameters to arbitrary accuracy. The fact that the models are accurate implies that the parameters found for the gyrofactor accurately represent the gyrofactor.
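This argument about few-parameter estimation from long time series can be illustrated with a toy inverse problem (all numbers invented, with no connection to the actual Penning-trap analysis):

```python
import numpy as np

# Estimating a single frequency parameter from a long noisy time series,
# in the spirit of fitting a resonance: the estimate sharpens as the
# record grows, which is why a few-parameter model plus an arbitrarily
# long time series yields high-accuracy parameter values.
rng = np.random.default_rng(1)
f_true = 0.123456  # hypothetical dimensionless frequency to be estimated

def estimate_frequency(n_samples):
    t = np.arange(n_samples)
    signal = np.sin(2 * np.pi * f_true * t) + 0.5 * rng.normal(size=n_samples)
    spectrum = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(n_samples)
    return freqs[np.argmax(spectrum[1:]) + 1]  # peak bin, skipping DC

f_short = estimate_frequency(1_000)    # short record: coarse frequency grid
f_long = estimate_frequency(100_000)   # long record: much finer grid

assert abs(f_long - f_true) < abs(f_short - f_true)
```

The simple periodogram peak used here stands in for the full statistical parameter estimation; the point is only that a longer record of the same single system pins down the parameter more tightly.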
I am now reading the papers you and Fra cited and will give details once I have digested them.
I don't understand what the content of Sect. 4.5 has to do with our discussion.
"
There I discuss the case of nonstationary quantum systems.
"
then pointing out where, in the view of the author, this contradicts the standard statistical interpretation a la Born.
"
Please do not confuse contradictions and non-applicability! These are two very different things!
You yourself use Born's rule all the time, since everything is based on taking averages of all kinds defined by ##\langle A \rangle=\mathrm{Tr} \hat{\rho} \hat{A}## (if you use normalized ##\hat{\rho}##'s).
"
Born's rule is not just taking averages of anything!
I use quantum expectations all the time, but Born's rule only when I interpret a quantum expectation in terms of measuring independent and identical prepared systems – which is a necessary requirement for Born's rule to hold.
How do you define the experimental meaning of ##\langle A\rangle## when ##A## is not normal, which is often the case in QFT?
Is the precessing electron "stationary enough", if they are able to keep a single electron precessing for a month?
"
Electrons in accelerators come in large bunches, not as single electrons….
"
What is "stationary or not", is I think also relative. Ie. relative to the speed of information processing of the observer.
"
It is only relative to the speed and accuracy with which reliable measurements can be taken. This is independent of any information processing on the side of the agent.
My understanding of the paper is that it is very close to the view as provided, e.g., by Asher Peres in his book
A. Peres, Quantum Theory: Concepts and Methods, Kluwer
Academic Publishers, New York, Boston, Dordrecht, London,
Moscow (2002).
What's new is the order of presentation, i.e., it starts from the most general case of "weak measurements" (described by POVMs) and then presents the standard-textbook notion of idealized von Neumann filter measurements as a special case, and this makes a lot of sense if you are aiming at a deductive (or even axiomatic) formulation of QT. The only problem seems to be that this view is not what the author wants to express, and I have no idea what the intended understanding is.
Maybe it would help if a concrete measurement were discussed, e.g., the nowadays standard experiment with single ("heralded") photons (e.g., produced with parametric down conversion using a laser and a BBO crystal, using the idler photon as the "herald" and then doing experiments with the signal photon). In my understanding such a "preparation procedure" determines the state, i.e., the statistical operator in the formalism. Then one can do an experiment, e.g., a Mach-Zehnder interferometer with polarizers, phase shifters etc. in the two arms, and then you have photon detectors to do single-photon measurements. It should be possible to describe such a scenario completely with the formalism proposed in the paper, and then to point out where, in the view of the author, this contradicts the standard statistical interpretation a la Born.
I don't understand what the content of Sect. 4.5 has to do with our discussion. I don't see how you can come to the conclusion that the "pragmatic use" of the formalism contradicts the Born rule as the foundation.
"
I didn't claim a contradiction with Born's rule; I claimed its nonapplicability. These are two very different claims.
"
all these pragmatic uses are based on the probabilistic interpretation of the state a la Born.
"
You seem to follow the magic interpretation of quantum mechanics. Whenever you see statistics on measurements done on a quantum system you cast the magic spell "Born's probability interpretation", and whenever you see a calculation involving quantum expectations you wave a magic wand and say "ah, an application of Born's rule". In this way you pave your way through every paper on quantum physics and say with satisfaction at the end, "This paper proves again what I knew for a long time, that the interpretation of quantum mechanics is solely based on the probabilistic interpretation of the state a la Born".
You simply cannot see the difference between the two statements:
- Measurements on an ensemble of independently and identically prepared systems yield values whose statistics is given by Born's rule.
- Measurements on an arbitrary collection of quantum systems yield values whose statistics is given by Born's rule.
The first statement is Born's rule, in the generalized form discussed in my paper.
The second statement (which you repeatedly employed in your argumentation) is an invalid generalization, since the essential hypothesis is missing under which the statement holds. Whenever one invokes Born's rule without having checked that the ensemble involved actually consists of independently and identically prepared systems, one commits a serious scientific error.
It is an error of the same kind as concluding 1=2 from x=2x through division by x, because the assumption x≠0 necessary for the argument was ignored.
"
Also, as I said before, I don't understand how you can say that with a non-stationary source no accuracy is reachable, while the quoted Penning-trap experiments lead to results which are among the most accurate measurements of quantities like the gyro-factor of electrons or, just recently reported even in the popular press, the accurate measurement of the charge-mass ratio of the antiproton.
"
This is not a contradiction since both the gyro-factor of electrons and the charge-mass ratio of the antiproton are not observables in the traditional quantum mechanical sense but constants of Nature.
A constant is stationary and can in principle be arbitrarily well measured, while the arbitrarily accurate measurement of the state of a nonstationary system is in principle impossible. This holds already in classical mechanics, and there is no reason why less predictable quantum mechanical systems should behave otherwise.
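Since this point is claimed to hold already in classical mechanics, it can be illustrated with a toy classical estimation experiment (my own illustration; the names `estimate`, `const`, and `drift` and all numbers are made up): averaging noisy readings of a constant reduces the error like ##1/\sqrt{n}##, while for a nonstationary (drifting) quantity the same averaging acquires a bias that grows with the measurement time instead of vanishing.

```python
import random

random.seed(0)

def estimate(signal, n, noise=1.0):
    """Average n noisy readings of signal(t), taken at t = 0, 1, ..., n-1."""
    return sum(signal(t) + random.gauss(0.0, noise) for t in range(n)) / n

const = lambda t: 5.0            # a stationary quantity (a "constant of Nature")
drift = lambda t: 5.0 + 0.01 * t # a nonstationary, slowly drifting quantity

for n in (100, 10_000):
    print(n, estimate(const, n), estimate(drift, n))
# The estimate of the constant improves with n; the estimate of the
# drifting quantity is off by the mean drift over the measurement
# interval, which grows linearly with n.
```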
"
Nowhere in your paper can I see that there is anything NOT based on Born's rule; although you use the generalization to POVMs, I don't see that this extension is in contradiction to Born's rule. Rather, it's based on it.
"
This is because of your magic practices in conjunction with mixing up "contradiction to" and "not applicable". Both prevent you from seeing what everyone else can see.
This would imply that you cannot describe the results about a particle in a Penning trap with standard quantum theory,
"
This statement is indeed true if you restrict standard quantum theory to mean the formal apparatus plus Born's rule in von Neumann's form. Already the Stern-Gerlach experiment discussed above is a counterexample.
"
but obviously that's successfully done for decades!
"
This is because standard quantum theory was never restricted to a particular interpretation of the formalism. Physicists advancing the scope of applicability of quantum theory were always pragmatic and used whatever they found suitable to match the mathematical quantum formalism to particular experimental situations. This – and not what the introductory textbooks say – was and is the only relevant criterion for the interpretation of quantum mechanics. The textbook version is only a simplified a posteriori rationalization.
This pragmatic approach worked long ago for the Stern-Gerlach experiment. The same pragmatic stance has also worked for decades in the quantum jump and quantum diffusion approaches to nonstationary individual quantum systems, to the extent of leading to a Nobel prize. These approaches simply need more flexibility in the interpretation than Born's rule offers. What is needed is discussed in Section 4.5 of my paper.