Are there signs that any Quantum Interpretation can be proved or disproved?

In summary, according to the experts, decoherence has not made much progress in improving our understanding of the measurement problem.
  • #281
vanhees71 said:
That doesn't mean that there is not the usual meaning of probability concerning the observables on this one atom/photon.
They interpret the results for one single atom. The only statistics they make is about the time series produced by this atom, and they draw conclusions for this atom.
vanhees71 said:
I can use one and the same dice and throw it repeatedly to get the probability for the outcomes.
But only if you cast the die in a random way, so that the pips are independently distributed. This is not the case in a continuous quantum measurement. The latter means that you measure the die many times while it falls and stop the experiment after the die is at rest. Or that after you cast the first die you lift it carefully and put it down again to get the next value of the pips. In both cases the probability for the outcome becomes meaningless.
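The statistical point can be made concrete with a toy simulation (my own illustration, not from the thread): independent casts give meaningful frequencies, while the "lift it carefully" protocol produces a strongly autocorrelated time series whose successive values are far from independent.

```python
import random

random.seed(0)

def independent_casts(n):
    """Fair die, independently cast each time."""
    return [random.randint(1, 6) for _ in range(n)]

def careful_lifts(n, stick=0.9):
    """'Lift the die carefully and put it down again': with probability
    `stick` (an assumed number) the previous face stays up."""
    seq = [random.randint(1, 6)]
    for _ in range(n - 1):
        if random.random() < stick:
            seq.append(seq[-1])          # strongly correlated with the last value
        else:
            seq.append(random.randint(1, 6))
    return seq

def repeat_fraction(seq):
    """Fraction of consecutive pairs showing the same value."""
    pairs = list(zip(seq, seq[1:]))
    return sum(a == b for a, b in pairs) / len(pairs)

iid = independent_casts(100_000)
cor = careful_lifts(100_000)
print(repeat_fraction(iid))  # close to 1/6: consecutive casts nearly independent
print(repeat_fraction(cor))  # above 0.9: the time series is dominated by memory
```

The relative frequencies of the six faces look similar in both runs, but only in the first do they estimate a probability in the frequentist sense.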
vanhees71 said:
Since the predictions of QT are probabilistic you have to do that to be able to gain "enough statistics" to compare your probabilistic predictions with the statistics of the measurement outcomes.
Some predictions are probabilistic, some are not. If you have a single atom in a trap then the raw observations are noisy but not independent realizations of the same quantity; thus the minimal interpretation does not say anything meaningful. Also, the observations depend on the controls applied to the atom - just as when manipulating a die by hand during the measurements.

The observations are therefore not given by Born's rule but by the rules for an externally controlled quantum stochastic process. Being able to do this in a correctly predicted way was worth a Nobel prize.
 
  • #282
Sure, but they use the "time series" (as you call it) to gain statistics. I don't see how this contradicts the very general foundations of QT as formulated by our standard postulates only because I use a single atom to perform the same measurement repeatedly. Since atoms are exactly identical it doesn't make any difference whether I repeat the same experiment on the one individual atom or on several different identical atoms (all this of course stated within standard QT ;-)).
 
  • #283
vanhees71 said:
Sure, but they use the "time series" (as you call it) to gain statistics. I don't see how this contradicts the very general foundations of QT as formulated by our standard postulates only because I use a single atom to perform the same measurement repeatedly.
Because for Born's rule to be instrumentally meaningful you need identically prepared systems. Even in classical statistics, you need many instances to get meaningful frequentist (hence scientifically well-posed) probabilities.
vanhees71 said:
Since atoms are exactly identical it doesn't make any difference whether I repeat the same experiment on the one individual atom or on several different identical atoms.
To prepare a quantum system you always need to make it distinguishable from other, identical systems that are not prepared. Indeed, the atom in an ion trap is not indistinguishable, but is distinguished by the property of being in the trap, which has room for only one atom.

The point is that at each time measured, the ion is in a (at least potentially) different state, so a notion of 'identically prepared' cannot be applied. Though all atoms with the same number of protons, neutrons, and electrons are identical in the sense of Bose statistics, the atoms are not identically prepared! The one you single out in an ion trap is very differently prepared from one in an ideal gas.
 
  • #284
But in this experiment there are many identically prepared systems using one and the same molecules in a trap. I don't see why an ensemble shouldn't be realized with one and the same system. That has nothing to do with quantum mechanics. You have the same in classical statistics. If you take a gas in a container, it also consists of the same molecules all the time, and the thermodynamical quantities (temperature, density, pressure, ...) are understood as averages over many collisions of these same molecules.

I also don't understand why you say the ion is not prepared. On the contrary, it's pretty sharply prepared, being trapped in the trap. The laser exciting it is of course also part of the preparation procedure. Of course the single ion in the trap is not prepared as a thermal state here. I never claimed this.
 
  • #285
vanhees71 said:
That has nothing to do with quantum mechanics.
Exactly. And the irony is that A. Neumaier never denied this. One of his points is that the thermal interpretation solves this issue both for classical thermodynamics and for quantum mechanics. And his attempts to bring this point across to you is what helped me to finally get it. As I wrote: "Your current discussions with A. Neumaier helped me to see more clearly how the thermal interpretations resolves paradoxes I had thought about long before I heard about the thermal interpretation."
 
  • #286
Maybe then you can explain to me what the physical content of this interpretation is. I still don't get it from Neumaier's explanations, which are partially self-contradictory: One time he abandons the standard probabilistic meaning of his "q-expectation values" and then I'm lost, because then there's no physical meaning of the formalism left. Then he tells me again that it's still the same probabilistic meaning as in standard minimally interpreted QT, but then I don't see where's the difference between his and the standard QT interpretation.

Then there is the issue with doing experiments with a single ion in a trap. Neumaier seems to believe these cannot be described within the standard minimal interpretation, but that's not right, because many people in this community of physicists work well with the standard QT, and indeed what's measured here are probabilities or expectation values over many realizations of the experiment. That you use one and the same ion to realize these ensembles is no issue at all.
 
  • #287
vanhees71 said:
Then he tells me again that it's still the same probabilistic meaning as in standard minimally interpreted QT, but then I don't see where's the difference between his and the standard QT interpretation.
For the cases where the standard minimally interpreted QT applies, there is no significant difference between his and the standard QT interpretation.

vanhees71 said:
One time he abandons the standard probabilistic meaning of his "q-expectation values"
The name "q-expectation value" is reserved for the value computed by the model from the specific formula. The interpretation of those values is done separately. One reason for this is that not all values that the model can compute by such formulas will have a direct operational meaning in the real world.

vanhees71 said:
Maybe then you can explain to me what the physical content of this interpretation is.
Well, I wrote an explanation, but have now copied it away for the moment. I am not sure whether A. Neumaier would be happy if I tried, because he has written nicely polished articles and a nicely polished book where he explains it. Any explanation in a few words is bound to misrepresent his views, and additionally I should only speak for myself.

Let me instead remark how I see its relation to QBism: There you have talk about "agents", but what is an agent, and why should you care? In the thermal interpretation, there are no agents, but models are taken seriously on their own. In QBism, the agent uses QM as a cookbook to update his state. A model on the other hand naturally has a state, and doesn't need a cookbook to update it; the consistent evaluation of the state at arbitrary places in space and time is exactly what a model is all about.

But how do engineers and scientists use models to make predictions about the real world? Good question! Try to closely watch what they actually do, and try not to be misled by their words about what they believe that they do!
 
  • #288
Ok, then I have to give up. I also don't understand QBism as a physical interpretation of QT. I also don't see where standard minimally interpreted QT should not apply (except for the unsolved problem of "quantum gravity", but that's not under debate here). If I watch what experimentalists do when they use QT, it's always within the standard probabilistic meaning of the quantum state.
 
  • #289
Isn't it odd that a theory that is almost 100 years old triggers such debates between two people who know it extremely well? It seems to disprove the idea that there is "no problem at all". The meaning of probability is being discussed to this day. In my opinion probability theory, just like geometry, is an indispensable ingredient of modern physical theories.

Is it necessary to emphasize that an ensemble has properties different from those of its members? An ensemble (average) can evolve smoothly and deterministically, but this need not be true for its members.
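A toy simulation (my own sketch; the two-state decay model is an assumed example, not taken from the thread) makes this contrast visible: each member jumps discontinuously at a random moment, yet the ensemble average follows a smooth exponential law.

```python
import math
import random

random.seed(1)

def trajectory(dt=0.01, steps=200):
    """One member: a two-state system decaying 1 -> 0 with probability dt
    per time step (unit decay rate). The jump is abrupt for this member."""
    state, traj = 1, []
    for _ in range(steps):
        if state == 1 and random.random() < dt:
            state = 0                      # discontinuous jump of the member
        traj.append(state)
    return traj

N = 5000
members = [trajectory() for _ in range(N)]
# The ensemble average at each time step evolves smoothly, close to exp(-t).
avg = [sum(col) / N for col in zip(*members)]
print(avg[-1])   # close to exp(-2): the average obeys a deterministic law
```

Every individual trajectory is a step function; only the average over many members reproduces the smooth deterministic decay.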

The purpose of an ensemble is to permit the statistical description of its members. And it is here where (I think) the deficiency of the statistical interpretation lies: It is too vague on what quantum theory is about, which properties the members of the ensembles have or do not have. It is not adequate to talk about quantum "objects" with conflicting properties, or properties that do not exist at all times.

An ensemble need not be physical. It doesn't need to have as many members as there are molecules in a volume of gas. As Willard Gibbs has shown, it is sufficient for our calculations that we can imagine it.
 
  • #290
WernerQH said:
The purpose of an ensemble is to permit the statistical description of its members. And it is here where (I think) the deficiency of the statistical interpretation lies: It is too vague on what quantum theory is about, which properties the members of the ensembles have or do not have. It is not adequate to talk about quantum "objects" with conflicting properties, or properties that do not exist at all times.
The problem is all this philosophical ballast put on QT by Bohr et al. Too much philosophy hides the physics. According to quantum theory the properties of an are described by the quantum state, represented by the statistical operator. There is nothing conflicting here. It uniquely tells you the probabilities to find one of the possible values for any observable when you measure them. Observables take determined values if and only if the system is prepared in a corresponding state, for which with 100% probability these observables take one of their possible values. The formalism also implies that generally it is impossible to prepare the system in a state where all observables take determined values.
 
  • #291
vanhees71 said:
According to quantum theory the properties of an are described by the quantum state, represented by the statistical operator. There is nothing conflicting here.
The formalism is perfect. But I do wonder what properties you were referring to ("properties of an ..."?). Saying that quantum theory is about observables sounds empty to me. Almost like "Classical Mechanics is about differential equations."
 
  • #292
Classical mechanics is about observables too of course. As the name suggests: The state describes the system's properties unambiguously in both classical and quantum mechanics. Only the physical meaning of the (pure) states differs drastically.

In classical mechanics a pure state is a point in phase space. Specifying the point in phase space exactly at time ##t_0## implies that you know the exact point in phase space at any later time, and this implies that you know the precise values of all possible observables at any time ##t>t_0## (assuming you can exactly solve the equations of motion).
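As a minimal sketch (my own example; the unit-mass harmonic oscillator is an assumed system, not one from the thread), this classical determinism can be made explicit: the exact flow maps the phase-space point forward, and every observable has a definite value at every later time.

```python
import math

# Classical pure state = a point (q, p) in phase space. For a unit-mass,
# unit-frequency harmonic oscillator H = (p**2 + q**2) / 2 the exact flow
# is a rotation in phase space.
def evolve(q0, p0, t):
    q = q0 * math.cos(t) + p0 * math.sin(t)
    p = -q0 * math.sin(t) + p0 * math.cos(t)
    return q, p

q0, p0 = 1.0, 0.0              # exactly specified initial state
q, p = evolve(q0, p0, 2.0)     # exactly known state at any later time
energy = 0.5 * (q**2 + p**2)   # any observable is a function of (q, p)
print(energy)                  # 0.5, conserved exactly: a determined value
```

No probabilities enter: specifying the point at ##t_0## fixes every observable at every ##t > t_0##.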

In quantum mechanics a pure state is represented by a statistical operator ##\hat{\rho}## that is a projection operator, ##\hat{\rho}^2=\hat{\rho}##, which means that there's a normalized state vector ##|\psi \rangle## such that ##\hat{\rho}=|\psi \rangle \langle \psi|##. You can consider it determined by a filter measurement of a complete set of compatible observables. This is the most complete preparation possible for the quantum system according to standard quantum theory, but all it implies concerning any observable is the probability for the outcome of a precise measurement, given by Born's rule: if you measure an observable ##O##, the possible values ##o## lie in the spectrum of the self-adjoint operator ##\hat{O}## representing ##O##, and if ##|o,\alpha \rangle## is a complete orthonormal set of the eigenspace ##\mathrm{Eig}(\hat{O},o)##, then
$$P(o)=\sum_{\alpha} \langle o,\alpha|\hat{\rho}|o,\alpha \rangle.$$
This is the only meaning of the formalism: It predicts probabilities for the outcome of measurements of any observable of the system given the preparation of the system, even if the preparation is as complete as it can be according to QT.
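As a numerical illustration (my own sketch; the state and the degenerate observable below are made-up numbers, not from the thread), Born's rule with the sum over an eigenspace basis can be evaluated directly:

```python
import numpy as np

# Hypothetical 3-level example: the observable has eigenvalue 1 once and
# eigenvalue 0 twice, so the sum over the eigenspace basis |o, alpha> matters.
O = np.diag([1.0, 0.0, 0.0])

psi = np.array([1.0, 1.0, 1.0]) / np.sqrt(3)   # normalized state vector
rho = np.outer(psi, psi.conj())                 # rho = |psi><psi|, a projection

evals, evecs = np.linalg.eigh(O)   # columns of evecs: orthonormal eigenbasis

def born_probability(rho, evals, evecs, o, tol=1e-9):
    """P(o) = sum_alpha <o,alpha| rho |o,alpha> over the eigenspace of o."""
    cols = [evecs[:, k] for k in range(len(evals)) if abs(evals[k] - o) < tol]
    return sum((v.conj() @ rho @ v).real for v in cols)

print(born_probability(rho, evals, evecs, 1.0))  # 1/3
print(born_probability(rho, evals, evecs, 0.0))  # 2/3 (degenerate eigenvalue)
```

The two probabilities sum to 1, and the result for the degenerate eigenvalue is independent of which orthonormal basis of the eigenspace the diagonalization happens to return.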

Again the dynamical state evolution is deterministic, i.e., given ##\hat{\rho}(t_0)## the state is defined at any later time ##t## by solving the von Neumann equation (or, for pure states, the Schrödinger equation with the given Hamiltonian). That's the Schrödinger picture, to keep the formulation simple; the same holds of course in any other picture of time evolution, but there the eigenstates evolve with time too. The physical results, i.e., the probabilities, are independent of the choice of picture.
 
  • #293
Thanks, I'm familiar with all this.
vanhees71 said:
According to quantum theory the properties of an [?] are described by the quantum state, represented by the statistical operator.
You probably meant to write properties of a system. John Bell argued that a word like "system" (just like "apparatus" or "measurement") should have no place in a rigorous formulation of quantum theory ("Against Measurement").
 
  • #294
vanhees71 said:
The problem is all this philosophical ballast put on QT by Bohr et al. Too much philosophy hides the physics.
It seems that others didn't think like that. For example, John Bell in "BERTLMANN'S SOCKS AND THE NATURE OF REALITY" (Journal de Physique Colloques, 1981, 42 (C2), pp. C2-41-C2-62):

Fourthly and finally, it may be that Bohr's intuition was right - in that there is no reality below some 'classical' 'macroscopic' level. Then fundamental physical theory would remain fundamentally vague, until concepts like 'macroscopic' could be made sharper than they are today.
 
  • #295
Fra said:
By a similar argument one could argue that the detailed hamiltonian for such system + macroscopic detector is in principle not inferrable by an observer?
physicsworks said:
Why? An observer records particular values of collective coordinates associated with the macroscopic detector. As long as this detector, or any other macroscopic object like a piece of paper on which we wrote these values, continues to exist in a sense that it doesn't explode into elementary particles, we can, with fantastic accuracy, use Bayes' rule of conditioning (on those particular values recorded) to predict probabilities of future observations. If those macroscopic objects which recorded our observations by means of collective coordinates cease to exist in the above mentioned sense, then we must go back and use the previous probability distribution before such conditioning was done.
Because a real observer does not always have enough capacity for information processing to resolve and infer the detailed unitary evolution before the system changes or the observer is forced to interact. This is possible only in the case where the quantum system is a small subsystem and the observer is dominant (and classical). This is why the laws of physics appear timeless only for small subsystems and small timescales. Time evolution cannot generally be inferred to be exactly unitary with certainty, even in principle. In a textbook example the Hamiltonian of a black box may be given, but for a real observer even the Hamiltonian needs to be inferred, not just the initial state. So the observer's "information about laws" and its states (the laws presumably evolve) should somehow be treated by a more equal standard.

/Fredrik
 
  • #296
WernerQH said:
Thanks, I'm familiar with all this.

You probably meant to write properties of a system. John Bell argued that a word like "system" (just like "apparatus" or "measurement") should have no place in a rigorous formulation of quantum theory ("Against Measurement").
Ok, then find an alternative word. I don't know why it is forbidden to use standard language for well-defined things. A system is something we observe of course.
 
  • #297
WernerQH said:
The formalism is perfect. But I do wonder what properties you were referring to ("properties of an ..."?). Saying that quantum theory is about observables sounds empty to me. Almost like "Classical Mechanics is about differential equations."
Put "system" or "object". It's just a typo.
 
  • #298
Lord Jestocost said:
It seems that others didn't think like that. For example, John Bell in "BERTLMANN'S SOCKS AND THE NATURE OF REALITY" (Journal de Physique Colloques, 1981, 42 (C2), pp. C2-41-C2-62):

Fourthly and finally, it may be that Bohr's intuition was right - in that there is no reality below some 'classical' 'macroscopic' level. Then fundamental physical theory would remain fundamentally vague, until concepts like 'macroscopic' could be made sharper than they are today.
What's vague is not clear to me. QT is the most successful theory we have today. One just has to accept that on a fundamental level the values of observables are indeterminate and the probabilistic description provided by quantum states is all there is to "reality". I still don't know what Bell specifically means by "reality". For me it's the objectively observable behavior of Nature.
 
  • #299
vanhees71 said:
Ok, then find an alternative word. [...]
A system is something we observe of course.
An alternative is "event". The problem is of course not the word, but its connotations, and whether or not they are made explicit. The word "object" is almost as bad as "system". We think of an object as existing for at least some interval of time. Many think of "photons" as traveling from a source to the detector, but it is more appropriate to speak of a pair of emission and absorption events, localized in time. QFT is better viewed as a statistical theory of events (points in spacetime). I see the "state" of an object not as something physical, but as a characterization of the correlations between events.
 
  • #300
Sure. That holds for the classical em. field too. What we observe are intensities at some time at some place (quantified by the energy density ##1/2(\vec{E}^2+\vec{B}^2)##). Why this is so follows also from (semiclassical) QT: What we observe are, e.g., electrons emitted in the detector medium via the photoelectric effect. Using the dipole approximation you find out that the emission probability at the given location of an atom/molecule of the detector material is indeed proportional to the energy density of the em. field.
 
  • #301
vanhees71 said:
What we observe are intensities at some time at some place (quantified by the energy density ##1/2(\vec{E}^2+\vec{B}^2)##).
This is not necessarily true, and not just because ##1/2(\vec{E}\cdot\vec{D}+\vec{B}\cdot\vec{H})## is a more appropriate expression for the energy density. If you put a CCD detector in the path of the light, the component of the Poynting vector perpendicular to the detector surface might be a more appropriate description for what you will observe. Or if you use a photoresist of a certain thickness with a given refractive index and absorption coefficient, then just multiplying the energy density with the absorption coefficient and integrating over the volume might not give you the actually absorbed energy. (But I would have to do the detailed computation again. I don't remember the exact details anymore. It was a complicated computation with a simple result. I think it was proportional to the energy density, just the constant of proportionality was slightly surprising.)
 
  • #302
In principle I share what also seems to be one of Neumaier's issues: how to physically motivate the ensemble that lays the basis for the probabilistic framework (whether quantum or classical). Even from my perspective, which prefers an agent perspective, this is central. But the imagine solution in qbism mutation are still different from neumaiers idea.

But when the premises of repeatability, information processing etc. needed to actually construct proper statistics hold - which they do for many situations in a particle lab - then all is fine; but when they do not hold, the soundness of the quantum framework as it stands IMO fails. WHEN this fails seems to be, for example, when you consider quantum cosmology, but also POSSIBLY when one views macroscopic systems in the quantum framework.

I personally think, however, that we need a modification of the theory and not just a reinterpretation.

/Fredrik
 
  • #303
vanhees71 said:
But in this experiment there are many identically prepared systems using one and the same molecules in a trap. I don't see why an ensemble shouldn't be realized with one and the same system.
The latter gives an ensemble of ''many identically prepared systems'' only when you can prepare them identically! But the single ion in a trap is at each time in a different state - determined by the Schrödinger equation for trap and measurement device. Thus its time snapshots are ''many nonidentically prepared systems'', for which your postulates say nothing at all!
vanhees71 said:
I also don't understand why you say the ion is not prepared.
I only said that the ion at different times is not identically prepared! Of course it is prepared, but at different times it is prepared in different states!
vanhees71 said:
it's still the same probabilistic meaning as in standard minimally interpreted QT, but then I don't see where's the difference between his and the standard QT interpretation.
It's the same in those cases where it can be derived, namely when you actually have many measurements on identically prepared systems.

It is not the same otherwise, since it also allows one to derive testable statements for non-identically prepared systems and for single systems, where your interpretation is too minimal to be applicable!
vanhees71 said:
Neumaier seems to believe these cannot be described within the standard minimal interpretation, but that's not right, because many people in this community of physicists work well with the standard QT
They work with standard QT - but not in the minimal interpretation but in the irrefutable handwaving interpretation, where any intuitive argument is sufficient if it leads to the desired result. Your minimal interpretation is a religion like the other interpretations you are so zealously fighting! In the paper
the most prevailing handwaving interpretation and its relation to the measurement problem is described as follows:
David Wallace said:
Orthodox QM, I am suggesting, consists of shifting between two different ways of understanding the quantum state according to context: interpreting quantum mechanics realistically in contexts where interference matters, and probabilistically in contexts where it does not. Obviously this is conceptually unsatisfactory (at least on any remotely realist construal of QM) – it is more a description of a practice than it is a stable interpretation. […] The ad hoc, opportunistic approach that physics takes to the interpretation of the quantum state, and the lack, in physical practice, of a clear and unequivocal understanding of the state – this is the quantum measurement problem.
WernerQH said:
As Willard Gibbs has shown, it is sufficient for our calculations that we can imagine it.
The strange thing is only that nature behaves according to our calculations though these are only about imagined things! This requires an explanation!
vanhees71 said:
According to quantum theory the properties of an are described by the quantum state, represented by the statistical operator. There is nothing conflicting here. It uniquely tells you the probabilities to find one of the possible values for any observable when you measure them.
There are two approaches to the same mathematical calculus:

  1. Expectation via probability: This is the common tradition since 1933 when Kolmogorov showed how to base probability rigorously on measure theory. But Kolmogorov's approach does not work for quantum probabilities, which creates foundational problems.
  2. Probability via expectation: This was the approach of the founders of probability theory, who wanted to know the expected value of games and introduced probabilities as a way of computing these expectations. It fell out of favor only with Kolmogorov's successful axiomatization of probability. However, in 1970, Peter Whittle wrote a book called ''Probability via expectation'' (the third edition from 2012 is still in print), an axiomatization of expectation in which probabilities were a derived concept and Kolmogorov's axioms could be deduced for them.
From the preface of the first edition:
the principal novelty of the present treatment is that the theory is based on an axiomatization of the concept of expectation, rather than that of a probability measure.
Thus it is now a choice of preference where to start. Probability via expectation is free of measure theory and therefore much more accessible, and as the last chapter in the 2012 edition of Whittle's book shows, it naturally accommodates quantum physics - quite unlike Kolmogorov's approach.
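A finite toy model (my own sketch, not taken from Whittle's book) of the "probability via expectation" ordering: the expectation functional is taken as primitive, and probabilities are derived as expectations of indicator functions, with Kolmogorov-style properties coming out as consequences.

```python
import numpy as np

# Finite sample space; the expectation functional E is the primitive notion
# (here realized by an assumed weight vector for a fair die), and
# probabilities are *derived* as expectations of indicators, P(A) = E[1_A].
omega = np.array([1, 2, 3, 4, 5, 6])
w = np.full(6, 1 / 6)                  # weights defining E

def E(f):
    """Expectation of a random variable f, given pointwise on omega."""
    return float(np.dot(w, f(omega)))

def P(event):
    """Probability derived from expectation: P(A) = E[indicator of A]."""
    return E(lambda x: np.isin(x, list(event)).astype(float))

print(P(set(omega)))                           # normalization: P(Omega) = E[1]
print(P({2, 4, 6}))                            # P(even) = 1/2
print(abs(P({1, 2}) + P({3}) - P({1, 2, 3})))  # additivity for disjoint events
```

Nothing in the derivation of `P` used a probability measure; in the quantum case the same ordering lets one start from q-expectations and recover probabilities where they apply.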

My thermal interpretation views quantum mechanics strictly from the probability via expectation point of view and therefore recovers all traditional probabilistic aspects of quantum mechanics, while removing any trace of measurement dependence from the foundations.

vanhees71 said:
One just has to accept that on a fundamental level the values of observables are indeterminate
You 'just' accept it and stop asking further. But many physicists, including great men like 't Hooft and Weinberg, find this 'just' glossing over unexplained territory.

vanhees71 said:
vague is not clear to me.
What is vague in the statistical interpretation is why the measurement of a pointer (a macroscopic quantum system) should give information about the value of a microscopic variable entangled with it. This must be posited as an irreducible postulate in addition to your minimal postulates!
 
Last edited:
  • #304
Fra said:
But the imagine solution in qbism mutation are still different from neumaiers idea.
What do you mean by "imagine solution in qbism mutation"? Do you mean my short analogy was a misrepresentation of qbism? I was actually more worried that neumaier would find it a misrepresentation of his views. All I wanted to highlight is that there are models (with all their associated structure) in his interpretation, but no agents. In qbism on the other hand, agents play a prime role, and models are not mentioned explicitly, even though I admit that between the lines you could find that they are also part of the picture, as a part of the tools an agent can use. So I guess you protest that my "In QBism, the agent uses QM as a cookbook to update his state." was a mutation of "According to QBism, quantum mechanics is a tool anyone can use to evaluate, on the basis of one’s past experience, one’s probabilistic expectations for one’s subsequent experience." Did I guess correctly?
 
  • #305
A. Neumaier said:
The strange thing is only that nature behaves according to our calculations though these are only about imagined things! This requires an explanation!
The explanation is quite simple: when Nature behaves differently we revise our theories. We celebrate the discovery of a new effect when Nature does not conform to our expectation.
 
  • #306
WernerQH said:
The explanation is quite simple: when Nature behaves differently we revise our theories. We celebrate the discovery of a new effect when Nature does not conform to our expectation.
This is an empty explanation since it explains everything and nothing.
 
  • #307
A. Neumaier said:
This is an empty explanation since it explains everything and nothing.
Think about it.
 
  • #308
WernerQH said:
The explanation is quite simple: when Nature behaves differently we revise our theories. We celebrate the discovery of a new effect when Nature does not conform to our expectation.
A. Neumaier said:
This is an empty explanation since it explains everything and nothing.
WernerQH said:
Think about it.
It is a truism that can be applied to everything, no matter what it is, and hence is nothing more than an empty phrase.
 
Last edited:
  • #309
gentzen said:
What do you mean by "imagine solution in qbism mutation"? Do you mean my short analogy was a misrepresentation of qbism?
No, actually I have been offline for some weeks prior to this and didn't follow all new posts; I just commented on a response to my old post.

By qbism mutation I simply mean that, my own interpretation (which colours my comments, and goes hand in hand with thinking we need a revision of the theory, not just reinterpretation) is partly in the qbism direction, but a mutation/variant of the common qbism. If I refrain from thinking about modifications, my other interpretation is close to the minimalist one. But the two interpretations have different purposes, the first one is more a guiding principle as well.

/Fredrik
 
  • #310
gentzen said:
What do you mean by "imagine solution in qbism mutation"? Do you mean my short analogy was a misrepresentation of qbism? I was actually more worried that neumaier would find it a misrepresentation of his views. All I wanted to highlight is that there are models (with all their associated structure) in his interpretation, but no agents. In qbism on the other hand, agents play a prime role, and models are not mentioned explicitly, even though I admit that between the lines you could find that they are also part of the picture, as a part of the tools an agent can use. So I guess you protest that my "In QBism, the agent uses QM as a cookbook to update his state." was a mutation of "According to QBism, quantum mechanics is a tool anyone can use to evaluate, on the basis of one’s past experience, one’s probabilistic expectations for one’s subsequent experience." Did I guess correctly?
In my own view, the microstructure of information processing going on in the agent supposedly REPLACES the ensemble fiction. So the inference machinery rests on an agent-subjective basis. And the challenge for me is rather to explain that the agents will likely interact in a way that they evolve into agreement, which can approximate observer equivalence.

This is why I keep thinking that the current formulation of QM corresponds to a dominant non-limiting "agent" that is essentially the whole environment, so that we could almost think of agents collecting scattering data from the black box, and NOTHING escapes its processing. Or that the whole environment of classical agents reaches an agreement. Then I think that picture can also be isomorphic to an ensemble view. But this link must be broken when the asymmetry does not hold, and the question is - then how can we understand this, and whatever replaces the ensemble and encodes the information about the system?

/Fredrik
 
  • #311
A. Neumaier said:
The latter gives an ensemble of ''many identically prepared systems'' only when you can prepare them identically! But the single ion in a trap is at each time in a different state - determined by the Schrödinger equation for trap and measurement device. Thus its time snapshots are ''many nonidentically prepared systems'', for which your postulates say nothing at all!

I only said that the ion at different times is not identically prepared! Of course it is prepared, but at different times it is prepared in different states!

It's the same in those cases where it can be derived, namely when you actually have many measurements on identically prepared systems.

It is not the same otherwise, since it also allows one to derive testable statements for non-identically prepared systems and for single systems, where your interpretation is too minimal to be applicable!

They work with standard QT - but not in the minimal interpretation, rather in the irrefutable handwaving interpretation, where any intuitive argument is sufficient if it leads to the desired result. Your minimal interpretation is a religion like the other interpretations you are so zealously fighting! In the paper, the most prevailing handwaving interpretation and its relation to the measurement problem is described as follows: The strange thing is only that nature behaves according to our calculations, though these are only about imagined things! This requires an explanation!

There are two approaches to the same mathematical calculus:

  1. Expectation via probability: This is the common tradition since 1933 when Kolmogorov showed how to base probability rigorously on measure theory. But Kolmogorov's approach does not work for quantum probabilities, which creates foundational problems.
  2. Probability via expectation: This was the approach of the founders of probability theory, who wanted to know the expected value of games and introduced probabilities as a way of computing these expectations. It fell out of favor only with Kolmogorov's successful axiomatization of probability. However, in 1970, Peter Whittle wrote a book called ''Probability via expectation'' (the third edition from 2012 is still in print), giving an axiomatization of expectation in which probabilities are a derived concept and Kolmogorov's axioms can be deduced for them.
From the preface of the first edition:

Thus it is now a choice of preference where to start. Probability via expectation is free of measure theory and therefore much more accessible, and as the last chapter in the 2012 edition of Whittle's book shows, it naturally accommodates quantum physics - quite unlike Kolmogorov's approach.
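The probability-via-expectation idea is easy to demonstrate on a finite sample space. The following is my own illustrative sketch (not taken from Whittle's book): the expectation functional E is treated as primitive, probabilities are defined as expectations of indicator functions, P(A) := E[1_A], and Kolmogorov-style properties come out as consequences.

```python
from fractions import Fraction

def make_expectation(weights):
    """Return an expectation functional E from nonnegative weights;
    the 'prior' is hidden entirely inside E."""
    total = sum(weights.values())
    return lambda f: sum(w * f(x) for x, w in weights.items()) / total

def indicator(event):
    """1_A: the random variable that is 1 on the event A, else 0."""
    return lambda x: 1 if x in event else 0

# A fair die, described only through its expectation functional.
E = make_expectation({k: Fraction(1, 6) for k in range(1, 7)})

# Probability is now a *derived* concept: P(A) := E[1_A].
P = lambda event: E(indicator(event))

# Kolmogorov-style properties appear as theorems, not axioms:
omega = set(range(1, 7))
assert P(omega) == 1                    # normalization
assert P({2, 4, 6}) == Fraction(1, 2)   # nonnegativity and the expected value
A, B = {1, 2}, {5, 6}
assert P(A | B) == P(A) + P(B)          # additivity for disjoint events
```

Of course nothing quantum happens in this finite classical toy; the point is only that the direction of definition (expectation first, probability derived) is mathematically unproblematic.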

My thermal interpretation views quantum mechanics strictly from the probability via expectation point of view and therefore recovers all traditional probabilistic aspects of quantum mechanics, while removing any trace of measurement dependence from the foundations.

You 'just' accept it and stop asking further. But many physicists, including great men like 't Hooft and Weinberg, find this 'just' glossing over unexplained territory. What remains vague in the statistical interpretation is why the measurement of a pointer (a macroscopic quantum system) should give information about the value of a microscopic variable entangled with it. This must be posited as an irreducible postulate in addition to your minimal postulates!
The Kolmogorov axioms apply to quantum-mechanical probabilities, of course, only for one given, really feasible experiment, not for all thinkable experiments. It's indeed very important to keep this in mind.

Further, I don't mind whether you derive the Kolmogorov axioms from some other axioms of probability theory. I don't argue about the mathematical foundations at all; that I leave to the mathematicians. What I want to understand is the physical interpretation of your new foundation (I'm not even sure whether it's really a new foundation or just a reformulation of standard QT).

Concerning the preparation of the single atom in the trap, I don't know what you mean. I'm not an expert in this physics, but what I understand from the quoted Nobel citation is that they have one (or a few) ions in a trap and irradiate it with some laser light (i.e., basically a classical em. wave). That's a clear preparation, and it's clearly described by standard quantum theory. What's measured is the distribution of many emitted photons collected over some time of irradiation, and the corresponding photon distribution can be compared to what standard quantum theory predicts. Obviously the agreement is very good.

On the other hand, how would you describe this example in your approach to QT? How do you interpret the observed photon distribution, if not as a probability distribution for the detection of each single photon? Finally, why do you think this is a superior description of the situation compared to standard QT?
 
  • #312
gentzen said:
This is not necessarily true, and not just because ##1/2(\vec{E}\cdot\vec{D}+\vec{B}\cdot\vec{H})## is a more appropriate expression for the energy density. If you put a CCD detector in the path of the light, the component of the Poynting vector perpendicular to the detector surface might be a more appropriate description for what you will observe. Or if you use a photoresist of a certain thickness with a given refractive index and absorption coefficient, then just multiplying the energy density with the absorption coefficient and integrating over the volume might not give you the actually absorbed energy. (But I would have to do the detailed computation again. I don't remember the exact details anymore. It was a complicated computation with a simple result. I think it was proportional to the energy density, just the constant of proportionality was slightly surprising.)
Well, I'm working in Heaviside-Lorentz units, where in vacuum (and I assume that we measure free photons, because there photons have a clear meaning) ##\vec{E}=\vec{D}## and ##\vec{B}=\vec{H}##. Also, it's easy to show that the photon-detection probability when using the photoelectric effect (e.g., with a photomultiplier or a CCD cam) is proportional to the energy density of the em. field. See, e.g., Garrison, Chiao, Quantum Optics.
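For a vacuum plane wave the coincidence of these expressions is easy to check numerically. The following sketch is my own illustration (not from Garrison and Chiao): in Heaviside-Lorentz units with c = 1, a plane wave has |E| = |B|, so the energy density u = (E² + B²)/2 reduces to E² and equals the magnitude of the Poynting vector S = E × B.

```python
import numpy as np

E0, k, omega = 1.0, 2 * np.pi, 2 * np.pi   # amplitude, wavenumber, angular frequency
z = np.linspace(0.0, 1.0, 1000)            # positions along propagation axis
t = 0.3                                    # a fixed time

# x-polarized E field and y-polarized B field of a plane wave moving in +z
phase = np.cos(k * z - omega * t)
E = np.array([E0 * phase, np.zeros_like(z), np.zeros_like(z)])
B = np.array([np.zeros_like(z), E0 * phase, np.zeros_like(z)])

u = 0.5 * (np.sum(E**2, axis=0) + np.sum(B**2, axis=0))   # energy density
S = np.cross(E, B, axis=0)                                # Poynting vector
Sz = S[2]                                                 # component along +z

assert np.allclose(u, E[0]**2)   # u = E^2 since |E| = |B|
assert np.allclose(Sz, u)        # |S| = c*u with c = 1
```

This is only the free-field statement, of course; it says nothing yet about what a particular detector extracts from the field, which is the point debated below.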
 
  • #313
Fra said:
By qbism mutation I simply mean that my own interpretation (which colours my comments, and goes hand in hand with thinking we need a revision of the theory, not just a reinterpretation) is partly in the qbism direction, but a mutation/variant of common qbism. If I refrain from thinking about modifications, my other interpretation is close to the minimalist one. But the two interpretations have different purposes; the first one is more of a guiding principle as well.
So if I understand you correctly, you are quite happy with your other interpretation close to the minimalist one. No need for a revision of the theory from that perspective.

Your variant of the common qbism interpretation, on the other hand, seems to be the way to go for you in the long run. But you are currently not completely happy with it, and it will require more than just a reinterpretation. But this is where you expect to find the real solution to the mysteries. And those mysteries include that the world around us seems to contain lots of agents: other humans definitely count as such agents, animals probably count too, it is just unclear where to stop. But the real mystery is "to explain that the agents likely will interact in a way that they evolve into agreement, which can approximate observer equivalence."
 
  • #314
vanhees71 said:
Also it's easy to show that the photon-detection probability when using the photoelectric effect (e.g., with a photomultiplier or a CCD cam) is proportional to the energy density of the em. field. See, e.g., Garrison, Chiao, Quantum optics.
And still you could be quite misled if you evaluated the energy density in vacuum at a surface (where a CCD cam would be placed) as a way to get a first rough prediction of what a CCD cam would measure. It is a different story if you include a model of your CCD cam, with the actual geometry and optical material parameters at the relevant frequency, in your simulation, and then take the energy density inside the relevant part of the semiconductor. The energy density in vacuum is not a good predictor for that, at least not at frequencies where the dielectric constant of the semiconductor is still significantly different from 1.
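This warning can be illustrated with a minimal normal-incidence toy model (my own sketch; the value n = 4 + 0.1j is made up, roughly silicon-like in the visible): for a semi-infinite absorbing slab, the absorbed fraction of the incident power is 1 − |r|² with the Fresnel coefficient r, while the vacuum-side energy density just outside the surface oscillates strongly with position because of the partial standing wave, so sampling it near the surface is a poor predictor of what the detector absorbs.

```python
import numpy as np

n = 4 + 0.1j                    # assumed complex refractive index of the detector
r = (1 - n) / (1 + n)           # Fresnel reflection coefficient, normal incidence
absorbed_fraction = 1 - abs(r)**2   # semi-infinite slab absorbs all transmitted power

# Vacuum-side field of incident + reflected wave, as a function of the
# distance z (in wavelengths) from the surface:
z = np.linspace(0.0, 1.0, 5)
k = 2 * np.pi
E = np.exp(-1j * k * z) + r * np.exp(1j * k * z)
u_vac = abs(E)**2               # proportional to the vacuum energy density

assert 0 < absorbed_fraction < 1
assert u_vac.max() / u_vac.min() > 1.5   # strongly position dependent
```

The absorbed fraction is a single number fixed by the material, while the vacuum energy density swings between (1 − |r|)² and (1 + |r|)² over a quarter wavelength, which is the mismatch gentzen is pointing at.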
 
  • #315
gentzen said:
So if I understand you correctly, you are quite happy with your other interpretation close to the minimalist one. No need for a revision of the theory from that perspective.
Actually, my interpretation, when taken seriously, more or less suggests that the current theory cannot be the final answer because of its "form"; it simply is not constructible in terms of an intrinsic inference. So a revision of the theory is required. I am driven by some of the open problems with unification, fine tuning, etc. I.e., my interpretation suggests/requires a reconstruction. This is why it's more than a "plain interpretation".

gentzen said:
Your variant of the common qbism interpretation, on the other hand, seems to be the way to go for you in the long run. But you are currently not completely happy with it, and it will require more than just a reinterpretation. But this is where you expect to find the real solution to the mysteries. And those mysteries include that the world around us seems to contain lots of agents: other humans definitely count as such agents, animals probably count too, it is just unclear where to stop. But the real mystery is "to explain that the agents likely will interact in a way that they evolve into agreement, which can approximate observer equivalence."
Yes, this is an open question. I do not have the answers, but I see plenty of clues and hints. My approach is not to start at the complex end, but at the minimally complex end (which means the highest-energy end); there I expect that the options are finite. These - like string theory - are not directly observable, but the idea is that logical construction principles of a sound inference should guide us. If this works out, some low-energy parameters should follow from self-organisation once the agents form more complex systems. In principle, the simplest possible agent would correspond to, say, the ultimate elementary particles or the ultimate quanta of energy.

/Fredrik
 
