The editors of the high-impact journal Nature Physics explain why the field of quantum foundations is important for physics.
https://www.nature.com/articles/s41567-022-01766-x
Demystifier said: "The editors of the high-impact journal Nature Physics explain why the field of quantum foundations is important for physics. https://www.nature.com/articles/s41567-022-01766-x"

I agree that there are loose ends. But I fail to see much progress. Nothing fundamental has changed in the way we do quantum mechanical calculations. Now we have just more no-go theorems, peculiar inequalities, and a proliferation of interpretations, rather than having found the one and only natural interpretation of quantum theory. Most of the work on quantum foundations seems to have explored blind alleys, much like the search for a mechanical model of the ether, which once was an obvious and important field of research. Maxwell himself might have realized that there is no need for an ether if he had had more time. For his contemporaries, electrodynamics (or rather the ether) had some "weird" features that took four decades to get rid of. In the case of quantum theory it seems to be taking significantly longer to remove the superfluous metaphysical baggage.

Although a fresh view can invigorate any field, much of this work also manifests a disregard for the progress that has been made since quantum mechanics was established. The quantum foundations literature is the product of decades of careful thought about the issues involved in understanding and interpreting the physical world. As with any topic, a failure to engage constructively with existing work runs the risk of repeating earlier mistakes and misunderstandings.
(Replying to: "But the particle can at the same time be entangled with another particle located elsewhere such that the outcome of measuring one particle determines the state of the other.")

The particle concept is problematic, and even talk about quantum particles doesn't remove its misleading connotations. My view is that quantum theory makes much more sense if it is formulated without reference to "particles" and "measurements".
WernerQH said: "My view is that quantum theory makes much more sense if it is formulated without reference to 'particles' and 'measurements'."

How do you formulate the Born rule in different bases (say the position basis and the momentum basis) without measurements?
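For definiteness, the two statements of the Born rule in question are

$$p(x) = |\langle x|\psi\rangle|^2 \qquad \text{and} \qquad p(k) = |\langle k|\psi\rangle|^2 ,$$

where ## \langle k|\psi\rangle ## is the Fourier transform of ## \langle x|\psi\rangle ##; the two probability densities refer to mutually incompatible experimental arrangements.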
Demystifier said: "How do you formulate the Born rule in different bases (say the position basis and the momentum basis) without measurements?"

Quantum theory makes predictions about events. It is customary to compute a probability amplitude and take the squared modulus as an extra step. But this dichotomy of unitary evolution and "measurements" is artificial. Schwinger fused them in a coherent (closed time path) formalism a long time ago. In his generalized quantum action principle the Born rule is built in [J. Math. Phys. 2, 407-432 (1960)]:

"[...] if a system is suitably perturbed in a manner that depends upon the time sense, a knowledge of the transformation function referring to a closed time path determines the expectation value of any desired physical quantity [...]"
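In more modern notation, the closed-time-path idea can be written schematically (for a single degree of freedom ## \hat x ##, with independent sources on the forward and backward branches) as

$$Z[J_+,J_-] = \mathrm{Tr}\!\left[\, U_{J_-}(t_f,t_i)^{\dagger}\, U_{J_+}(t_f,t_i)\,\rho \,\right], \qquad U_J(t_f,t_i) = T\exp\!\left[-\frac{i}{\hbar}\int_{t_i}^{t_f} dt\,\big(H - J(t)\,\hat x\big)\right],$$

so that ## Z[J,J] = 1 ## by unitarity, and expectation values and correlation functions follow by functional differentiation with respect to ## J_\pm ##. The trace over the state ## \rho ## is where the Born rule is built in.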
WernerQH said: "But this dichotomy of unitary evolution and 'measurements' is artificial."

Agreed, but when Schwinger talks about a "perturbed" system in your quote above, doesn't he introduce a similar artificial dichotomy between perturbed and unperturbed?
Demystifier said: "Agreed, but when Schwinger talks about a 'perturbed' system in your quote above, doesn't he introduce a similar artificial dichotomy between perturbed and unperturbed?"

It is widely believed that QFT hinges on perturbation theory. But a more general viewpoint is that of the path integral as providing a generating functional from which all quantities of interest (especially correlations) can, in principle, be derived by simple differentiation.
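Schematically, for a single scalar field this is the familiar statement

$$Z[J] = \int \mathcal{D}\phi \; e^{\,i S[\phi] + i\int d^4x\, J(x)\,\phi(x)}, \qquad \langle T\,\phi(x_1)\,\phi(x_2)\rangle = \frac{1}{Z[0]}\left(-i\frac{\delta}{\delta J(x_1)}\right)\!\left(-i\frac{\delta}{\delta J(x_2)}\right) Z[J]\Bigg|_{J=0}.$$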
WernerQH said: "It is widely believed that QFT hinges on perturbation theory. But a more general viewpoint is that of the path integral as providing a generating functional from which all quantities of interest (especially correlations) can, in principle, be derived by simple differentiation."

In general, the correlation functions of observables at different times computed this way do not correspond to the correlations observed by measuring those observables at different times. This is because such a computation does not take into account the change of state induced by measurement. This change of state, known under the names information update, projection, or collapse, is not a unitary transformation. How does Schwinger, or anyone else who uses a path integral formalism and claims that there is nothing special about measurements, account for this?
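To make the point concrete: for two ideal (projective) measurements at times ## t_1 < t_2 ##, the standard rule gives the joint probability

$$p(a, t_1;\, b, t_2) = \mathrm{Tr}\big[\, P_b\, U(t_2,t_1)\, P_a\, U(t_1,t_0)\, \rho\, U^{\dagger}(t_1,t_0)\, P_a\, U^{\dagger}(t_2,t_1)\, P_b \,\big],$$

with the projector ## P_a ## appearing on both sides of ## \rho ##; this is not, in general, what one obtains from an uninterrupted unitary evolution between ## t_0 ## and ## t_2 ##.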
WernerQH said: "I think it is not quantum theory that is weird, but the way it is phrased / taught."

Yes, and I think one could also say that it's nature that is weird. When a well-corroborated theory is viewed from the outdated stance that nature is NOT weird, the theory itself starts to look weird. We can resolve this mismatch if we adopt a weird take on nature. :)
Demystifier said: "[...] such a computation does not take into account the change of state induced by measurement. This change of state, known under the names information update, projection, or collapse, is not a unitary transformation."

I believe you have this backwards. What you are talking about is a theoretician's idealized measurement that has little to do with real experiments. Applied to a harmonic oscillator, a position measurement would imply adding an infinite amount of kinetic energy, if the imagined collapsed state really were an eigenstate of the position operator. Contrariwise, there is little reason to doubt that the correlation function ## \langle x(t)x(0) \rangle ## computed for an unperturbed oscillator is a useful first approximation to measurement results. And of course this first approximation can be improved by including perturbing effects (as done by Schwinger).
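For the record, for the oscillator ground state the unperturbed two-time correlation is the textbook result

$$\langle 0|\,\hat x(t)\,\hat x(0)\,|0\rangle = \frac{\hbar}{2m\omega}\, e^{-i\omega t},$$

perfectly finite and smooth, whereas a state literally collapsed to a position eigenstate would have an infinite spread in kinetic energy.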
vanhees71 said: "Wave-particle duality is no phenomenon but a theoretical concept that has been outdated for about 100 years."

I am not attempting to contribute to this discussion (it is above my competence), but could I perhaps ask where I might read up on this particular point?
geordief said: "I was under the impression that the wave-particle duality was indeed a very strong mainstream scientific explanation and had never come across such a statement as yours up till now."

That's also my view (that it's mainstream), although I wouldn't call it an explanation but rather a mere statement of the fact that the so-called quantum objects are neither particles nor waves, but share properties of both.
WernerQH said: "What you are talking about is a theoretician's idealized measurement that has little to do with real experiments."

It has absolutely nothing to do with idealization. Even realistic POVM measurements involve a non-unitary change of state upon measurement. Or do you disagree?
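For reference, a general measurement is described by operators ## M_k ## with ## \sum_k M_k^{\dagger} M_k = 1 ##; outcome ## k ## occurs with probability ## p_k = \mathrm{Tr}[M_k \rho M_k^{\dagger}] ## and updates the state as

$$\rho \;\longrightarrow\; \frac{M_k\, \rho\, M_k^{\dagger}}{\mathrm{Tr}\big[M_k\, \rho\, M_k^{\dagger}\big]},$$

which is not unitary for any nontrivial ## M_k ##.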
Demystifier said: "It has absolutely nothing to do with idealization. Even realistic POVM measurements involve a non-unitary change of state upon measurement. Or do you disagree?"

POVM measurements are also an idealization. I think it's not useful to talk about the "state" of a "system" at a particular instant of time. (We integrate over all possible states when we use Schrödinger's equation.) What quantum theory predicts is the probabilities of particular sequences of events (or "histories", if you like).
geordief said: "I am not attempting to contribute to this discussion (it is above my competence), but could I perhaps ask where I might read up on this particular point? I was under the impression that the wave-particle duality was indeed a very strong mainstream scientific explanation and had never come across such a statement as yours up till now. Maybe you could give me some kind of a pointer so that I could perhaps understand the point you are making? (Quite probably I have misunderstood ...)"

The wave-particle duality was an important heuristic idea before modern quantum mechanics was discovered in 1925/26, and the main protagonists (among them Einstein and de Broglie) were well aware of the problems with this idea. It was resolved by Born's probabilistic interpretation of quantum states, i.e., that the modulus squared of the Schrödinger wave function is a probability distribution for the position of the particle described by this wave function.
WernerQH said: "That's also my view (that it's mainstream), although I wouldn't call it an explanation but rather a mere statement of the fact that the so-called quantum objects are neither particles nor waves, but share properties of both."

Photons are of course another complication. For photons you cannot even define a position observable to begin with, and the only (very) successful formulation of relativistic QT is local (microcausal) relativistic QFT. For a very clear discussion, see
@vanhees71 is tired of explaining that photons are not "little bullets" (and he's right!), but I think his statement is a little exaggerated.
WernerQH said: "What quantum theory predicts is the probabilities of particular sequences of events (or 'histories', if you like)."

Are you suggesting that the consistent histories interpretation is the way to go? Is that how you avoid a reference to measurement?
Demystifier said: "Are you suggesting that the consistent histories interpretation is the way to go? Is that how you avoid a reference to measurement?"

I'm not a fan of consistent/decoherent histories. My preference is a blend of the statistical and transactional interpretations, and ... GRW. At least the original GRW is (to my mind) rather ad hoc, but I like the idea that only events are real. A short coordinated "wiggling" of electrons in a detector, for example, constitutes what we could call the measurement of the polarization of a photon.
Demystifier said: "Note that the usual formula for the probability of a history involves products of unitary evolution operators and non-unitary projectors, which in the standard interpretation is interpreted in terms of a series of projections induced by measurements at different times."

That's what I dislike about consistent histories: preferred variables / frameworks. It is too non-committal about the actual ontology, about what really happens. In the Schwinger-Keldysh formalism the "series of projections" arises naturally through creation and annihilation operators at adjacent points of the forward and backward time branches. For Schwinger the closed time path (with a "backward" branch) may have been a purely formal device, but I think that the backwards-running time has physical significance: physical events do occur in close pairs, and there are two world-sheets with opposite senses of time tacked together. The forward evolution of a ## \ket{\text{ket}} ## according to ## e^{-iHt} ## is only half the story. The Schwinger-Keldysh formalism includes the other half, and the Born rule as well.
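For reference, the formula mentioned in the quote is the standard one: with Heisenberg-picture projectors ## P_a(t) = U^{\dagger}(t)\, P_a\, U(t) ##, the probability of a history reads

$$p(a_1,\dots,a_n) = \mathrm{Tr}\big[\, P_{a_n}(t_n) \cdots P_{a_1}(t_1)\, \rho\, P_{a_1}(t_1) \cdots P_{a_n}(t_n) \,\big].$$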
WernerQH said: "I'm not a fan of consistent/decoherent histories."

Thanks, I had already feared that Morbert or I would now have to clarify which of the things you mentioned must be interpreted differently from the histories perspective.

WernerQH said: "At least the original GRW is (to my mind) rather ad hoc, but I like the idea that only events are real."

WernerQH said: "... preferred variables / frameworks. It is too non-committal about the actual ontology, about what really happens."

WernerQH said: "For Schwinger the closed time path (with a 'backward' branch) may have been a purely formal device, but I think that the backwards-running time has physical significance: physical events do occur in close pairs, and there are two world-sheets with opposite senses of time tacked together."

Sure, events are nice as ontology, because they provide a clear connection to spacetime, and are somewhat minimal. However, coherent states could be nice too, because they provide a clear connection to classical mechanics, and are "more compatible" with decoherence than purely spacetime-based ontologies.
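(For context: a coherent state ## |\alpha\rangle ##, defined by ## \hat a|\alpha\rangle = \alpha|\alpha\rangle ##, saturates the position-momentum uncertainty relation, and for a harmonic oscillator its expectation values follow the classical trajectory exactly,

$$\langle \hat x(t) \rangle = x_0 \cos\omega t + \frac{p_0}{m\omega}\,\sin\omega t,$$

which is why coherent states are often regarded as the "most classical" quantum states.)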
But those events might still provide a "slight warning" about being too committal to your ontology, just because it makes sense to you and you intuitively like it. The reason why I comment on your choice of events as the preferred ontology is that I recently read the SEP article on Relational Quantum Mechanics (from Winter 2021). I was surprised that it contained statements like "Facts as this one ('the particle is at x at time t') are called 'events', or 'quantum events'. Quantum theory is about events." I have no idea whether my "interpretation of the events" has anything to do with what really happened. (This paper refers to the version of RQM that existed before the introduction of "cross-perspective links" in arXiv:2203.13342, a change that amounts to saying, "Well, we didn't want all those 'relative facts' anyway.")
gentzen said: "However, the reason why I comment on your choice of events as the preferred ontology is that I recently read the SEP article on Relational Quantum Mechanics (from Winter 2021). I was surprised that it contained statements like 'Facts as this one ("the particle is at x at time t") are called "events", or "quantum events". Quantum theory is about events.'"

RQM never had much appeal to me. I find "the particle is at x at time t" too unspecific a characterization of an event. The events that I have in mind are interactions of electrons and photons, for example. Through the fluctuation-dissipation theorem the photon emission rate in a medium can be expressed in terms of a Fourier integral over the current density fluctuations, hinting (to me, at least) at the possibility that the emission of a photon relates in the real world to two closely spaced, short-lived currents.

gentzen said: "But those events might still provide a 'slight warning' about being too committal to your ontology, just because it makes sense to you and you intuitively like it."

gentzen said: "However, coherent states could be nice too, because they provide a clear connection to classical mechanics, and are 'more compatible' with decoherence than purely spacetime-based ontologies."

Thanks for your warning. But I can't make sense of coherent states as something "real"; to me they are more a piece of mathematics (the holomorphic representation). As to the connection to the classical world, I think that the continuous world lines of classical particles have to be replaced in the quantum picture by dotted lines: the incessant interactions of an electron with the electromagnetic field. The world looks classical when you reduce the time resolution.
WernerQH said: "Thanks for your warning. But I can't make sense of coherent states as something 'real'."

Oh, sorry. I should have made clearer what I meant by the warning, and what I meant by an "overcommitted" ontology. Events or flashes (as in GRW) are fine as ontology, as long as there are not "too many" (or as long as they are not "too precise"). Basically, Bohmian mechanics is the only interpretation known to me that has managed to have a maximally refined microscopic ontology without becoming inconsistent or making predictions different from standard QM.
gentzen said: "The flashes are OK as ontology, but you can only have very few of them."

Why is that? I would expect them to occur at the zeptosecond scale (## 10^{-21} {\rm s} ##), corresponding to the electron mass. I looked at two papers by Tumulka, but they were quite different from what I have in mind. What kind of trouble do you anticipate?
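(By the zeptosecond scale I mean the time scale set by the electron's rest energy:

$$\tau \sim \frac{\hbar}{m_e c^2} = \frac{6.58\times 10^{-16}\ \text{eV·s}}{0.511\times 10^{6}\ \text{eV}} \approx 1.3\times 10^{-21}\ \text{s}.)$$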
gentzen said: "Sure, events are nice as ontology, because they provide a clear connection to spacetime, and are somewhat minimal."

I am not sure I understand Werner's line of thinking. I assume that by "event" you both refer to a "4D spacetime" event?
WernerQH said: "Why is that? I would expect them to occur at the zeptosecond scale (## 10^{-21} {\rm s} ##), corresponding to the electron mass. I looked at two papers by Tumulka, but they were quite different from what I have in mind. What kind of trouble do you anticipate?"

If you add a "non-emergent" ontology, then you risk getting a different theory, one making "slightly" different predictions. Why? Probably because an ontology risks making everything too exact, and removing too much vagueness. Bohmian mechanics has its equilibrium distribution to bring back the vagueness. Without something similar, you are often forced to "thin out" your ontology to avoid making experimentally verifiable predictions that differ from standard QM.
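For concreteness, in Bohmian mechanics the particle configurations ## \mathbf{Q}_k ## follow the guidance equation

$$\frac{d\mathbf{Q}_k}{dt} = \frac{\hbar}{m_k}\,\mathrm{Im}\,\frac{\nabla_k \psi}{\psi}\Bigg|_{(\mathbf{Q}_1,\dots,\mathbf{Q}_N)},$$

and it is the quantum equilibrium distribution ## \rho = |\psi|^2 ## that ensures the perfectly sharp trajectories nevertheless reproduce exactly the Born-rule statistics.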
gentzen said: "Probably because an ontology risks making everything too exact, and removing too much vagueness."

We are probably talking past each other. Why should vagueness be an important ingredient? For me, QED is a fundamentally statistical theory. Does randomness constitute enough "vagueness"? I don't want to create a new theory. I think QED is perfect, and I only aim to see it more clearly as a theory of a special kind of point process in spacetime (actually two five-dimensional manifolds glued together; I'm lacking the proper mathematical term).
Demystifier said: "The editors of the high-impact journal Nature Physics explain why the field of quantum foundations is important for physics. https://www.nature.com/articles/s41567-022-01766-x"

So much, said so well, and yet with very little in the way of a bottom line or an action item. What keeps it a clean and elegant, but ultimately useless, discussion is mostly that it doesn't engage with competing proposed resolutions of the open questions, or with what is at stake if one or another of them is adopted.
vanhees71 said: "Quantum theory is NOT weird but the most comprehensive theory about Nature we have today."

Not really a fair criticism.
WernerQH said: "We are probably talking past each other."

That is quite possible. My reaction to your intended interpretation was dominated by your reference to GRW (paired with my limited knowledge of why such objective collapse theories differ from standard QM):
WernerQH said: "My preference is a blend of the statistical and transactional interpretations, and ... GRW. At least the original GRW is (to my mind) rather ad hoc, but I like the idea that only events are real. A short coordinated 'wiggling' of electrons in a detector, for example, constitutes what we could call the measurement of the polarization of a photon."

GRW is known to allow both a "flash ontology" (or "flashes ontology") and a "mass density ontology". (I guess that some objective collapse theories are committed to a "mass density ontology", i.e. theories like the Diósi-Penrose model.) So I guessed that your "events" would be similar to the "flashes" of GRW.
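To spell out the two options: in the mass-density version the ontology is the field

$$m(\mathbf{x},t) = \sum_k m_k\, \langle \psi_t|\, \delta^{(3)}(\mathbf{x} - \hat{\mathbf{Q}}_k)\, |\psi_t\rangle,$$

while in the flash version it is the discrete set of spacetime points at which the spontaneous localizations occur; in the original GRW model these happen at a rate of roughly ## 10^{-16}\ \text{s}^{-1} ## per particle, with a localization width of about ## 10^{-7}\ \text{m} ##.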
The "flashes" for GRW have infinitely accurate spacetime coordinates. For GRW, even their randomness seems to be not enough to get rid of that excess accuracy again. But for Bohmian mechanics, the randomness is sufficient, so an unconditionally true answer to that question seems impossible.WernerQH said:Why should vagueness be an important ingredient? ... Does randomness constitute enough "vagueness"?
WernerQH said: "For me, QED is a fundamentally statistical theory. ... I don't want to create a new theory. I think QED is perfect, and I only aim to see it more clearly as a theory of a special kind of point process in spacetime."

I guess that the mechanism by which QFT gets rid of the excess accuracy (related to "point process in spacetime") is renormalization. You find "points of view" like the following in modern QFT 1 lecture notes:

"Many points of view; one is that it is our own fault: QFT is somewhat idealised; it assumes infinitely extended fields (IR) with infinite spatial resolution (UV); there is no wonder that the theory produces infinities. Still, it is better to stick to idealised assumptions and live with infinities than to use some enormous discrete systems (actual solid state systems).

There is also a physics reason why these infinities can be absorbed somehow: our observable everyday physics should depend neither on how large the universe actually is (IR) nor on how smooth or coarse-grained space is (UV)."
ohwilleke said: "Every shred of physical intuition gained from daily life, some of it hard-wired into our brains, is based on a classical worldview that works for Newtonian mechanics and Maxwell's equations."

Hmmm... I don't think that's how my brain is wired. My brain, and I think yours too, makes use of inferences, abduction, lossy retention, and actions influenced by subjective expectations that have been tuned by evolution, even though we may not think of it that way. These things are IMO in excellent harmony with quantum weirdness, if you only embrace the inside-observer view. So I see good reasons why we WILL ultimately see how natural QM is, and we will look back and wonder how Newton's mechanics ever made sense.
gentzen said: "The trouble with excess accuracy is that the information content of a system with a finite energy in a finite spacetime region should better not be infinite. It is convenient to work with real numbers for mathematical models, but their infinite accuracy forces you to have some mechanism (like 'vagueness') to avoid that their infinite accuracy has experimentally observable consequences."

I don't subscribe to the view that information is physical. It belongs to our theories and models. And we can safely ignore excess digits.
gentzen said: "I guess that the mechanism by which QFT gets rid of the excess accuracy (related to 'point process in spacetime') is renormalization."

Yes, it is useful (no, necessary!) to ignore the scales that are not relevant. We couldn't do physics otherwise.
gentzen said: "You find 'points of view' like the following in modern QFT 1 lecture notes:"

Thank you. Your posts always contain interesting pointers.