Quantum mechanics via quantum tomography

  • #1
A. Neumaier
TL;DR Summary
A new, elementary, and self-contained deductive approach to quantum mechanics
I just finished a new paper,
(later renamed to)
Abstract:

Starting from first principles inspired by quantum tomography rather
than Born's rule, this paper gives a new, elementary, and self-contained
deductive approach to quantum mechanics. A suggestive notion
for what constitutes a quantum detector and for the behavior of its
responses leads to a logically impeccable definition of measurement.
Applications to measurement schemes for optical states, position
measurements and particle tracks demonstrate that this definition is
applicable without any idealization to complex realistic experiments.

The various forms of quantum tomography for quantum states, quantum
detectors, quantum processes, and quantum instruments are discussed.
The traditional dynamical and spectral properties of quantum mechanics
are derived from a continuum limit of quantum processes. In particular,
the Schrödinger equation for the state vector of a pure, nonmixing
quantum system and the Lindblad equation for the density operator of
a mixing quantum system are shown to be consequences of the new
approach. A slight idealization of the measurement process leads to the
notion of quantum fields, whose smeared quantum expectations emerge as
reproducible properties of regions of space accessible to measurements.
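The Lindblad equation mentioned in the abstract is easy to check numerically. The following is a minimal sketch (not the paper's derivation) of a decaying qubit, with an illustrative Hamiltonian, decay rate, and a simple Euler integrator:

```python
import numpy as np

# Single-qubit operators; basis convention: index 0 = excited, index 1 = ground
sz = np.array([[1, 0], [0, -1]], dtype=complex)
sm = np.array([[0, 0], [1, 0]], dtype=complex)   # lowering operator |g><e|

omega, gamma = 1.0, 0.5        # illustrative frequency and decay rate
H = 0.5 * omega * sz

def lindblad_rhs(rho):
    """d(rho)/dt for one jump operator sm: -i[H, rho] plus the dissipator."""
    comm = -1j * (H @ rho - rho @ H)
    L, Ld = sm, sm.conj().T
    diss = gamma * (L @ rho @ Ld - 0.5 * (Ld @ L @ rho + rho @ Ld @ L))
    return comm + diss

rho = np.diag([1.0, 0.0]).astype(complex)        # start in the excited state
dt, steps = 0.001, 4000                          # integrate to t = 4
for _ in range(steps):
    rho = rho + dt * lindblad_rhs(rho)

# The trace is preserved; the excited population decays roughly as exp(-gamma*t)
trace = np.trace(rho).real
excited = rho[0, 0].real
```

Setting `gamma = 0` reduces this to a unitary Schrödinger evolution written in density-matrix form, which is the pure, nonmixing special case the abstract refers to.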

The paper may be viewed as a derivation of my thermal interpretation of quantum physics from first principles.

Now there is a related Insight article, Quantum Physics via Quantum Tomography: A New Approach to Quantum Mechanics, with a more extensive overview of the new approach.
 
  • #2
vanhees71 said:
Are you trying to "derive" the Hilbert-space structure of QM from some notion of POVMs? What is then the underlying axiomatics? I've no idea how to define POVMs without assuming the Hilbert-space structure to begin with. What's the goal of such an endeavor? Is it possible to make it also digestible for physicists, or is it a purely mathematical "l'art pour l'art"?
See the abstract, quoted here. The paper is full of physics, unlike most papers on foundations. It is also written in an attempt to make it digestible to a wide readership. I am sure that you'll learn a lot. Starting with density operators (motivated by properties of polarized light) everything is derived from a single postulate defining the meaning of measurement.
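The starting point mentioned here, density operators motivated by polarized light, can be made concrete via Stokes parameters. A small sketch (the 60% polarization value is purely illustrative, not from the paper):

```python
import numpy as np

# Pauli matrices
sx = np.array([[0, 1], [1, 0]], dtype=complex)
sy = np.array([[0, -1j], [1j, 0]], dtype=complex)
sz = np.array([[1, 0], [0, -1]], dtype=complex)
I2 = np.eye(2, dtype=complex)

def rho_from_stokes(s1, s2, s3):
    """Density matrix of a light beam with normalized Stokes vector (s1, s2, s3)."""
    return 0.5 * (I2 + s1 * sx + s2 * sy + s3 * sz)

rho = rho_from_stokes(0.6, 0.0, 0.0)   # 60% linearly polarized beam

# Eigenvalues are (1 +- |s|)/2, nonnegative iff |s| <= 1, so rho is a valid state
evals = np.linalg.eigvalsh(rho)        # ascending: [0.2, 0.8]
purity = np.trace(rho @ rho).real      # = (1 + |s|^2)/2 = 0.68 here
```

Fully polarized light (|s| = 1) gives a rank-one, pure density matrix; unpolarized light (s = 0) gives the maximally mixed state.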
 
  • #3
My only question before reading the paper is whether I'm allowed to think of the trace rule in the standard way, i.e., to understand ##\mathrm{Tr} (\hat{\rho} \hat{A})## as taking the average in the usual statistical interpretation of the statistical operator. If so, I guess I can understand the physics picture behind your approach. Otherwise I'm again lost even before I start reading.
 
  • #4
vanhees71 said:
My only question before reading the paper is whether I'm allowed to think of the trace rule in the standard way, i.e., to understand ##\mathrm{Tr} (\hat{\rho} \hat{A})## as taking the average in the usual statistical interpretation of the statistical operator.
Approximately yes, but details matter.

The statistical interpretation is not assumed, but the version of it that emerges is discussed in detail in Subsection 2.3.
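Numerically, the two readings of the trace rule agree: ##\mathrm{Tr}(\hat\rho\hat A)## equals the probability-weighted average over the observable's spectrum. A small check with an illustrative mixed state and a random Hermitian observable (nothing here is taken from the paper):

```python
import numpy as np

rng = np.random.default_rng(0)

# An illustrative 3-level mixed state: fixed probabilities on a diagonal basis
p = np.array([0.5, 0.3, 0.2])
rho = np.diag(p).astype(complex)

# A random Hermitian observable A
X = rng.normal(size=(3, 3)) + 1j * rng.normal(size=(3, 3))
A = X + X.conj().T

# Trace rule vs. explicit statistical average over A's spectral decomposition
trace_value = np.trace(rho @ A).real
a, U = np.linalg.eigh(A)               # eigenvalues and eigenvectors of A
born_probs = np.array([(U[:, k].conj() @ rho @ U[:, k]).real for k in range(3)])
statistical_mean = np.sum(born_probs * a)
```

The `born_probs` are the outcome probabilities for the eigenvalues of A, and their weighted mean reproduces the trace value exactly.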
 
  • #5
Is there a simplified version of your theory for "dummies"?
 
  • #6
Demystifier said:
Is there a simplified version of your theory for "dummies"?
Not yet, but I'll try to create something. It will take some time since simplification is hard work...
 
  • #7
A. Neumaier said:
The paper may be viewed as a derivation of my thermal interpretation of quantum physics from first principles.
I'll try to get around to reading more sometime, but from what I remember of your interpretations, I share part of your objections (i.e. the motivations for your quest), namely the physical motivation for the statistical abstractions! For example, the PVM as a premise seems unattainable, given its limitations. But last time I wasn't satisfied with your alternative starting point. Would you say this paper changes this, or would it be more pleasing to someone from the "inference agent" perspective?

/Fredrik
 
  • #8
A. Neumaier said:
Not yet, but I'll try to create something. It will take some time since simplification is hard work...
What would help a lot would be a clear statement of what you call "first principles" and then a derivation of the POVM and the more conventional special case of projective measurements, without too much philosophy in between.
 
  • #9
vanhees71 said:
What would help a lot would be a clear statement of what you call "first principles" and then a derivation of the POVM and the more conventional special case of projective measurements, without too much philosophy in between.
Already in its present form (v2), the main part (Sections 2 to 5.4, pp.8-46) is free of philosophy (if one ignores the quotes, which is easy to do). The POVM is derived in Section 2.2 (Theorem 2.1 on p.12). Projective measurements are of no true relevance in the paper, but are discussed in Section 3.5-3.6 (pp.20-23) to connect to tradition. It is there that eigenvalues make their first appearance.

For the sake of simplicity, first principles may be taken to mean (a small subset of) the physics up to the year 1855 (one year later, in 1856, Maxwell published his first paper on electrodynamics, which led in 1865 to the Maxwell equations), plus linear algebra (matrices, vector spaces, and eigenvalues). Neither the notion of a Hilbert space (completeness) nor the spectral theorem is assumed or used, except in some side discussion.
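For readers unfamiliar with the term, a POVM is just a family of positive semidefinite operators summing to the identity, with response probabilities given by the trace rule. A minimal qubit sketch (the efficiency value `eta` is illustrative, not from the paper):

```python
import numpy as np

sz = np.array([[1, 0], [0, -1]], dtype=complex)
I2 = np.eye(2, dtype=complex)

eta = 0.8  # illustrative detector sharpness; eta = 1 recovers a projective measurement
povm = [0.5 * (I2 + eta * sz), 0.5 * (I2 - eta * sz)]

# POVM conditions: each element positive semidefinite, elements sum to the identity
assert all(np.all(np.linalg.eigvalsh(E) >= -1e-12) for E in povm)
assert np.allclose(sum(povm), I2)

# Response probabilities for a state polarized along +z
rho = np.array([[1, 0], [0, 0]], dtype=complex)
probs = [np.trace(rho @ E).real for E in povm]  # [0.9, 0.1] for eta = 0.8
```

For eta < 1 the elements are not projectors, which is what distinguishes such an unsharp measurement from the textbook projective case.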

Demystifier said:
Is there a simplified version of your theory for "dummies"?
What I'll add is a formula-free Ariadne's thread for dummies.

Fra said:
from what I remember of your interpretations, I share part of your objections (i.e. the motivations for your quest), namely the physical motivation for the statistical abstractions! For example, the PVM as a premise seems unattainable, given its limitations. But last time I wasn't satisfied with your alternative starting point. Would you say this paper changes this, or would it be more pleasing to someone from the "inference agent" perspective?

/Fredrik
The starting point is completely new; the thermal interpretation is derived at the end (starting in Section 5.5). For the inference agent perspective, see Section 5.5.
 
  • #10
A. Neumaier said:
What I'll add is a formula-free Ariadne's thread for dummies.
Looking forward! :smile:
 
  • #11
A. Neumaier said:
The starting point is completely new; the thermal interpretation is derived at the end (starting in Section 5.5). For the inference agent perspective, see Section 5.5.
(As this is complex, I will try to keep myself short and focus on one thing at a time for a change)

About your idea or goal:

Am I understanding you right that your thermal interpretation is supposed to be, if not the ultimate answer, at least in resonance with C. Fuchs's "dream", i.e. the one that you quoted in your paper?

"The dream I see for quantum mechanics is just this. Weed out all the terms that have to do
with gambling commitments, information, knowledge, and belief, and what is left behind will play
the role of Einstein’s manifold
. That is our goal. When we find it, it may be little more than a
miniscule part of quantum theory. But being a clear window into nature, we may start to see sights
through it we could hardly imagine before."
-- C. Fuchs, https://arxiv.org/pdf/quant-ph/0205039.pdf, C. Fuchs

Are you suggesting that your thermal interpretation is the correspondence of the manifold in QM, in line with Fuchs's thinking?

/Fredrik
 
  • #12
Well, the recent developments in quantum theory, which were somewhat triggered by all these quibbles with the foundations, point to the opposite view: The information-theoretical approach to (not only) quantum physics gets more and more confirmed. Most directly this is illustrated by the successful experimental realization of quantum Maxwell demons, with precisely the "solution" of the corresponding apparent paradox provided very early by Szilard and Landauer, while Einstein's, Schrödinger's and Fuchs's dream of a unified classical field description seems not to come true.
 
  • #13
vanhees71 said:
Well, the recent developments in quantum theory, which were somewhat triggered by all these quibbles with the foundations, point to the opposite view: The information-theoretical approach to (not only) quantum physics gets more and more confirmed. Most directly this is illustrated by the successful experimental realization of quantum Maxwell demons, with precisely the "solution" of the corresponding apparent paradox provided very early by Szilard and Landauer, while Einstein's, Schrödinger's and Fuchs's dream of a unified classical field description seems not to come true.
Is this a comment to something I wrote?

(If so, I am advocating an information theoretic approach. I just question at which level and where the information is, vs. should be, encoded, and whether information itself should be understood as observer independent or not.)

/Fredrik
 
  • #14
Yes, this is referring to your post #11.
 
  • #15
vanhees71 said:
Einstein's, Schrödinger's and Fuchs's dream of a unified classical field description seems not to come true.
Aha, I now see why you wrote what you did! You lumped these together, while I didn't. Then your comment makes more sense to me.

First I will admit that I haven't read all of Fuchs's papers in detail, but my interpretation of his "dream" was not a classical thing. Even what kind of animal it IS remains to be found out. I do get the hint from Fuchs's writings that he seeks a more closed solution, and I do not share that view, but I didn't mention it, as things always get unreadable when thinking of too many things at once, which unavoidably causes misunderstandings.

I took it to mean an analogy, but even in general relativity the manifold is evolving, so one has a similar issue there. And the evolution depends on the distribution of mass and energy, and where there is mass, there are supposedly agents. But in this sense, my take on this is that the correspondence of the "manifold" is the laws of physics, and I think such an object is not comparable to a classical manifold, except in the sense that it constitutes the agent-invariants and an equivalence class.

At least what I have been reading about Einstein's dream I understand as something more classical than what I think we will find that we need, and what I thought Fuchs also meant, but I could be mistaken about Fuchs.

/Fredrik
 
  • #16
For me the problem is that I don't understand which problem they want to solve. For me, Bell's work and the subsequent experimental confirmation gave the answer that nature follows (relativistic) Q(F)T rather than local deterministic hidden-variable theories à la EPR. This case at least is closed, and we have to accept QT as the best theory we have today.

The really remaining problem is not a foundational, philosophical problem but a physical one, i.e., to find a quantum theory of gravitation with the hope to eliminate the inevitable singularities of the classical theory (General Relativity) we have now.
 
  • #17
vanhees71 said:
For me the problem is that I don't understand which problem they want to solve. For me, Bell's work and the subsequent experimental confirmation gave the answer that nature follows (relativistic) Q(F)T rather than local deterministic hidden-variable theories à la EPR. This case at least is closed, and we have to accept QT as the best theory we have today.
The point of Bell's theorem was a philosophical one - to show that Einstein causality has to be given up if one wants to preserve a realistic worldview. Falling back to the Lorentz ether is fine, it is what solves all the quantum foundational problems (as shown by the realist interpretations), and the costs are minimal. Einstein-causal hidden variables have to be given up, and there is no need for them once there are classical-causal hidden variables.

The problem solved is the preservation of realism and causality (strong, reasonable causality, with the common cause principle, not weak signal causality). Both are strong guiding principles for science in general; one can interpret them as part of the logic of science. Rejecting them has catastrophic consequences similar to those of superdeterminism: observing correlations becomes meaningless, because correlations no longer require realistic explanations, and we can go back to astrology.

This problem remains invisible to the opponents simply because they apply those rules as usual everywhere except in discussions about the consequences of Bell's theorem.
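The quantitative content behind this debate is short enough to state in code. A sketch of the CHSH quantity, using the singlet-state correlation E(a, b) = -cos(a - b) and the standard angle choices:

```python
import numpy as np

def E(a, b):
    """Singlet-state correlation for spin measurements along angles a and b."""
    return -np.cos(a - b)

# Standard CHSH angle choices (radians)
a, a2 = 0.0, np.pi / 2
b, b2 = np.pi / 4, 3 * np.pi / 4

S = E(a, b) - E(a, b2) + E(a2, b) + E(a2, b2)
# |S| <= 2 for any local deterministic hidden-variable model (Bell/CHSH bound),
# while the quantum prediction with these angles gives |S| = 2*sqrt(2)
```

The gap between the bound 2 and the quantum value 2√2 is exactly what the experiments decided.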
vanhees71 said:
The really remaining problem is not a foundational, philosophical problem but a physical one, i.e., to find a quantum theory of gravitation with the hope to eliminate the inevitable singularities of the classical theory (General Relativity) we have now.
We know how to quantize GR as an effective field theory, based on the field-theoretic version of GR (which has a flat background). We know how to regularize field theories in such a way that they become finite - with lattice regularizations. So I don't know where the problem is.

Except for the religious metaphysical point that a background is evil.
 
  • #18
Sunil said:
The problem solved is the preservation of realism and causality (strong, reasonable causality, with the common cause principle, not weak signal causality). Both are strong guiding principles for science in general; one can interpret them as part of the logic of science. Rejecting them has catastrophic consequences similar to those of superdeterminism: observing correlations becomes meaningless, because correlations no longer require realistic explanations, and we can go back to astrology.

Is it catastrophic though? Most physicists happily use conventional QM/QFT as their ab initio bedrock, and research is flourishing.
 
  • #19
Morbert said:
Is it catastrophic though? Most physicists happily use conventional QM/QFT as their ab initio bedrock, and research is flourishing.
I explained why. In everyday life, as well as in their everyday research, they continue to use realism and causality without even thinking about the justification for this. They don't even recognize that they question this justification in discussions about Bell's inequality.

Questioning common sense is in general quite harmless, because you don't have to be consistent in your real life. You can question common sense and use it at the same time. You contradict yourself in this way? So what? You use common sense, which is not dangerous at all, and you use some anti-common-sense arguments in some philosophical discussions, which have no consequences anyway. So such self-contradictions are harmless.

Those who would care about consistency and, after rejecting common sense in a philosophical discussion, also reject it in reality would live very dangerously, with danger not only for themselves but also for their environment. But I doubt there will be many scientists among them.
 
  • #20
Sunil said:
I explained why. In everyday life, as well as in their everyday research, they continue to use realism and causality without even thinking about the justification for this. They don't even recognize that they question this justification in discussions about Bell's inequality.
I'm not sure I understand this. Are you saying the average researcher implicitly interprets QM in a realist manner, until the moment Bell inequalities come up?
 
  • #21
I think we all have different motivations. My core motivation and vision are radically different from those of Bohmians, and probably different from Sunil's, etc.

vanhees71 said:
For me, Bell's work and the subsequent experimental confirmation gave the answer that nature follows (relativistic) Q(F)T rather than local deterministic hidden-variable theories à la EPR. This case at least is closed, and we have to accept QT as the best theory we have today.
We agree that QM weirdness can't be explained by physicists' ignorance of hidden variables, as per the Bell ansatz. The case is closed here in my view as well.

The problem with QM isn't the weirdness; I think the world is even more weird when we understand it better. This becomes unavoidable once one considers unification. QM is robust because of the classical background, but we can't keep clinging to this fictional reference. It creates problems with renormalisation and fine tuning.

vanhees71 said:
For me the problem is that I don't understand which problem they want to solve.

vanhees71 said:
The really remaining problem is not a foundational, philosophical problem but a physical one, i.e., to find a quantum theory of gravitation with the hope to eliminate the inevitable singularities of the classical theory (General Relativity) we have now.
This is also my main motivation, plus the question of unification. I am not into philosophy in itself (I never studied it as a separate subject and probably won't), or for its own sake. But if you consider the search for a coherent framework, rather than a patchwork of separate theories, to be philosophy, then I am guilty. I expect a theory to have a sound, coherent line of reasoning that is a bit more than just empirics (after all, that is what I think is the task of a theorist). Also, I insist that the philosophy of science and knowledge is inseparable from foundational questions in modern physics, especially as we are talking about information-theoretic perspectives. I see no reason to be ashamed of bringing up the perhaps philosophical questions that are relevant for an information-theoretic reconstruction of the foundations of physics. This is especially motivated by the hints of very deep and interesting connections we already see in current theories; I am convinced that this is not a coincidence, and I am confident that there IS a much deeper and more satisfactory explanation of how things relate to each other than what is currently in textbooks.

/Fredrik
 
  • #22
Sunil said:
The point of Bell's theorem was a philosophical one - to show that Einstein causality has to be given up if one wants to preserve a realistic worldview.
The problem for @vanhees71 is that he can't understand what "realistic" is supposed to mean in this context. That's because it is a philosophical term (meaning more or less the same as "ontic").
 
  • #23
Sunil said:
The point of Bell's theorem was a philosophical one - to show that Einstein causality has to be given up if one wants to preserve a realistic worldview. Falling back to the Lorentz ether is fine, it is what solves all the quantum foundational problems (as shown by the realist interpretations), and the costs are minimal. Einstein-causal hidden variables have to be given up, and there is no need for them once there are classical-causal hidden variables.
For me the breakthrough of Bell's work was that he transformed an until then vague philosophical question with apparently no scientific meaning into a well-defined, mathematically formulated quantitative prediction of a very general class of "local deterministic hidden-variable theories", which contradicts the predictions of quantum mechanics concerning entangled states. The point is that no such hidden-variable theory à la Einstein describes the observations correctly, while local (!) relativistic QFT does. So there is no need for a new theory beyond Q(F)T concerning this issue of "inseparability", which was the real quibble Einstein had with QT (and not the very vague and unclear elaboration of the EPR paper, where the argument is "swamped by erudition" [freely quoted from Einstein]).
Sunil said:
The problem solved is preservation of realism and causality (strong, reasonable causality, with common cause principle, not weak signal causality). Both are strong guiding principles for science in general, one can interpret them as part of the logic of science, rejecting them has similar catastrophic properties as superdeterminism, observing correlations becomes meaningless because correlations no longer require realistic explanations, and we can go back to astrology.
I don't see which problem was solved, because there was none to begin with, because standard Q(F)T was the correct description all the time. As any physical theory, Q(F)T is of course causal, and it is causal by construction. That's why only massive and massless representations of the Poincaré group are realized, and the causality principle is implemented via the microcausality principle underlying local relativistic QFTs. The correlations described by entanglement, including correlations between space-like separated measurement events, are consistent with local relativistic QFTs, so there is no problem with relativistic causality.
Sunil said:
This problem remains invisible to the opponents simply because they apply those rules as usual everywhere except in discussions about the consequences of Bell's theorem.
I use only the minimal standard interpretation of local relativistic QFT to discuss the consequences of Bell's theorem. That's the beauty of this result: everything is consistent within standard relativistic QFT, and everything observed is correctly predicted by this very theory.
Sunil said:
We know how to quantize GR as an effective field theory, based on the field-theoretic version of GR (which has a flat background). We know how to regularize field theories in such a way that they become finite - with lattice regularizations. So I don't know where the problem is.

Except for the religious metaphysical point that a background is evil.
I'm not an expert on quantum gravity, so I can't answer this question, but I think a flat background is quite a restriction, given that our cosmological standard model is based on the FLRW and not a flat background spacetime.
 
  • #24
Fra said:
Are you suggesting that your thermal interpretation is the correspondence of the manifold in QM, in line with Fuchs thinking?
Yes, though what I found is not in the direction in which Fuchs seemed to be looking in his paper. And of course, the word "manifold" is just a metaphor for whatever survives after the observer-dependence has been removed.

vanhees71 said:
The information-theoretical approach to (not only) quantum physics gets more and more confirmed.
My view preserves all of quantum information theory while removing its subjective aspects.
 
  • #25
If I understood you right before, regarding your interpretation, one of your motivations was to question the physical justification of the statistical tools used, by an actual observer, to produce expectations for single outcomes?

A. Neumaier @ arxiv.org/abs/2110.05294v2 said:
Thus by shifting the attention from the microscopic structure to the experimentally accessible
macroscopic equipment (sources, detectors, filters, and instruments) we have gotten rid of all
potentially subjective elements of quantum theory.
Let me know if I understand your conceptual solution approximately right:

It seems to be to essentially consider the whole classical environment (with all the macroscopic pointers etc.) as an objective information sink, from which any quantum tomography takes place.

And all the information available in the whole environment is the basis for, and supposedly justifies, the statistical expectations.

As for how an observer (that obviously does not have ALL this information) is justified: it can in principle inform itself from the environment via "classical communication", which is a rich "heat bath" of information; maybe that is the origin of your name "thermal interpretation"?

Note: Relying on the classical environment seems reasonably in line with the traditional interpretation of Bohr (more so than that of Heisenberg), except that quantum tomography aims to make things a bit more explicit. In the old writings of Bohr, I read the essence to be the same, with objectivity ensured by consensus forming in the "classical sink", except that there it was mainly philosophy; quantum tomography tries to make this more mathematical.

/Fredrik
 
  • #26
Morbert said:
I'm not sure I understand this. Are you saying the average researcher implicitly interprets QM in a realist manner, until the moment Bell inequalities comes up?
The average researcher interprets QM as "shut up and calculate", without thinking too much about philosophy. Only a small minority cares about Bell. And those who care, and follow the mainstream (that means giving up realism and causality with the common cause principle), forget about this outside quantum theory. Inside quantum theory they may restrict themselves to Copenhagen and shut up and calculate. By the way, Copenhagen can be seen as a recommendation for how to do it: there is the classical part, and in this classical part everything is fine, you can use trajectories, realism, the common cause principle and so on. All the mystery is localized in the quantum part.
vanhees71 said:
For me the breakthrough of Bell's work was that he transformed an until then vague philosophical question with apparently no scientific meaning into a well-defined, mathematically formulated quantitative prediction of a very general class of "local deterministic hidden-variable theories", which contradicts the predictions of quantum mechanics concerning entangled states. The point is that no such hidden-variable theory à la Einstein describes the observations correctly, while local (!) relativistic QFT does.
That was fine; it destroyed a wrong path to a realistic causal interpretation.
vanhees71 said:
I don't see which problem was solved, because there was none to begin with, because standard Q(F)T was the correct description all the time. As any physical theory, Q(F)T is of course causal, and it is causal by construction.
Emphasis mine. That you call QFT "local" and "causal" is self-deception. You have thrown away the most important principle of causality, the common cause principle. Naming the cheap remains "causality" suggests to you that what you have thrown away is not worth much, given that this is just a (slightly) weaker form of causality (for those who have at least recognized that something has been thrown away). Problem-solving by extending the meaning of "causality".

It is not your personal self-deception but one shared by the mainstream; your post was a nice example to demonstrate how this self-deception works.
vanhees71 said:
I'm not an expert on quantum gravity, so I can't answer this question, but I think a flat background is quite a restriction, given that our cosmological standard model is based on the FLRW and not a flat background spacetime.
In Schmelzer's Lorentz ether the preferred coordinates are harmonic. In a spatially flat FLRW universe the comoving Cartesian spatial coordinates are harmonic. No problem. Ok, this works only for a flat universe. But our universe is flat.
vanhees71 said:
That's why only massive and massless representations of the Poincare group are realized and the causality principle is realized using the microcausality principle underlying local relativistic QFTs. The correlations, including correlations between space-like separated measurement events, described by entanglement, are consistent with local relativistic QFTs, so that there is no problem with relativistic causality.
I use only the minimal standard interpretation of local relativistic QFT to discuss the consequences of Bell's theorem. That's the beauty of this result: everything is consistent within standard relativistic QFT, and everything observed is correctly predicted by this very theory.
The realistic and causal interpretations of QFT which use the Lorentz ether can make the same claim. They have to use the EEP, but in Schmelzer's Lorentz ether the EEP is derived, it follows there from the action equals reaction symmetry: The equation for the preferred coordinates, the harmonic condition, depends only on the gravitational field, not on matter fields. So the equation for the matter fields does not depend on the preferred coordinates.
 
  • #27
Sunil said:
We know how to quantize GR as an effective field theory, based on the field-theoretic version of GR (which has a flat background). We know how to regularize field theories in such a way that they become finite - with lattice regularizations. So I don't know where the problem is.

Except for the religious metaphysical point that a background is evil.
While I can connect to some other things you write, such as the desire for a causal mechanism (hidden or not), here we disagree.

The issue here for me is much more than just trying to find a way to cure divergences in formal expressions, when applying some quantization prescription to the gravitational field.

The expected unification energy is much higher than the GUT scale, so the QG problem is not only about the low-energy side with cosmology; it also has to do with unification and problems of fine tuning. I expect to find more explanatory value if we can find out how to reconstruct interactions. Perhaps some may call this metaphysics, but in my view, at the TOE scale, the prerequisite, namely sufficiently complex and stable agents (matter) that can encode the relations that will emerge as spacetime, does not exist.

Thus, in my view, any program that simply presumes the existence of 4D spacetime will have limited the explanatory value that can possibly come out of such a research program. From my perspective, it seems conceptually clear that the universal attraction that is responsible for spacetime and inertia is unlikely to be well understood if we start from parts that beg explanation.

/Fredrik
 
  • #28
Fra said:
Thus, in my view, any program that simply presumes the existence of 4D spacetime will have limited the explanatory value that can possibly come out of such a research program. From my perspective, it seems conceptually clear that the universal attraction that is responsible for spacetime and inertia is unlikely to be well understood if we start from parts that beg explanation.
It is completely legitimate to develop theories which do not presuppose a 4D background. If they succeed in explaining more, for example, explaining why our spacetime is 3+1 dimensional, fine.

But it is also completely legitimate to start with a classical fixed background of absolute space and time. This does in no way prevent explanatory power for the gravitational field. In Schmelzer's general Lorentz ether, the gravitational field is defined by the Noether stress-energy-momentum tensor of the background. This can be seen as an explanation of the universality of gravity - everything influences, with its own energy and momentum, the stress-energy-momentum tensor of everything.

So there has to be some competition over which approaches have more explanatory power - those which accept relativistic symmetry as fundamental, or those which presuppose something else and derive relativistic symmetry only as approximate and emergent - instead of rejecting some approaches simply based on metaphysical prejudices about the fundamentality of relativistic symmetry.
 
  • Like
Likes gentzen and Fra
  • #29
Sunil said:
The average researcher interprets QM as "shut up and calculate", without thinking too much about philosophy. [...] Inside quantum theory they may restrict themselves to Copenhagen and shut up and calculate.

This is a conventional position, but I sometimes wonder how true this is, and I think there is a happy medium between "shut up and calculate" and a thoroughly intelligible primitive ontology, at least in my field.

I measure the electrical characteristics of field-effect transistors, and predict these measurements with quantum theory. But quantum theory does not merely let me compute the expected electrical characteristics; it lets me explain the electrical characteristics in terms of quantities like local densities of states, phonon-mediated tunnelling rates, position-resolved current densities, etc. I build a picture in my head of events taking place in the FET, and this picture informs device design. (I can understand how, e.g., a specific LDoS profile will give rise to favourable electrical characteristics, and I can select materials or geometries that will reproduce this LDoS.)

I "shut up and calculate" these unobserved quantities in the sense that I do not ground them in some primitive ontology of beables. But they still do explanatory legwork, and ultimately if we want to argue that a formal realism can contribute to a field, we have to show that an array of beables is a more useful explanatory toolset.
 
Last edited:
  • Like
Likes gentzen and vanhees71
  • #30
Morbert said:
I "shut up and calculate" these unobserved quantities in the sense that I do not ground them in some primitive ontology of beables. But they still do explanatory legwork, and ultimately if we want to argue that a formal realism can contribute to a field, we have to show that an array of beables is a more useful explanatory toolset.
I understand this point, and your description of how you behave fits with what I think happens. I would describe it as "you use your common sense realism and don't care about consistency with some 'formal realism'."

But I think it is wrong to ask of that 'formal realism' some separate contribution. It contributes consistency, and consistency should have a very high value - it is already part of logic. But it is certainly not a typical feature of human reasoning. It is part of very high-level reasoning and does not have a deep base in evolution (if something happens, it is better to react following the deep, animal gut instincts immediately instead of deciding whether such behavior would really be consistent with your philosophical ideas). But in mathematics, consistency is basic and fundamental, and in physics it should be too.

Moderator's note: This post has been edited to remove a portion that has been spun off into a separate thread:
https://www.physicsforums.com/threads/bayesian-statistics-in-science.1008801/
 
Last edited by a moderator:
  • Like
Likes Fra
  • #31
Fra said:
If I understood you right earlier regarding your interpretation, one of your motivations was to question the physical justification of the statistical tools used, by an actual observer, to produce expectations for a single outcome?
The statistical tools are impeccable. Instead I question the interpretation of the state.
Fra said:
Let me know if I understand you approximately right - your conceptual solution seems to be to essentially consider the whole classical environment (with all the macroscopic pointers etc.) as an objective information sink, from which any quantum tomography takes place.
No. I am not talking at all about information, except to contrast it with the subjective approach.
 
  • #32
A. Neumaier said:
The statistical tools are impeccable. Instead I question the interpretation of the state.
Ok, I certainly don't mean that the mathematical tools were questioned within the field of mathematics. In my phrasing, the "justification of the tools" is the "interpretation", I guess, i.e. justifying why those tools are the RIGHT tools for the physical situation?

A. Neumaier said:
No. I am not talking at all about information, except to contrast it with the subjective approach.
In your terminology, does reading off experimentally accessible macroscopic equipment (sources, detectors, filters, and instruments) not count as gathering "information"? If we label it "objective, classical information" rather than "quantum information", does that make it better?

/Fredrik
 
Last edited:
  • #33
Fra said:
Ok, I certainly don't mean that the mathematical tools were questioned within the field of mathematics. In my phrasing, the "justification of the tools" is the "interpretation", I guess, i.e. justifying why those tools are the RIGHT tools for the physical situation?
Statistics is indeed the right tool for handling aleatoric uncertainty in physics.
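This claim can be made concrete with a minimal sketch (in Python; the probability, sample size, and function names are illustrative choices of mine, not from the thread): an irreducibly random detection outcome is summarized by a relative frequency, and elementary statistics quantifies how reliable that frequency is.

```python
import math
import random

random.seed(0)  # reproducible pseudo-randomness for the illustration

def estimate_probability(p_true, n):
    """Estimate an outcome probability from n independent trials.

    The relative frequency p_hat is the frequentist handle on the
    aleatoric (irreducible, chance-like) uncertainty of each trial;
    its standard error sqrt(p_hat*(1-p_hat)/n) quantifies how well
    p_hat pins down the underlying probability.
    """
    hits = sum(random.random() < p_true for _ in range(n))
    p_hat = hits / n
    se = math.sqrt(p_hat * (1 - p_hat) / n)
    return p_hat, se

# With 10,000 trials the standard error is already below 0.5%:
p_hat, se = estimate_probability(0.3, 10_000)
```

The single trial remains genuinely random; only the ensemble-level frequency and its error bar are predicted, which is exactly the sense in which statistics handles aleatoric uncertainty.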
Fra said:
In your terminology, does reading off experimentally accessible macroscopic equipment (sources, detectors, filters, and instruments) not count as gathering "information"? If we label it "objective, classical information" rather than "quantum information", does that make it better?
It counts as gathering information in the colloquial sense, but not in the sense of information theory, which quantifies missing information rather than knowledge.
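The point that information theory quantifies missing information can be illustrated with a minimal sketch (the distributions below are illustrative examples of my own): the Shannon entropy of an outcome distribution is the number of bits still missing before the outcome is known.

```python
import math

def shannon_entropy(probs):
    """Shannon entropy in bits: the information still *missing*
    about an outcome drawn from the distribution `probs`."""
    return sum(p * math.log2(1 / p) for p in probs if p > 0)

print(shannon_entropy([0.5, 0.5]))  # fair coin: maximal missing information for two outcomes
print(shannon_entropy([1.0]))       # certain outcome: nothing is missing
```

A certain outcome has zero entropy even though one "knows" a great deal about it, which is exactly the contrast drawn above between missing information and knowledge.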

I don't know what 'quantum information' is. Quantum information theory is not about some mysterious 'quantum information' but about quantum state estimation and manipulation.
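A minimal sketch of what quantum state estimation means for a single qubit (linear-inversion tomography; the input expectation values are hypothetical, standing in for measured averages):

```python
import numpy as np

# Pauli matrices
I2 = np.eye(2, dtype=complex)
X = np.array([[0, 1], [1, 0]], dtype=complex)
Y = np.array([[0, -1j], [1j, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)

def reconstruct_qubit(ex, ey, ez):
    """Linear-inversion tomography: recover the density matrix from
    the three Pauli expectation values <X>, <Y>, <Z> via
    rho = (I + <X> X + <Y> Y + <Z> Z) / 2."""
    return (I2 + ex * X + ey * Y + ez * Z) / 2

# Hypothetical measured averages consistent with the state |+>:
rho = reconstruct_qubit(1.0, 0.0, 0.0)
```

In practice the expectation values themselves are relative frequencies over many measured copies, so the reconstructed rho inherits ordinary statistical error bars; nothing beyond state estimation is invoked.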
 
  • #34
A. Neumaier said:
Statistics is indeed the right tool for handling aleatoric uncertainty in physics.
Here we may reach difficulties, as I think we have taken on different tasks. Which statistics? The task of general inference (including abducing the best laws) is much more than just "objective frequentist statistics" in my view. One has to ponder things like information-processing resource constraints, as no real observer/agent can simply store or access all raw data in some faithful form, and from that make "objective" inferences. The subjectivity of inferences is not just due to having incomplete data; it may also be due to gaming elements, such as randomly trying certain rules. Also, encoding data may involve lossy compressions, and the lossy compressions have to be chosen by what is likely to be most beneficial for future predictions. In this soup, all the things that I think you consider fixed in the model must evolve. I agree that fitting too many parameters at once will not work; this is part of the information-constraint considerations for me. It's also why I think neither the state space nor the laws (such as Hamiltonians) can be treated as static, with predetermined parameters. The Hamiltonian is one of the things that begs explanation.

I do not have the answers, I just start raising questions that I think are important to foundations and the open questions.

A. Neumaier said:
It counts as gathering information in the colloquial sense, but not in the sense of information theory, which quantifies missing information rather than knowledge.
Ok, good, then I think I understand. I wasn't referring just to the kind of "ignorance" that arises from the macro/micro levels. What counts for me is the agent's real "predictive power" and fitness in a given environment, which is influenced both by knowledge and by missing information.

/Fredrik
 
  • #35
Fra said:
As no real observer/agent can simply store or access all raw data in some faithful form, and from that make "objective" inferences.
The only agent that matters in an objective approach to physics is the scientific community as a whole. This community has access to all data (available at a given time) and makes decisions on what is true based on accepted statistical reasoning. This makes it objective to the extent that science can be objective.
 
  • Like
Likes vanhees71