Nature Physics on quantum foundations

In summary: Already the first paragraph tells me why the philosophical part of what they call "quantum foundations" really is pretty questionable.
  • #71
There's indeed no need for any collapse postulate in an ontological sense. Berthold-Georg Englert, Marlan O. Scully and Herbert Walther in “Quantum erasure in double-slit interferometers with which-way detectors” (American Journal of Physics, 1999):

“We recall: The state vector ##|\Psi\rangle(x)## serves the sole purpose of summarizing concisely our knowledge about the entangled atom-and-photon system; in conjunction with the known dynamics, it enables us to make correct predictions about the statistical properties of future measurements. And a state reduction must be performed whenever we wish to account for newly acquired information about the system. This minimalistic interpretation of state vectors and their reduction is common to all interpretations; it is forced upon us by the abundance of empirical facts that show that quantum mechanics works.
Of course, one might try to go beyond the minimalistic interpretation and give additional ontological meaning to
##|\Psi\rangle(x)##, thereby accommodating some philosophical preconceptions or other personal biases. In doing so, one should however remember van Kampen’s caveat: Whoever endows the state vector with more meaning than is needed for computing observable phenomena is responsible for the consequences (Theorem IV in Ref. 7).” [bold by LJ]
 
  • Like
Likes vanhees71 and bhobba
  • #72
Lord Jestocost said:
There's indeed no need for any collapse postulate in an ontological sense.
...
We recall: The state vector ##|\Psi\rangle(x)## serves the sole purpose of summarizing concisely our knowledge
...
And a state reduction must be performed whenever we wish to account for newly acquired information about the system. This minimalistic interpretation of state vectors and their reduction is common to all interpretations; it is forced upon us by the abundance of empirical facts that show that quantum mechanics works.
...
Of course, one might try to go beyond the minimalistic interpretation and give additional ontological meaning
...
more meaning than is needed for computing observable phenomena
This presumes that the CONTEXT of the computation (i.e., the computer hardware) lacks a physical basis and thus an ontology; i.e., that "information processing" is not a physical phenomenon but a human endeavour. Even if it makes sense to think of the quantum STATE as non-physical, the state itself is defined by the state of the encoding context.

So in order to compute something, the algorithm is not sufficient; one also needs a processing device to run it on.

In the Copenhagen interpretation, this "hardware" is in the effectively classical environment. In other views it may correspond to the shift in the state of the agent (which is part of the environment, anyway). So I think the ontology of the collapse refers to the ontology of the coding context (environment or agent), not the system itself. But this is also important ontology, IMO; it is one of the reasons for my own choice of interpretation. Denying the "ontology of the observer" just does not make sense.

/Fredrik
 
  • #73
Lord Jestocost said:
There's indeed no need for any collapse postulate in an ontological sense.

Of course. The quantum state, especially when you consider Gleason, just allows the calculation of probabilities. It is a mathematical aid required by modelling the results of observations as the eigenvalues of an operator (yes, non-contextuality and a few others like the strong law of superposition are required, but they are all quite intuitive). It is like medium-term climate models that face the same issue as longer-term ones. Due to chaos, all you can do is predict the probabilities of, say, a cyclone forming in the Pacific Ocean hitting a particular city, such as Brisbane, where I live. You wake up one morning, open the window and see it did not hit where you live. What collapsed there? There is a genuine issue with the state, as elucidated by the PBR theorem, but it requires its own thread.

Thanks
Bill
 
Last edited:
  • Like
Likes Peter Morgan
  • #74
vanhees71 said:
There's no need for any collapse postulate either.
Of course, everything can be done without collapse, but from a practical point of view it makes things more complicated. The collapse postulate is the most efficient way to do the job. For example, books on quantum computation for engineers all use the collapse postulate.
 
  • #75
vanhees71 said:
Science is a mixture of engineering (preparation and measurements in the lab) and math (theory/model building by a theorist behind his or her desk
No, because math is a part of engineering too.
 
  • #76
Lord Jestocost said:
There's indeed no need for any collapse postulate in an ontological sense. Berthold-Georg Englert, Marlan O. Scully and Herbert Walther in “Quantum erasure in double-slit interferometers with which-way detectors” (American Journal of Physics, 1999):

“We recall: The state vector ##|\Psi\rangle(x)## serves the sole purpose of summarizing concisely our knowledge about the entangled atom-and-photon system; in conjunction with the known dynamics, it enables us to make correct predictions about the statistical properties of future measurements. And a state reduction must be performed whenever we wish to account for newly acquired information about the system. This minimalistic interpretation of state vectors and their reduction is common to all interpretations; it is forced upon us by the abundance of empirical facts that show that quantum mechanics works.
Of course, one might try to go beyond the minimalistic interpretation and give additional ontological meaning to
##|\Psi\rangle(x)##, thereby accommodating some philosophical preconceptions or other personal biases. In doing so, one should however remember van Kampen’s caveat: Whoever endows the state vector with more meaning than is needed for computing observable phenomena is responsible for the consequences (Theorem IV in Ref. 7).” [bold by LJ]
Although I love all papers by any of these authors, one must stress that the notation ##|\Psi \rangle(x)## is a most serious sin! ##|\Psi \rangle## is a normalized vector in Hilbert space, describing the pure state ##\hat{\rho}=|\Psi \rangle \langle \Psi |##. The wave function is its representation in terms of (generalized) components with respect to the (generalized) position eigenvectors, i.e., ##\Psi(x)=\langle x|\Psi \rangle##. Otherwise the quoted statement is, of course, completely right.
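To make the distinction concrete, here is a minimal numerical sketch in a discretized position basis (the grid and the Gaussian state are of course just invented for illustration): the ket is a vector, and the wave function is nothing but its components with respect to the position basis.

[CODE=python]
import numpy as np

# Discretized position basis: the |x_i> become the standard basis vectors
# of C^N, and |Psi> is simply a normalized vector in that space.
N = 200
x = np.linspace(-10.0, 10.0, N)
dx = x[1] - x[0]

# A made-up normalized state (a Gaussian wave packet), purely for illustration.
psi = np.exp(-x**2 / 4.0).astype(complex)
psi /= np.sqrt(np.sum(np.abs(psi)**2) * dx)

# The wave function Psi(x_i) = <x_i|Psi> is literally the i-th component of
# the vector psi; it is not a function attached to the ket itself.

# Pure-state density operator rho = |Psi><Psi|, with Tr(rho) = <Psi|Psi> = 1.
rho = np.outer(psi, psi.conj()) * dx
print(np.trace(rho).real)  # ~1.0
[/CODE]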
 
  • Wow
  • Informative
Likes Peter Morgan, Demystifier and Lord Jestocost
  • #77
vanhees71 said:
the notation ##|\Psi \rangle(x)## is a most serious sin!
Indeed, terrible notation!
 
  • Haha
Likes Peter Morgan
  • #78
bhobba said:
Of course. The quantum state, especially when you consider Gleason, just allows the calculation of probabilities. It is a mathematical aid required by modelling the results of observations as the eigenvalues of an operator (yes, non-contextuality and a few others like the strong law of superposition are required, but they are all quite intuitive). It is like medium-term climate models that face the same issue as longer-term ones. Due to chaos, all you can do is predict the probabilities of, say, a cyclone forming in the Pacific Ocean hitting a particular city, such as Brisbane, where I live. You wake up one morning, open the window and see it did not hit where you live. What collapsed there? There is a genuine issue with the state, as elucidated by the PBR theorem, but it requires its own thread.

Thanks
Bill
May I emphasize, @bhobba, that it's A quantum state that "just allows the calculation of probabilities", not "The quantum state"? In practice, we use a different Hilbert space for every different kind of experiment, not just a different state as a model for a given state preparation. It looks from your writing here as though you might be open to that difference, but to me there's a pragmatic emphasis in saying "A" for which I think it's worth taking the trouble. If you think this difference is irrelevant, I'd like to know your reasons, because I consider this to be fundamental to my choice of title for my article "The collapse of a quantum state as a joint probability construction" (no emphasis in the original, though I'd like there to have been). The question of the Heisenberg cut, for example, can be thought of as being about what Hilbert space we "ought" to use to describe an experiment, and hence about what quantum states and measurement operators are to be considered candidate models for our apparatus, with no evidently correct answer except for tractability.
An aspect of quantum (field) theory that has come to seem strange to me is that because measurements at time-like (or light-like) separation in general do not commute, the straightforward calculation of joint probabilities at time-like separation is not possible. Instead, we compute joint probabilities at time-like separation by computing the probability of the earlier measurement results, collapsing the state, and then computing the probability of the later measurement results in that new state. That rigamarole gives us a joint probability, whereas if we allowed ourselves to use operators that commute at time-like separation, which I claim is a conventional choice, we could obtain a joint probability without a collapse.
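To fix ideas, a minimal numpy sketch of exactly that rigamarole, with an invented qubit state; A (a ##\sigma_z## measurement) is earlier in time and B (a ##\sigma_x## measurement) is later, and the two do not commute:

[CODE=python]
import numpy as np

# An invented qubit state and two non-commuting projective measurements.
rho = np.array([[0.7, 0.2], [0.2, 0.3]], dtype=complex)   # density matrix
P_a = {+1: np.diag([1.0, 0.0]), -1: np.diag([0.0, 1.0])}  # sigma_z projectors
P_b = {+1: 0.5 * np.array([[1.0, 1.0], [1.0, 1.0]]),      # sigma_x projectors
       -1: 0.5 * np.array([[1.0, -1.0], [-1.0, 1.0]])}

def joint_prob(a, b):
    """P(a, then b) via the Born rule plus Lueders state reduction."""
    p_a = np.trace(P_a[a] @ rho @ P_a[a]).real       # probability of outcome a
    rho_after = P_a[a] @ rho @ P_a[a] / p_a          # the collapsed state
    p_b_given_a = np.trace(P_b[b] @ rho_after).real  # probability of b in it
    return p_a * p_b_given_a                         # = Tr(P_b P_a rho P_a)

total = sum(joint_prob(a, b) for a in (+1, -1) for b in (+1, -1))
print(total)  # 1.0: the sequential recipe yields a genuine joint distribution
[/CODE]

The last comment is the point: the collapse step is doing the work of a joint probability construction, ##P(a,b) = \mathrm{Tr}(\hat{P}_b \hat{P}_a \hat{\rho} \hat{P}_a)##.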
I think the later comments you make about chaos are well-observed, but I point out that the academic literature on chaos theory increasingly uses Koopman's Hilbert space formalism. It's interesting that that literature makes only very limited use of noncommutativity (mostly only for what's called the Koopman operator, which is an integrated form of the Liouvillian operator), which I suppose may be because the people who work on chaos theory have a relatively classical mindset and may even be deliberately avoiding the questions raised by quantum measurement theory, but that doesn't have to be so. Noncommutativity can be used in CM as freely as it is used in QM if we decide to allow it. I won't elaborate on it here, but there is a good mathematical reason to use quantum mechanics: it has a mathematically very useful analyticity in its description of the dynamics of probabilities that classical mechanics does not have.
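Since Koopman's formalism may be unfamiliar, a toy sketch of its one key move (the logistic map is an arbitrary stand-in for any chaotic dynamics): the dynamics ##T## is nonlinear, but composition with ##T## acts linearly on observables, which is what buys us Hilbert space methods.

[CODE=python]
import numpy as np

# Nonlinear dynamics on [0, 1]: the logistic map, chosen only as a toy example.
def T(x):
    return 4.0 * x * (1.0 - x)

# The Koopman operator acts on observables f by composition: (Uf)(x) = f(T(x)).
def koopman(f):
    return lambda x: f(T(x))

# Two arbitrary observables and a random sample of phase-space points.
f = lambda x: np.sin(x)
g = lambda x: x**2
pts = np.random.default_rng(0).uniform(size=1000)

# Linearity: U(a*f + b*g) = a*Uf + b*Ug, even though T itself is nonlinear.
a, b = 2.0, -3.0
lhs = koopman(lambda y: a * f(y) + b * g(y))(pts)
rhs = a * koopman(f)(pts) + b * koopman(g)(pts)
print(np.allclose(lhs, rhs))  # True
[/CODE]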
I've never really understood what gets a new thread on PF. Conversations often go away from the starting point so fast it blows my mind. I can't say I help keep things on track with my own preoccupations:frown: The starting point here, "Nature Physics on quantum foundations", initially attracted me because I was annoyed by that editorial enough that I noticed when it came up on PF.
 
  • Like
Likes mattt and Fra
  • #79
Concerning the original post, "Nature Physics on quantum foundations", I was annoyed enough by that editorial that I wrote to Nature Physics that I felt they don't understand the foundations-of-QM literature at all if they could begin with "Quantum physics is weird". On September 9th, I posted the following to my Facebook page, which for better or worse is what I mostly use for my whiteboarding these days (you can be thankful I mostly don't use PF, right?),

I'm amazed by this, an editorial in Nature Physics, https://rdcu.be/cVglj (that link should give access to the article, which as a DOI link is https://doi.org/10.1038/s41567-022-01766-x ). It begins with "Quantum physics is weird", which any physicist who is involved in the recent literature ought to know is now enough in question that we are into a new era, so much so that even a popular book from Philip Ball, reflecting that literature, is titled "Beyond Weird".​
Here's the paragraph before last [of the editorial in Nature Physics],​
"Although a fresh view can invigorate any field, much of this work also manifests a disregard for the progress that has been made since quantum mechanics was established. The quantum foundations literature is the product of decades of careful thought about the issues involved in understanding and interpreting the physical world. As with any topic, a failure to constructively engage with existing work runs the risk of repeating earlier mistakes and misunderstandings."​
I entirely agree that there has been remarkable progress over the last Century. I leave as an exercise, however, for anyone who has followed my published work of the last few years, "What literature have they failed to engage with?!?" I can forgive the editors of Nature Physics because I know very well that my work is not perfectly clear and needs another few iterations in the literature (and in their last paragraph they almost save themselves, "the maxim “no one understands quantum mechanics” is a little less true than it used to be, at least in a practical sense", so they almost know this is a "clouds on the horizon" editorial), but they do not know the recent literature on QM well enough —and not just mine— to write this editorial.​

At the same time, I wrote to naturephysics@nature.com with a similar degree of intemperateness, because I didn't care whether I got a reply (and I didn't get one), which I followed up with an e-mail to one person on the editorial team in particular, Bart Verberck, because I suspected, with no real evidence, that it was his hand behind the first draft of the editorial. Also no reply. I followed up that second e-mail a few days ago, because why not, which, astonishingly to me, elicited a brief reply from the chief editor of Nature Physics yesterday. The last sentence of that reply is of general interest, I think, and to the credit of the journal,
"But I can re-emphasize that the spirit of the editorial – that quantum foundations is an important area of study – is something that we believe rather strongly and we certainly hope to represent it in our pages in the future."​
The first short paragraph of the reply was a polite, rather nicely phrased setting aside of my work. I had forced them to respond to persuade me to go away —which they didn't need to do because there are other places where I can annoy people more productively, so I wasn't going to write a fourth time— but they didn't have time to read my rather obtuse work with enough care to figure out whether it contains something worthwhile.
The social aspects of this are somewhat incomprehensible to me: if there is something transformative in my work (or, to be clear, in someone else's) that really hasn't been said in either the ancient or recent literature on QM, what does it take to persuade people to read it? Three published articles in recent years in Physica Scripta, Annals of Physics, and Journal of Physics A are not enough, but what is? I don't want to over-claim, because I know that my work is less than perfectly clear and in any case I may just be as wrong as the worst crank, but I can see how my work fits into so many threads in the physics literature that I feel confident there is some good in my work, even though I must be wrong about many details (I always have been in the past and it's usually taken me a few years to see why I've been wrong about even tiny details.) My work is research in the raw about something that has resisted much smarter people than I am for most of a Century, so for even the smallest of good to come from my work is unbelievable enough that I can't find it in myself to fault Nature Physics. Fun, yeah.
 
  • #80
Peter Morgan said:
May I emphasize, @bhobba, that it's A quantum state that "just allows the calculation of probabilities", not "The quantum state"? In practice, we use a different Hilbert space for every different kind of experiment, not just a different state as a model for a given state preparation. It looks from your writing here as though you might be open to that difference, but to me there's a pragmatic emphasis in saying "A" for which I think it's worth taking the trouble. If you think this difference is irrelevant, I'd like to know your reasons, because I consider this to be fundamental to my choice of title for my article "The collapse of a quantum state as a joint probability construction" (no emphasis in the original, though I'd like there to have been). The question of the Heisenberg cut, for example, can be thought of as being about what Hilbert space we "ought" to use to describe an experiment, and hence about what quantum states and measurement operators are to be considered candidate models for our apparatus, with no evidently correct answer except for tractability.
Good and rarely expressed point! You caught my interest, and I sympathise with some interesting fragments of your paper from post 63, so I will try to read it. I might not be aware of your previous papers, although your name seems familiar; I might have mixed it up with a math teacher I had in an ODE class, who worked on mathematical physics as well, but it must be someone else... The non-commutative way of coding things is a key, but I will read your paper before rambling. If I have any comments, I might start a separate thread.

/Fredrik
 
  • #81
Peter Morgan said:
Concerning the original post, "Nature Physics on quantum foundations", I was annoyed enough by that editorial that I wrote to Nature Physics that I felt they don't understand the foundations-of-QM literature at all if they could begin with "Quantum physics is weird".

Peter Shor:

“Quantum mechanics is really strange, and I don’t think there’s ever going to be any easy way of understanding it"

...
 
  • Haha
  • Skeptical
Likes bhobba and Peter Morgan
  • #82
It's only strange if you don't accept that Nature behaves differently to our intuition, trained by being immersed in a "classical macroscopic world". In fact it's a very successful description of all (known) matter from the elementary building blocks (as far as we know them), quarks and leptons, gauge bosons, and the Higgs boson, via hadrons, atomic nuclei, atoms, molecules, and condensed matter.

The big enigma is, of course, a satisfactory quantum description of the gravitational interaction.
 
  • Like
  • Love
Likes Fra, bhobba and Peter Morgan
  • #83
vanhees71 said:
It's only strange if you don't accept that Nature behaves differently to our intuition, trained by being immersed in a "classical macroscopic world". In fact it's a very successful description of all (known) matter from the elementary building blocks (as far as we know them), quarks and leptons, gauge bosons, and the Higgs boson, via hadrons, atomic nuclei, atoms, molecules, and condensed matter.

The big enigma is, of course, a satisfactory quantum description of the gravitational interaction.
There is something more to it though. Relativity is also different to our intuition, but far fewer physicists complain about it or work on the foundations/interpretations of it.
 
  • Like
Likes physika
  • #84
I think the reason is that classical (i.e., non-quantum) physics can be formulated relativistically too, and there is no indeterminism in it. The main quibble about QT for the older generation of physicists was that one has to give up determinism, while a change of the spacetime model seemed not that revolutionary to physicists. On the other hand, for some philosophers relativity (particularly the new notion of time, relativity of simultaneity, etc.) was as unacceptable as the indeterminism was for some of the physicists. That's the reason why Einstein explicitly did not get the Nobel prize for relativity: for the Nobel committee the quibbles of some philosophers (particularly Bergson) with the relativistic notion of time were too severe to give a Nobel prize for such a "weird theory of time". Instead they cited Einstein's work on the photoelectric effect, i.e., "old quantum theory", as the prize-worthy work, which is ironic, since this is the only one of the famous three topics of Einstein's miracle year 1905 that is not up to date anymore (not to mention that Einstein's greatest work for sure is his general theory of relativity).
 
  • Like
Likes bhobba, Peter Morgan and Lord Jestocost
  • #85
vanhees71 said:
I think the reason is that classical (i.e., non-quantum) physics can be formulated relativistically too, and there is no indeterminism in it.

It is worth quoting Misner et al. on the ultimate breakdown of spacetime (in “Gravitation” by Charles W. Misner, Kip S. Thorne, John Archibald Wheeler, 1973, pp. 1182–1183):

The uncertainty principle thus deprives one of any way whatsoever to predict, or even to give meaning to, 'the deterministic classical history of space evolving in time.' No prediction of spacetime, therefore no meaning for spacetime, is the verdict of the quantum principle. That object which is central to all of classical general relativity, the four-dimensional spacetime geometry, simply does not exist, except in a classical approximation.
 
  • Like
Likes vanhees71
  • #86
vanhees71 said:
It's only strange if you don't accept that Nature behaves differently to our intuition, trained by being immersed in a "classical macroscopic world". In fact it's a very successful description of all (known) matter from the elementary building blocks (as far as we know them), quarks and leptons, gauge bosons, and the Higgs boson, via hadrons, atomic nuclei, atoms, molecules, and condensed matter.

The big enigma is, of course, a satisfactory quantum description of the gravitational interaction.
If we have CM+ in hand as well as QM and understand the relationship between QM and CM+ better than we understand the relationship between QM and CM, I hope that reduces the distance between QM/CM+ and GR enough to make a difference.
I think we also have to understand interacting QFTs better for us to be able to make progress with GR, which I address in a so far unpublished paper on arXiv, https://arxiv.org/abs/2109.04412, which currently has the title "A source fragmentation approach to interacting quantum field theory". I'm considering changing the title to "Interacting quantum fields as an inverse problem", however, because instead of modeling interactions as happening everywhere between point-like measurements, we encode the effects of the space between measurements by preprocessing the test functions that describe how the different measurements are "focused", because measurements are never at a point, so that we can then use those preprocessed "fragments" with non-interacting fields between the measurements to compute the n-measurement "unfocused" Wightman functions. All of which is possible because of the Reeh-Schlieder theorem and an analysis of renormalization that I have not seen elsewhere.
vanhees71 said:
I think the reason is that classical (i.e., non-quantum) physics can be formulated relativistically too, and there is no indeterminism in it. The main quibble of QT for the older generation of physicists was that one has to give up determinism, while a change of the space-time model seemed not that revolutionary for physicists.
My view has become that CM is only deterministic if almost all degrees of freedom are included in a model. That seems to me never to be the case, all the more so if the dynamics between the included and excluded degrees of freedom is chaotic. Consequently, I think we have almost no choice but to work with a statistical state and with measurement devices that are either included or excluded in the model.
As soon as we work with statistics and probabilities, Boole already knew in 1854 that sometimes a collection of relative frequencies does not admit a joint relative frequency that has them as marginals. With the benefit of hindsight, we can call such a collection of relative frequencies "incompatible", which can be understood to be the mathematics that underlies noncommutativity in QM, but 70 years earlier than QM. To cut a long story short, there are practical benefits to working with incompatible relative frequencies using probability measures, characteristic functions, and Hilbert space methods.
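A minimal sketch of that obstruction, in the three-observable form in which it is usually stated (the correlation values below are invented for illustration; the inequalities are Boole's "conditions of possible experience" for three ##\pm 1## observables):

[CODE=python]
# Boole (1854): pairwise correlations E_ab, E_bc, E_ac of three +/-1
# observables admit a joint distribution with those pairwise marginals
# iff 1 + E_bc >= |E_ab + E_ac| together with its two permutations.
def joint_exists(E_ab, E_bc, E_ac):
    return (1 + E_bc >= abs(E_ab + E_ac) and
            1 + E_ac >= abs(E_ab + E_bc) and
            1 + E_ab >= abs(E_bc + E_ac))

print(joint_exists(0.5, 0.5, 0.5))     # True: a joint distribution exists
print(joint_exists(-0.9, -0.9, -0.9))  # False: "incompatible" pairwise data
[/CODE]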
Philosophically, I feel that we can never know that we have included all degrees of freedom, even if we have, because we can never be sure that there isn't something happening at smaller scales than we have so far investigated. This is what seems to me an unanswerable question: "Are there turtles all the way down?" If there are different kinds of turtles at every scale and all the way down, then our dynamical models will never include all of the details and we run into some severe mathematical problems because we have to introduce the axiom of choice to fix the initial conditions and there's no guarantee that anything will be continuous or differentiable, pretty much wiping out predictability for classically understandable reasons. That doesn't have to stop us, but if that is the way the world is then I think we have to work with models that are constructed top-down. That seems to me to put us in the realm of thermodynamics more than of statistical mechanics.
 
  • Like
Likes vanhees71 and gentzen
  • #87
vanhees71 said:
It's only strange if you don't accept that Nature behaves differently to our intuition, trained by being immersed in a "classical macroscopic world". In fact it's a very successful description of all (known) matter from the elementary building blocks (as far as we know them), quarks and leptons, gauge bosons, and the Higgs boson, via hadrons, atomic nuclei, atoms, molecules, and condensed matter.

The big enigma is, of course, a satisfactory quantum description of the gravitational interaction.

Better to say that we still don't understand nature, and QM is just an approximation, and maybe it can be superseded by another broader and deeper theory or model.
 
Last edited:
  • Like
Likes vanhees71
  • #88
physika said:
Better to say that we still don't understand nature, and QM is just an approximation, and maybe it can be superseded by another broader and deeper theory or model.

In that sense, we do not understand anything. They are just models valid in a certain context. Classical Mechanics is a limiting case of QM. QM likely will be supplanted by something more general eventually - but nobody knows. Whatever replaces it will face the same issue. It's the nature of science, as Feynman often pointed out. If we find a theory of everything, that would be great, but if it is like peeling the layers of an onion, that is just as interesting and equally as great. I believe there is some profound symmetry principle lying at the foundation of the world - but that is just a belief - nature is as nature is. The enterprise of science is finding out more about it.

Thanks
Bill
 
  • Like
Likes vanhees71 and Lord Jestocost
  • #89
Peter Morgan said:
If we have CM+ in hand as well as QM and understand the relationship between QM and CM+ better than we understand the relationship between QM and CM, I hope that reduces the distance between QM/CM+ and GR enough to make a difference.
I think we also have to understand interacting QFTs better for us to be able to make progress with GR, which I address in a so far unpublished paper on arXiv, https://arxiv.org/abs/2109.04412, which currently has the title "A source fragmentation approach to interacting quantum field theory". I'm considering changing the title to "Interacting quantum fields as an inverse problem", however, because instead of modeling interactions as happening everywhere between point-like measurements, we encode the effects of the space between measurements by preprocessing the test functions that describe how the different measurements are "focused", because measurements are never at a point, so that we can then use those preprocessed "fragments" with non-interacting fields between the measurements to compute the n-measurement "unfocused" Wightman functions. All of which is possible because of the Reeh-Schlieder theorem and an analysis of renormalization that I have not seen elsewhere.

My view has become that CM is only deterministic if almost all degrees of freedom are included in a model. That seems to me never to be the case, all the more so if the dynamics between the included and excluded degrees of freedom is chaotic. Consequently, I think we have almost no choice but to work with a statistical state and with measurement devices that are either included or excluded in the model.
As soon as we work with statistics and probabilities, Boole already knew in 1854 that sometimes a collection of relative frequencies does not admit a joint relative frequency that has them as marginals. With the benefit of hindsight, we can call such a collection of relative frequencies "incompatible", which can be understood to be the mathematics that underlies noncommutativity in QM, but 70 years earlier than QM. To cut a long story short, there are practical benefits to working with incompatible relative frequencies using probability measures, characteristic functions, and Hilbert space methods.
Philosophically, I feel that we can never know that we have included all degrees of freedom, even if we have, because we can never be sure that there isn't something happening at smaller scales than we have so far investigated. This is what seems to me an unanswerable question: "Are there turtles all the way down?" If there are different kinds of turtles at every scale and all the way down, then our dynamical models will never include all of the details and we run into some severe mathematical problems because we have to introduce the axiom of choice to fix the initial conditions and there's no guarantee that anything will be continuous or differentiable, pretty much wiping out predictability for classically understandable reasons. That doesn't have to stop us, but if that is the way the world is then I think we have to work with models that are constructed top-down. That seems to me to put us in the realm of thermodynamics more than of statistical mechanics.
I'm not so sure that an alternative or extended version of classical point-particle mechanics helps to get an idea of how to formulate a "quantum spacetime" or "quantum gravity". For that one needs a field theory. Point particles are ill-defined already in classical relativistic physics, as manifests itself in the notorious radiation-reaction problem for charged point particles in electromagnetic theory and the non-existence of interacting many-body point-particle systems.
 
  • Like
Likes bhobba and Peter Morgan
  • #90
vanhees71 said:
I'm not so sure that an alternative or extended version of classical point-particle mechanics helps to get an idea of how to formulate a "quantum spacetime" or "quantum gravity". For that one needs a field theory. Point particles are ill-defined already in classical relativistic physics, as manifests itself in the notorious radiation-reaction problem for charged point particles in electromagnetic theory and the non-existence of interacting many-body point-particle systems.
I'm not sure, either. As I say in the comment you quote from, it's more a hope. As far as I know, the steps I've taken to rethink the measurement problem are not commonplace (though there are a few precedents in Belavkin and in Tsang & Caves, and I think there are strands in the literature that have been working towards the steps I have taken, particularly in work on Koopman's Hilbert space formalism for classical mechanics), and the steps I've taken to rethink the renormalization "problem" (yeah, I know many physicists think the RG means there isn't a problem) are completely new.

As always, any step taken can turn out to be less worthwhile than its enthusiasts think it is, but any step taken can also allow other steps, either next week or in 50 years, after other developments. Koopman's work on Hilbert space formalism for CM dates from 1931 and had almost zero effect for decades, except that von Neumann and Birkhoff proved two different versions of the ergodic theorem using Koopman's idea, but, for me at least, reading his work through modern literature on quantum measurement theory, noncommutative probability, quantum field theory, et cetera, gives a perspective that is transformative. A few people find my telling of this compelling, but most physicists are not committing themselves. If this in fact transforms our ideas about physics over the next few years, then hesitating for just the right amount of time is rational, but hesitating too long will be a bad idea. If it doesn't, then hesitating was the right thing to do. Time will tell.

As to the field theory, I'm all about that. I don't do it well, but that's where I come from and where I'm going. The CM+ and measurement problem stuff came out of my long-term thinking about the relationship between QFT and random fields. You can see my "Bell inequalities for random fields" in JPhysA 2006, https://arxiv.org/abs/cond-mat/0403692 (DOI there), for example, which is little cited but I have still not seen a discussion of Bell's ideas about beables that better folds in QFT and classical noisy fields (for the latter, correlations at space-like separation are the norm at or near equilibrium, which invalidates Bell's reasoning in his "The theory of local beables" without having to introduce superdeterminism, constrained free will, and all that; I can only understand that this isn't a commonplace by now because everybody is so fixated on how photons are flying around their apparatuses, which seems like proof that field theory is not as much present as I think it should be in most people's thinking.)

I think your focus on the radiation-reaction problem is well-taken. When faced with an ill-defined mathematical model, a standard move is to use a dual construction instead. That's what I think the Wightman axioms do by working with "test functions" instead of with operator-valued distributions. I move further in that direction by working with an "inverse problem" approach. As I say above, I think the "fragmentation" language is OK but not good enough, but I often find that that kind of tiny shift of perspective can take me a year or two to find and get used to.
 
  • Like
Likes bhobba and vanhees71
  • #91
gentzen said:
I will elaborate it for Bosons, so that I can ignore the wavefunction for the equivalence relation.
Even for Fermions, I can just ignore the wavefunction for the equivalence relation. I realized this when the skeptic in me started to ponder over a "serious omission" in the given quotient construction:

If we interpret the trajectories as a function ##(x_1, \ldots, x_n)(t) : \mathbb R \to \mathbb R^{3n}## and consider "piecewise constant" permutations ##\pi(t):\mathbb R \to S_n##, then ##(x_{\pi(t)(1)}, \ldots, x_{\pi(t)(n)})(t)## is only "piecewise continuous". So it is not a "strong" solution of the guiding equation. Weakening the continuity requirements is possible (and needed, because otherwise uniqueness of solution together with continuity allows identification of particles between different times), but it feels very much like a "patchwork construction".
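A toy numerical version of that observation, with invented trajectories (the specific functions don't matter, only that they are continuous and never meet):

[CODE=python]
import numpy as np

# Two invented continuous 1D trajectories that never meet.
t = np.linspace(0.0, 1.0, 1001)
x1, x2 = np.sin(t), 2.0 + np.cos(t)

# A "piecewise constant" permutation: identity for t < 0.5, swap afterwards.
swap = t >= 0.5
y1 = np.where(swap, x2, x1)
y2 = np.where(swap, x1, x2)

# The relabelled trajectories are only piecewise continuous: they jump at
# t = 0.5, so they cannot be a "strong" solution of the guiding equation.
i = np.searchsorted(t, 0.5)
print(abs(y1[i] - y1[i - 1]))  # ~2.4, a finite jump where the labels swap
[/CODE]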

Turns out my attempt to illustrate the character of "unnatural" constructions as "patchwork constructions" in the initial reply to A. Neumaier failed to identify the crucial points. Gluing together the endpoints of a closed interval to get a circle is an "unnatural" construction, but taking an open interval and identifying two small open intervals at both ends (pointwise) with each other to get a circle is not an "unnatural" construction, despite allowing patchwork. (And the "quotient by a discrete subgroup" fails even worse to identify the crucial points: "discrete" is neither necessary nor sufficient, and "subgroup" instead of "group" was superfluous.)

I guess the point of "natural quotient" constructions is rather that at least locally, the real work should already be finished before taking the quotient, so that it doesn't make a difference for "local topological" constructions whether they are applied before or after taking the quotient.

vanhees71 said:
Of course, it's much more efficient to simply not use any kind of trajectories as in the dBB interpretation. They do not provide anything physical to QT anyway. You may solve some philosophical quibble but introduce more complication without gaining any new insights from a scientific point of view.
Of course, it is completely unclear what "much more efficient" is supposed to mean in this context. The meaning of "anything physical" is clearer, but for me it is enough that dBB and quotient constructions provide "something mathematical".

The relationship between "mathematical constructions" and "philosophical quibbles" has always been a complicated one. Legend says that the Pythagoreans killed the one who discovered irrational numbers. And Zeno of Elea attacked the continuum on philosophical grounds. In both cases, the attacked concepts turned out to have hidden complexities and dangers, but the attacks themselves failed to clearly isolate those or show a way forward. Maybe philosophers are better at articulating hidden problems, than at helping to overcome them.

At least for me, reading philosophical texts is sometimes both fun and useful. SEP comes to mind, and also:
gentzen said:
..., then I read An Interpretive Introduction to Quantum Field Theory cover to cover. It was easy to read, with a good mix of technical details, explanations, interpretations, and philosophical clarification.
… much of the interpretive work Teller undertakes is to understand the relationship and possible differences between quantum field-theory — i.e., QFT as quantization of classical fields — and quantum-field theory — i.e., a field theory of ‘quanta’ which lack radical individuation, or as Teller says, “primitive thisness.”
Teller made quite some effort to help his reader grasp how radically different truly "indistinguishable particles" are compared to our everyday experience. Looking at them in dBB on the other hand shows you how they require (anti-)symmetric wavefunctions and some form of discontinuity. As always, the discontinuity required in dBB is "too nonlocal" compared to what you actually need. And of course, lessons from dBB apply primarily to non-relativistic QM.
 
  • #92
martinbn said:
There is something more to it though. Relativity is also different to our intuition, but far fewer physicists complain about it or work on the foundations/interpretations of it.
Here is why we don't have as much work on foundations/interpretations of relativity theory as we do on quantum mechanics, per Zeilinger:
Physics in the 20th century is signified by the invention of the theories of special and general relativity and of quantum theory. Of these, both the special and the general theory of relativity are based on firm foundational principles, while quantum mechanics lacks such a principle to this day. By such a principle, I do not mean an axiomatic formalization of the mathematical foundations of quantum mechanics, but a foundational conceptual principle. In the case of the special theory, it is the Principle of Relativity, ... . In the case of the theory of general relativity, we have the Principle of Equivalence ... . Both foundational principles are very simple and intuitively clear. ...
I submit that it is because of the very existence of these fundamental principles and their general acceptance in the physics community that, at present, we do not have a significant debate on the interpretation of the theories of relativity. Indeed, the implications of relativity theory for our basic notions of space and time are broadly accepted.
 
  • Like
  • Love
Likes Lord Crc, bhobba, DrChinese and 3 others
  • #93
RUTA said:
Here is why we don't have as much work on foundations/interpretations of relativity theory as we do on quantum mechanics, per Zeilinger:
I agree about the situation. But do we agree on the conclusion from it?

I think, for example, the "reason" WHY there is an observer-invariant upper limit on communication speed is an important one to answer. Taking it as an axiom or empirical observation is, I think, not satisfactory. When deriving SR from axioms, the details of the EM field or of light play no distinguished role; just the existence of an invariant common maximum speed (in 3D space) is enough. What is it about the construction or emergence of space that explains this? If we can answer that (and I think we should try), then I think we will get many clues towards unifying GR and QM.

But I still agree there is value in (as you try to do) pointing to potential principle explanations of QM as well. But it does not give me peace of mind.

/Fredrik
 
  • #94
After all, space is just an "index" of events that gives them relational structure. How is this index built and defined from the observer that distinguishes the events? If we start by thinking of "classical pointers", don't we already, with the word "classical", imply not only "macroscopic" but also embedding in classical 3D space? How can one imagine a "classical observer" prior to spacetime?

/Fredrik
 
  • #95
Fra said:
I agree about the situation. But do we agree on the conclusion from it?

I think, for example, the "reason" WHY there is an observer-invariant upper limit on communication speed is an important one to answer. Taking it as an axiom or empirical observation is, I think, not satisfactory. When deriving SR from axioms, the details of the EM field or of light play no distinguished role; just the existence of an invariant common maximum speed (in 3D space) is enough. What is it about the construction or emergence of space that explains this? If we can answer that (and I think we should try), then I think we will get many clues towards unifying GR and QM.

But I still agree there is value in (as you try to do) pointing to potential principle explanations of QM as well. But it does not give me peace of mind.

/Fredrik
In a principle explanation, the empirically discovered fact is not fundamental; it must be justified by a compelling fundamental principle. For example, in special relativity we explain length contraction as follows:

Relativity principle ##\rightarrow## justifies the light postulate ##\rightarrow## dictates length contraction.

So, the "invariant upper limit on communication speed" (the light postulate) is not the fundamental explanans in special relativity; the relativity principle is. As Norton notes (https://sites.pitt.edu/~jdnorton/papers/companion.pdf):
Until this electrodynamics emerged, special relativity could not arise; once it had emerged, special relativity could not be stopped.
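Spelling that chain out in the standard textbook way: the relativity principle promotes Maxwell's ##c## to a frame-invariant speed, which forces the Lorentz transformation

$$x' = \gamma\,(x - vt), \qquad t' = \gamma\left(t - \frac{vx}{c^2}\right), \qquad \gamma = \frac{1}{\sqrt{1 - v^2/c^2}},$$

and a rod of rest length ##L_0##, at rest in the primed frame and measured at a single instant ##t## in the unprimed frame, then has length ##L = L_0/\gamma < L_0##.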
Wouldn't it be nice if we had a principle explanation of entanglement that was just as compelling?
 
Last edited:
  • Like
Likes bhobba and vanhees71
  • #96
From the relativity principle (i.e., the equivalence of all inertial frames and the assumption of their existence) alone + further symmetry assumptions (homogeneity of time, Euclidean geometry of space for all inertial observers) you can derive that there are two possible spacetime models: Galilei-Newton spacetime (with the Galilei group as its symmetry group) and Einstein-Minkowski spacetime (with the Poincaré group as its symmetry group). Deciding which one describes Nature is empirical, i.e., you cannot separate fundamentals from empirics. All fundamental laws must be grounded in solid empirical evidence.
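For the record, a compact statement of that classic result (it goes back to Ignatowsky and to Frank and Rothe around 1910/11): under these assumptions the boosts must take the one-parameter form

$$x' = \frac{x - vt}{\sqrt{1 - K v^2}}, \qquad t' = \frac{t - K v x}{\sqrt{1 - K v^2}},$$

with a universal constant ##K##: ##K = 0## yields the Galilei group, ##K = 1/c^2 > 0## the Poincaré group (negative ##K## is excluded by causality and group-theoretic arguments), and only experiment can decide the value of ##K##.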

I don't see where there should be a general difference between the foundations of quantum theory and the description of space and time. The only difference is the lack of determinism in quantum theory and the possibility of classical, deterministic physics within the spacetime models, which have successfully served as the foundation of both classical and quantum physics.

The reluctance to accept "irreducible randomness" in the (observed!) behavior of Nature seems to me to be just a psychological phenomenon without any objective empirical foundation. At least, there is only empirical evidence for this irreducible randomness, and none against it.
 
  • #97
vanhees71 said:
From the relativity principle (i.e., the equivalence of all inertial frames and the assumption of their existence) alone + further symmetry assumptions (homogeneity of time, Euclidean geometry of space for all inertial observers) you can derive that there are two possible spacetime models: Galilei-Newton spacetime (with the Galilei group as its symmetry group) and Einstein-Minkowski spacetime (with the Poincaré group as its symmetry group). Deciding which one describes Nature is empirical, i.e., you cannot separate fundamentals from empirics. All fundamental laws must be grounded in solid empirical evidence.

I don't see where there should be a general difference between the foundations of quantum theory and the description of space and time. The only difference is the lack of determinism in quantum theory and the possibility of classical, deterministic physics within the spacetime models, which have successfully served as the foundation of both classical and quantum physics.

The reluctance to accept "irreducible randomness" in the (observed!) behavior of Nature seems to me to be just a psychological phenomenon without any objective empirical foundation. At least, there is only empirical evidence for this irreducible randomness, and none against it.
So what is the fundamental principle of QM that is similar to the invariance of the speed of light?
 
  • #98
It's the notion of quantum states represented by statistical operators on a Hilbert space with their probabilistic interpretation.
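In formulas: a state is represented by a statistical operator ##\hat{\rho}## with ##\hat{\rho} \geq 0## and ##\mathrm{Tr} \, \hat{\rho} = 1##, and if ##\hat{P}_a## denotes the projector onto the eigenspace of an observable's self-adjoint operator with eigenvalue ##a##, then the probability to find the value ##a## is given by Born's rule,

$$P(a) = \mathrm{Tr}(\hat{\rho} \, \hat{P}_a).$$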
 
  • Skeptical
Likes gentzen
  • #99
martinbn said:
So what is the fundamental principle of QM that is similar to the invariance of the speed of light?
There are different ways to render a principle account of QM. Bohr did it using the quantum postulate (discontinuous jumps between stationary states) and the correspondence principle (quantum transitions correspond to harmonics of classical motion). [Heisenberg used these to generate his matrix formulation of QM.] More recently (starting in the 1990s) we have information-theoretic reconstructions of QM, e.g., Rovelli based his on non-commutativity, Bub based his on non-Boolean algebraic structure, and Hardy based his on continuity of reversible transformations between pure states for the qubit. That last one leads to Brukner and Zeilinger's fundamental principle of Information Invariance & Continuity, which is the equivalent of the light postulate for SR. In information-theoretic form it's not as transparent as the light postulate, but when instantiated physically it means the measurement of Planck's constant is invariant between inertial reference frames related by spatial rotations or translations. So, the invariant measurement of the speed of light between inertial reference frames related by boosts leads to SR, and the invariant measurement of Planck's constant between inertial reference frames related by spatial rotations and translations leads to QM. The relativity principle is the compelling fundamental principle justifying these empirically-discovered facts in both cases.
 
  • Like
Likes vanhees71
  • #100
Of course, QT must be compatible with the spacetime model you use, and that's why you come to unitary ray representations of the Galilei or Poincaré group for the Newtonian and special-relativistic spacetime models. And ##\hbar## is a scalar parameter, because it's just an arbitrary conversion factor used to define the SI units.
 
  • #101
vanhees71 said:
Of course, QT must be compatible with the spacetime model you use, and that's why you come to unitary ray representations of the Galilei or Poincaré group for the Newtonian and special-relativistic spacetime models. And ##\hbar## is a scalar parameter, because it's just an arbitrary conversion factor used to define the SI units.
Planck's constant h appears in Planck's radiation law, the Planck-Einstein relationship, and Einstein's photoelectric equation, for example. In that sense, it is a fundamental constant of Nature. As Weinberg points out, measuring an electron's spin "is a measurement of a universal constant of Nature, h." Thus, Information Invariance & Continuity applied to the electron spin measurement (a qubit) means everyone must measure the same value of h, regardless of their relative Stern-Gerlach magnet orientations. Reads just like the light postulate and is justified the same way, too.
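Concretely, for the qubit: prepare a spin-1/2 particle "up" along ##\hat{z}## and measure the spin along a direction tilted by an angle ##\theta##. The outcomes are ##\pm\hbar/2## for every ##\theta##, with

$$P(\pm) = \cos^2(\theta/2) \ \text{and} \ \sin^2(\theta/2), \qquad \langle S_\theta \rangle = \frac{\hbar}{2}\cos\theta,$$

i.e., rotating the Stern-Gerlach magnet changes the probabilities but never the measured values ##\pm\hbar/2## themselves.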
 
  • Like
Likes vanhees71
  • #102
RUTA said:
In principle explanation, the empirically-discovered fact is not fundamental, it must be justified by a compelling fundamental principle. For example, in special relativity we explain length contraction as follows:

Relativity principle ##\rightarrow## Justifies light postulate ##\rightarrow## Dictating length contraction.

So, the "invariant upper limit on communication speed" (light postulate) is not the fundamental explanans in special relativity, the relativity principle is. As Norton notes (https://sites.pitt.edu/~jdnorton/papers/companion.pdf):

Wouldn't it be nice if we had a principle explanation of entanglement that was just as compelling?
I see I was vague, as often. I agree the relativity principle is somehow more paramount, but my main point was that the conservative notion of the "relativity principle" relates specifically to spacetime frames. But how are these frames and spaces justified in the first place, without accepting the very things we aim to explain?
I think the heart of the relativity principle (which I always take to be observer equivalence, or preferably democracy, noting that an observer is more than just a preferred coordinate frame) is much more profound than the notion of "space".

With that in mind, I think the conventional meaning of the "relativity principle", as PRESUMING the notion of 4D spacetime, is not sufficiently compelling when trying to incorporate gravity as well. This very distinction is, IMO, why QG seems so slippery.

/Fredrik
 
  • #103
RUTA said:
So, the "invariant upper limit on communication speed" (light postulate) is not the fundamental explanans in special relativity, the relativity principle is.
All it is, is the fixing of a constant that naturally occurs in the theory. It could be infinity, and intuitively you would think it should be. But physics is an experimental science, and its value is the speed of light for all sorts of reasons, both theoretical and experimental. Theoretically, if you leave it as just an undetermined constant C, then you can use it to derive Maxwell's equations, which give solid experimental support for C being the speed of light:
http://cse.secs.oakland.edu/haskell/Special Relativity and Maxwells Equations.pdf

Thanks
Bill
 
  • Like
Likes Lord Jestocost and vanhees71
  • #104
bhobba said:
All it is, is the fixing of a constant that naturally occurs in the theory. It could be infinity, and intuitively you would think it should be. But physics is an experimental science, and its value is the speed of light for all sorts of reasons, both theoretical and experimental. Theoretically, if you leave it as just an undetermined constant C, then you can use it to derive Maxwell's equations, which give solid experimental support for C being the speed of light:
http://cse.secs.oakland.edu/haskell/Special Relativity and Maxwells Equations.pdf

Thanks
Bill
Indeed, c was effectively infinite until Maxwell's equations. Once you had a finite value from Maxwell's equations, the relativity principle demands that everyone measure the same value for it, regardless of their inertial reference frame. That gives you SR.
 
  • Like
Likes bhobba
  • #105
Fra said:
I see I was vague, as often. I agree the relativity principle is somehow more paramount, but my main point was that the conservative notion of the "relativity principle" relates specifically to spacetime frames. But how are these frames and spaces justified in the first place, without accepting the very things we aim to explain?
I think the heart of the relativity principle (which I always take to be observer equivalence, or preferably democracy, noting that an observer is more than just a preferred coordinate frame) is much more profound than the notion of "space".

With that in mind, I think the conventional meaning of the "relativity principle", as PRESUMING the notion of 4D spacetime, is not sufficiently compelling when trying to incorporate gravity as well. This very distinction is, IMO, why QG seems so slippery.

/Fredrik
You might be interested in this article, then: https://www.nature.com/articles/s41467-018-08155-0.pdf
 
  • Like
Likes Fra and vanhees71
