In summary: that confirms my (still superficial) understanding that I am now allowed to interpret ##\hat{\rho}## and the trace operation as expectation values in the usual statistical sense, and that makes the new approach much more understandable than what you previously called the "thermal interpretation". I also think that the entire conception is not much different from the minimal statistical interpretation. The only change to the "traditional" concept seems to be that you use the more general concept of POVMs rather than von Neumann filter measurements, which are only a special case. The only objection I have concerns the statement about EPR. It cannot be right, because local realistic theories are not consistent with quantum-theoretical probability theory, which
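The POVM generalization mentioned here can be illustrated with a minimal numerical sketch (an illustrative textbook example, not taken from this discussion): a three-outcome "trine" POVM on a qubit, whose effects sum to the identity and whose outcome probabilities follow the Born-type rule ##p_k = \mathrm{Tr}(\hat{\rho} E_k)##:

```python
import numpy as np

# Trine POVM on a qubit: three effects E_k = (2/3)|psi_k><psi_k|,
# with the |psi_k> at 120-degree angles in the real plane.
angles = [0.0, 2 * np.pi / 3, 4 * np.pi / 3]
effects = []
for a in angles:
    psi = np.array([np.cos(a / 2), np.sin(a / 2)])
    effects.append((2 / 3) * np.outer(psi, psi))

# Completeness: sum_k E_k = identity (so the probabilities sum to 1)
assert np.allclose(sum(effects), np.eye(2))

# A density operator rho (here the pure state |0><0|);
# outcome probabilities via p_k = tr(rho E_k)
rho = np.array([[1.0, 0.0], [0.0, 0.0]])
probs = [np.trace(rho @ E).real for E in effects]
print(probs)  # [2/3, 1/6, 1/6]
```

Unlike a von Neumann measurement, the three effects are not orthogonal projectors (a qubit has only two orthogonal pure states), yet they still define valid outcome probabilities for any ##\hat{\rho}##.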
  • #211
Relativistic QFT has the same probabilistic interpretation as any QT. In fact there is only one overall conceptual framework, with a non-relativistic realization (in both a "1st-quantization" formulation and a field-theoretical one) as well as a special-relativistic realization in terms of local QFTs.

Of course one can prepare ensembles within all kinds of QTs and, more importantly, in the lab. Relativistic QFT is among the best tested physical theories ever discovered. This would be impossible to achieve if it were not possible to prepare ensembles of the physical systems described by it, namely particles and nuclei in particle accelerators, with which one can do scattering experiments with high precision. Another application is atomic physics: a specific quantum-field-theoretical result is the explanation of the Lamb shift to several significant digits of accuracy, etc.

I don't understand how one can claim that one cannot build ensembles within relativistic QFT, given all these successes. After all, the first goal of all introductions to QFT is to establish the calculation of S-matrix elements, which precisely describe scattering processes, and obviously these can be realized with high accuracy in the lab.
 
  • #212
I think the reason we get away with the principal problem here is that the theory is corroborated for small subsystems, where "small" means both spatially small and short-lived, at least compared to the scales of the inferences going on during the experiment. So the symptoms of the principal problems are IMO not to be found in atomic physics; they are to be found either when one also includes gravity and times that are not infinitesimal compared to the cosmological scale, or perhaps when trying to understand the unification of forces that are currently described over extremely wide energy ranges.

It seems quite obvious that, in order for an experiment to produce a lot of "statistics" from effectively asymptotic scattering experiments, it simply would not be possible to acquire and process the data and make preparations if the significant scale of the interactions were the whole universe in space and time. But it does work if it is all happening within a laboratory building at a quick pace.

I agree that this objection is almost silly for the practical purposes of atomic or even macroscopic solid-state physics (which is still microscopic on the cosmological scale), but if one considers it a principal argument, I don't see how it cannot be valid.

/Fredrik
 
  • #213
Well, all our observations are pretty local. In cosmology we extrapolate the local observations to "the whole universe" by assuming the cosmological principle.
 
  • #214
Yes, we effectively assume that our admittedly incomplete sampling and truncated postprocessing is still a fair representation. We know why we do this, and the assumption is not irrational. The only issue is if we get so used to this that we confuse this rational guess with a fact. My first line of thought is not to question the spatial cosmological principle, but what I think we call the "perfect cosmological principle", which also says that the laws of physics do not change. There is, as far as I know, no cosmological evidence that the laws did change, but OTOH the astronomical evidence comes mainly from the era when there was light. And from the perspective of the unification of forces, the universe was, relatively speaking, OLD at that point.

Even in Smolin's speculation about the evolution of law, he admits that no evidence suggests that the laws we can infer have changed. His idea is that laws changed during the big bang, but I would say that whether it was AT the big bang, or when all forces were presumably blurred together and before the notion of spacetime was clear, we cannot tell.

So I personally do not think the perfect cosmological principle is something to hold onto in the context of this discussion. And of course the whole notion of the spatial cosmological principle also becomes questionable if we think that there was an early time before 4D spacetime was stable.

/Fredrik
 
  • #215
Because Bernhard wrote at https://www.astronews.com/community...etation-der-quantenmechanik.11402/post-137032 that he has "only" read Born's rule and measurement from A. Neumaier:
Bernhard said:
Ich muss mir dann erstmal das zitierte Paper genauer ansehen und hatte vorab schon ein anderes Paper von AM angesehen: Born's rule and measurement, ebenfalls auf arxiv.org. Dort finden sich auch lesenswerte Ideen und Anregungen.
(I'll first have to take a closer look at the cited paper; beforehand I had already looked at another paper by AM: Born's rule and measurement, also on arxiv.org. There, too, one finds ideas and suggestions worth reading.)
I tried to remember my own impression of that paper. Somehow it felt boring to me, and I was skeptical whether its "new approach to introductory courses on quantum mechanics" would be an improvement. But maybe this was just because I read that paper more out of a "felt duty" than out of curiosity.

The current paper feels totally different to me, even though I realize that it is somehow supposed to be a successor to, or elaboration of, that paper. It piques my curiosity at a thousand small and bigger points, and is a complete joy to read, with all its interesting quotes and motivations. But that old paper also has quotes, so that can't be the difference. Maybe it was really just my own state of mind when I read it. But I think it is more: I get many of the points that the current paper wants to achieve, but I had a hard time identifying clear goals in that previous paper.
 
  • #216
gentzen said:
Maybe you can post there a link to the new paper!

gentzen said:
I tried to remember my own impression of that paper. Somehow it felt boring to me, and I was skeptical whether its "new approach to introductory courses on quantum mechanics" would be an improvement. But maybe this was just because I read that paper more out of a "felt duty" than out of curiosity.

The current paper feels totally different to me, even though I realize that it is somehow supposed to be a successor to, or elaboration of, that paper. It piques my curiosity at a thousand small and bigger points, and is a complete joy to read, with all its interesting quotes and motivations. But that old paper also has quotes, so that can't be the difference. Maybe it was really just my own state of mind when I read it. But I think it is more: I get many of the points that the current paper wants to achieve, but I had a hard time identifying clear goals in that previous paper.
The difference between my 2019 paper "Born's rule and measurement" and my new quantum tomography paper is that the former (30 pages of main text) was a programme (stated on 3 pages in Subsection 2.6) to be executed:
''The material presented suggests a new approach to introductory courses on quantum mechanics''
while the latter (90 pages of main text) is the general part of its execution. I go through many of the main quantum physics topics that must be discussed and show what they mean in terms of the new detector response principle.

Of course, to make it into a text on introductory quantum mechanics, one would need many more examples to which the concepts are applied, and an introduction to how to compute all the material treated abstractly in the paper.
 
  • #217
gentzen said:
Bernhard said:
Mich interessiert jetzt natürlich stark, welche Details nun im Rahmen dieser Interpretation den Startpunkt einer Nebelspur in einer Nebelkammer festlegen.
(Of course, I am now very interested in which details determine the starting point of a trail in a cloud chamber within the framework of this interpretation.)
and in a related discussion:
TomS said:
Wie gelangt die TI von einem sphärisch symmetrischen Zustand zu einem lokalisierten Zustand?
(How does the TI get from a spherically symmetric state to a localized state?)
As a response you could point to Subsection 4.4 (What is a particle?) in my paper ''Foundations of quantum physics I. A critique of the tradition", where I discuss this question in parallel with a corresponding experiment with classical particles (4 mm bullets from an air gun).
Arnold Neumaier (p.33) said:
The paper by Schirber [54] discusses essentially the same phenomenon in a fully classical context, where a bullet is fired into a sheet of glass and produces a large number of radial cracks in random directions. This is shown in the first figure there, whose caption says,

”The number of cracks produced by a projectile hitting a glass sheet provides information on the impactor’s speed and the properties of the sheet.” In the main text, we read ”A projectile traveling at 22.2 meters per second generates four cracks in a 1-millimeter-thick sheet of Plexiglas. [...] A 56.7 meter-per-second projectile generates eight radial cracks in the same thickness Plexiglass sheet as above.” (See also Falcao & Parisio [20] and Vandenberghe & Villermaux [60].)

We see that the discrete, random detection events (the cracks) are a manifestation of broken symmetry when something impacts a material that – unlike water – cannot respond in a radially symmetric way. Randomness is inevitable in the breaking of a radial symmetry into discrete events. The projectile creates an outgoing spherical stress wave in the plexiglas and produces straight cracks. In fact, once initiated, the growth of a crack in a solid is not very different from the growth of a track in a bubble chamber, except that the energies and time scales are quite different. Only the initiation is random.
There we have exactly the same phenomenon: a radially symmetric source (here an impact center) causes a discrete number of cracks, solely through the properties of the detecting matter. The paper [54] is open access and has very nice pictures.
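As a quick sanity check on the quoted numbers (assuming, purely for illustration, a power-law dependence of crack number on impact speed, which is my assumption and not a claim from the cited papers), the two data points fix the exponent:

```python
import math

# Data points quoted from Schirber [54] for a 1 mm Plexiglas sheet:
# 22.2 m/s -> 4 radial cracks, 56.7 m/s -> 8 radial cracks.
v1, n1 = 22.2, 4
v2, n2 = 56.7, 8

# Assuming a power law N ~ v**b, the two points determine b:
b = math.log(n2 / n1) / math.log(v2 / v1)
print(f"exponent b = {b:.2f}")  # roughly 0.74
```

So doubling the crack count requires far more than doubling the speed, consistent with the crack number being a sublinear probe of the impactor's speed.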
 
  • #218
gentzen said:
Bernhard said:
Wenn wir die Idee der Lokalisierung übertreiben, gelangen wir zu einem Bild, das Neumaier teilweise suggeriert, nämlich das der chaotischen Hydrodynamik. Aber klassische chaotische Hydrodynamik wird sicher die Bellschen Kriterien verletzen, d.h. sie ist lokal-realistisch, was wir durch andere Experimente ausschließen können.
(If we exaggerate the idea of localization, we arrive at an image that Neumaier partly suggests, namely that of chaotic hydrodynamics. But classical chaotic hydrodynamics will certainly violate Bell's criteria, i.e. it is local-realistic, which we can exclude by other experiments.)
Hydrodynamics is a coarse-grained approximation in which only 1-point functions are retained. Hence it cannot model microscopic correlation experiments. For the analysis of coincidence measurements one needs the dynamics of 2-point functions, which is not obtained correctly in the hydrodynamic approximation.
But one can work instead with the Kadanoff-Baym equations, which are the bilocal analogue of the hydrodynamic equations, and from which one can get the hydrodynamic equations through further coarse-graining.

The underlying deterministic dynamics is that of the ##N##-point functions of quantum field theory, giving a coupled system for ##N=0,1,2,\ldots##. This dynamics is local in the sense of quantum field theory, i.e., conforming to relativistic causality. However, the dynamics is nonlocal in the sense of Bell, since ##N##-point functions for ##N>1## contain more than one spacetime argument. Therefore Bell's analysis does not apply.
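The point that 1-point (hydrodynamic-style) averages cannot capture correlations, while 2-point functions can, has a simple classical analogue. Here is a toy sketch (my own illustration, not taken from the papers under discussion): two ensembles with identical 1-point statistics but completely different 2-point statistics.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200_000  # ensemble size

# Ensemble A: x and y perfectly anticorrelated (+1/-1 pairs)
x_a = rng.choice([-1.0, 1.0], size=n)
y_a = -x_a

# Ensemble B: x and y independent
x_b = rng.choice([-1.0, 1.0], size=n)
y_b = rng.choice([-1.0, 1.0], size=n)

# 1-point functions agree: all four means are ~0, so a description
# keeping only single-point averages cannot tell A from B.
print(x_a.mean(), y_a.mean(), x_b.mean(), y_b.mean())

# 2-point functions differ sharply: <xy> = -1 for A, <xy> ~ 0 for B.
print((x_a * y_a).mean(), (x_b * y_b).mean())
```

This is why, as stated above, coincidence measurements require at least the dynamics of 2-point functions, which the hydrodynamic approximation discards.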
 
  • #219
https://www.astronews.com/community...etation-der-quantenmechanik.11402/post-137082 :
TomS said:
Das zusammen ergibt die Hypothese, dass die deterministische Zeitentwicklung des Zustandes alles enthält, was man benötigt, um alle objektiven Eigenschaften eines Systems zu berechnen, einschließlich der Ergebnisse einer Messung bzw. eines einzelnen Detektor-Ereignisses. Eine vollständige Lösung des Messproblems umfasst damit auch die Notwendigkeit, zu berechnen, dass genau eines und genau welches der Detektor-Elemente ein "teilchenartiges Detektorereignis" anzeigt.
(Altogether, this leads to the hypothesis that the deterministic time evolution of the state contains everything needed to calculate all the objective properties of a system, including the results of a measurement or a single detector event. A complete solution to the measurement problem thus also includes the need to calculate that exactly one and exactly which of the detector elements indicates a "particle-like detector event".)

Der vermeintliche Indeterminismus ist - analog zur klassische Mechanik - keine intrinsische Eigenschaft der Quantenmechanik. Heißt: wenn ich den initialen Zustand genau genug kenne und die Zeitentwicklung sowie die geeigneten q-expectations genau genug berechne, dann folgt daraus auch, welches einzelne Detektor-Element anspricht. Umgekehrt: wäre dem nicht so, gäbe es ein objektiv zufälliges Element - was Neumaier analog zur klassischen Mechanik explizit ausschließt - das nicht aus dem Zustand folgen würde - was er ebenfalls explizit ausschließt, da dieser alle objektiven Eigenschaften des Systems repräsentiert.
(The apparent indeterminism is - analogous to classical mechanics - not an intrinsic property of quantum mechanics. This means: if I know the initial state exactly enough and calculate the time development as well as the appropriate q-expectations accurately enough, then it also follows which individual detector element responds. Conversely: if this were not the case, there would be an objectively random element - which Neumaier explicitly excludes, analogously to classical mechanics - which would not follow from the state - which he also explicitly excludes, since the state represents all objective properties of the system.)
This is a good exposition of part of my thermal interpretation.
TomS said:
Neumaier gelangt damit m.E. zu einem letztlich trivialen Bild: aus einem exakt bekannten Zustand und einer exakt bekannten und berechenbaren Dynamik folgt ein exaktes Ergebnis inkl. genau eines Detektor-Ereignisses.
(In my opinion, Neumaier thus arrives at an ultimately trivial picture: from an exactly known state and a precisely known and calculable dynamics, an exact result including exactly one detector event follows.)
Nontrivial and new here is only the implied analysis of what precisely the objective properties are, and how they are encoded in the quantum formalism.
TomS said:
Das steht im Widerspruch zur Dekohärenz (This contradicts decoherence)
No - there is no contradiction with decoherence since the latter is completely silent about the behavior of single systems.

Decoherence only claims to compute ensemble properties, where the associated reduced density operator becomes diagonal (in the pointer basis of selected experiments) after averaging over very high frequencies, which is a consequence of the Riemann–Lebesgue lemma. The quantum tomography approach and the thermal interpretation refine the story told by decoherence in terms of averages into a different, more detailed story for each single case.
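The Riemann–Lebesgue mechanism behind the decay of ensemble-averaged coherences can be illustrated with a toy dephasing model (a sketch under the assumption of a broad Gaussian spectrum of environmental frequencies; this is my illustration, not a model from the thread):

```python
import numpy as np

rng = np.random.default_rng(1)

# Assumed broad spectrum of environmental frequencies (Gaussian, sigma = 5).
# Each ensemble member's off-diagonal density-matrix element picks up a
# phase exp(i * omega * t); the diagonal populations are untouched.
omegas = rng.normal(0.0, 5.0, size=100_000)

def averaged_coherence(t):
    """Ensemble average of the off-diagonal phase factor exp(i*omega*t)."""
    return np.exp(1j * omegas * t).mean()

c0 = averaged_coherence(0.0)  # full coherence at t = 0
c5 = averaged_coherence(5.0)  # averaging over fast phases washes it out
print(abs(c0), abs(c5))
```

The ensemble-averaged off-diagonal element decays, yet each individual member of the ensemble still evolves with a perfectly definite phase; this is the sense in which decoherence is silent about single systems.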
 