QM is Feynman path ensemble - is QFT Feynman field ensemble?

In summary: in classical mechanics, single action-optimizing trajectories are used, while QM can be formulated as a Feynman ensemble of trajectories (mathematically analogous to the ensembles used in deriving Brownian motion). QFT can be formulated as a Feynman ensemble of fields, with ensembles of field configurations corresponding to each Feynman diagram. There is a relationship between the statistical mechanics (ensembles) of classical fields and the path integral in quantum field theory. Soliton models are an example of an intermediate approach between classical and quantum mechanics.
  • #1
Jarek 31
While classical mechanics uses a single action-optimizing trajectory, QM can be formulated as a Feynman ensemble of trajectories.
As in the derivation of Brownian motion, it is mathematically convenient to use nonphysical, nowhere-differentiable trajectories - should it be so?

Can this connection be carried over to field theories - can we see QFT as a Feynman ensemble of fields?
Nowhere-differentiable ones, or more physical ones?
With concrete field configurations corresponding to each Feynman diagram?

Generally: what are the similarities and differences between classical and quantum field theories?
E.g. in classical theories we remove infinities with regularization, in quantum theories with renormalization and cutoffs - is there any connection between them?
 
  • #2
Yes, in QFT the path integral is over field configurations.

There is a relationship between the statistical mechanics (ensembles) of classical fields and the path integral in quantum field theory.
https://www.damtp.cam.ac.uk/user/tong/sft/two.pdf
These are notes on statistical field theory, but the similarity with path integrals in quantum field theory is discussed in Section 2.3.
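To make this concrete, the simplest "ensemble of configurations" one can actually sample is the Euclidean path integral of a single harmonic oscillator - QM viewed as a 0+1-dimensional field theory. Here is a minimal Metropolis sketch of my own (not from the notes; lattice size, time step and sweep counts are arbitrary illustrative choices) that samples discretized closed paths with Boltzmann weight ##e^{-S_E}## and estimates ##\langle x^2\rangle##:

```python
import numpy as np

# Minimal Metropolis sampler for the Euclidean path integral of a
# harmonic oscillator (m = omega = hbar = 1).  Discretized action:
#   S_E = sum_i [ (x_{i+1} - x_i)^2 / (2*dt) + dt * x_i^2 / 2 ]
# with periodic imaginary time.  All parameters are illustrative.
rng = np.random.default_rng(0)
N, dt, n_sweeps, step = 32, 0.5, 2000, 1.0   # beta = N*dt = 16
x = np.zeros(N)

def local_action(x, i):
    # the part of S_E that depends on x[i]
    ip, im = (i + 1) % N, (i - 1) % N
    return ((x[ip] - x[i])**2 + (x[i] - x[im])**2) / (2 * dt) \
        + dt * 0.5 * x[i]**2

samples = []
for sweep in range(n_sweeps):
    for i in range(N):
        old, s_old = x[i], local_action(x, i)
        x[i] = old + step * rng.uniform(-1, 1)
        if rng.random() >= np.exp(s_old - local_action(x, i)):
            x[i] = old                      # Metropolis reject
    if sweep >= 200:                        # discard thermalization
        samples.append(np.mean(x**2))

x2 = float(np.mean(samples))  # estimator of <x^2>, near the exact 0.5
```

The same structure - configurations weighted by ##e^{-S_E}## - is what lattice field theory generalizes from paths ##x(\tau)## to fields ##\phi(x,\tau)##.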
 
  • #3
Great, I would also like to think so, but I am not completely certain it is appropriate (?)
A Feynman diagram is a kind of higher abstraction level - there should be, e.g., ensembles of EM field configurations behind the charged/magnetic-dipole particles of a Feynman diagram (?)
[attached image]
So maybe we can also translate methods of handling infinities between them?
In classical field theories we usually use a potential for this purpose - e.g. in sine-Gordon ( https://en.wikipedia.org/wiki/Sine-Gordon_equation ) a potential enforcing ##\varphi = \pm 1##: a kink going e.g. from -1 to +1 would naively require infinite energy, but thanks to the potential it is smoothed to a finite-energy configuration.
Is there a connection between such classical potential-based regularization and QFT renormalization?
 
  • #4
I remember statistical field theory from Fisher's great book; also, in the perturbative QFT approximation we indeed get Feynman diagrams from a field ensemble - by using the Taylor expansion and the Gaussian integration trick:
[attached image: perturbative expansion formulas]
But I really miss the question of e.g. the electric field around an electron in such a Feynman diagram - can we say anything about it (e.g. after also fixing momenta), about the ensemble of such fields?
Shouldn't it be ##E \sim 1/r^2##, leading to a ##\rho \sim 1/r^4## mean energy density around such an electron?
However, that would integrate to infinite energy - classically requiring regularization/deformation: smoothing to finite energy, while in QFT we remove this infinity with cutoffs and renormalization - shouldn't there be a connection between them?

QFT is approached from two opposite directions: nonperturbative, e.g. lattice, which completely forgets about particles, and abstract Feynman diagrams - aren't there attempts at intermediate approaches, e.g. just asking for the ensembles of field configurations behind each Feynman diagram?

Aren't soliton models an example of such an intermediate approach? To consider their scattering with incomplete knowledge, we also need to consider an ensemble of scenarios: Feynman diagrams.
Here is a Feynman-diagram article for the sine-Gordon model: "On the renormalization of the sine–Gordon model" https://arxiv.org/pdf/hep-th/0505276.pdf
 
  • #5
I do not know if this is a good rephrasing:
the Wick rotation (##1 \to i##) relates two physical models, classical to quantum.
Would changing ##1 \to (1+i)/\sqrt{2}## correspond to an intermediate model between classical and quantum?
 
  • #6
Wick rotation is a bit different - it relates, e.g., statistical field theory to quantum field theory.

Earlier in the thread: "Wick rotating" the Feynman path ensemble, we get a Boltzmann path ensemble - e.g. as in Euclidean path integrals, maximal entropy random walk, or the Ising model - this mathematical analogy e.g. allows constructing Bell-violation examples, or QM-like computers in the Ising model.

In this QM-statistical physics analogy:
- mechanics: the classical approximation means a single action-optimizing trajectory, QM means the Feynman path ensemble,
- statistics: the system is approximately in the single lowest-energy state, but more precisely it is in a Boltzmann ensemble over the possible states.
 
  • #7
Heidi said:
I do not know if this is a good rephrasing:
the Wick rotation (##1 \to i##) relates two physical models, classical to quantum.
Would changing ##1 \to (1+i)/\sqrt{2}## correspond to an intermediate model between classical and quantum?
Wick rotation is just a mathematical trick to map a calculational problem in vacuum QFT from Minkowski to Euclidean space. In this Euclidean version of the QFT it's easier to argue about some formal mathematical aspects, e.g., in renormalization theory (see the famous work by BPHZ). You can evaluate all the (perturbative) proper vertex functions and thus ##N##-point Green's functions of the theory in Euclidean field theory, but then you have the trouble of analytically continuing back to "real time" (Minkowski space). This has nothing to do with some "classical to quantum" transition or something like that.

Another example where imaginary times occur in physics is (both relativistic and non-relativistic) many-body QFT. In thermal equilibrium you need to evaluate expectation values of field-operator products ("thermal Green's functions") with the statistical operator ##\hat{\rho}=\exp(-\beta \hat{H}+\mu \beta \hat{Q})/Z## (and you also have to evaluate ##Z##, the partition sum, too!). Apart from the partition sum in the denominator, the statistical operator looks similar to a time-evolution operator ##\hat{U}=\exp(-\mathrm{i} \hat{H} t)## with ##t=-\mathrm{i} \beta##.

This implies that you can formulate a perturbation theory like in vacuum QFT when formally setting ##t=-\mathrm{i} \tau##, where ##\tau \in (0,\beta)##. Since the expectation values of some operator ##\hat{A}## are given by the trace ##\mathrm{Tr}(\hat{\rho} \hat{A})##, it turns out that you have to impose periodic (antiperiodic) boundary conditions for bosonic (fermionic) field operators.

The results are thus perturbative Feynman rules and Feynman diagrams which look identical to the vacuum Feynman diagrams, but instead of an integral over the energies as in the vacuum case you have sums over the discrete Matsubara frequencies ##\omega_n = 2\pi n/\beta## (bosons) or ##\omega_n=(2n+1)\pi/\beta## (fermions) with ##n \in \mathbb{Z}##, and the propagators are those of the Euclidean-field-theory free fields.

Also this has nothing to do with some classical vs. quantum transition.
 
  • #8
vanhees71 said:
Also this has nothing to do with some classical vs. quantum transition.
I have to admit that I don't understand the difference between classical and quantum phase transitions (?)
For example, the transverse-field Ising model is usually pointed to as the simplest quantum-phase-transition model ... but it is not a problem to also solve it as just a Boltzmann ensemble, and it turns out the predictions are quite similar: "Classical Ising chain in transverse field" https://www.sciencedirect.com/science/article/pii/S0304885306016295
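The quantum ↔ Boltzmann-ensemble correspondence behind this can be made completely explicit in the simplest case: a single spin in a transverse field. Trotterizing imaginary time turns its partition function exactly into that of a classical Ising chain. A small numerical check of my own (##\beta##, field strength and slice number are arbitrary illustrative choices):

```python
import numpy as np

# Single quantum spin in a transverse field, H = -G * sigma_x:
#   Z_quantum = Tr exp(-beta*H) = 2*cosh(beta*G).
# Slicing imaginary time into N steps of eps = beta/N maps each slice
# <s'|exp(eps*G*sigma_x)|s> onto a classical Ising link weight
# A*exp(K*s*s'), with K = -log(tanh(eps*G))/2, A = sqrt(sinh(2*eps*G)/2).
# Parameters are illustrative.
beta, G, N = 2.0, 1.3, 400
eps = beta / N

Z_quantum = 2 * np.cosh(beta * G)

K = -0.5 * np.log(np.tanh(eps * G))
A = np.sqrt(0.5 * np.sinh(2 * eps * G))
# classical transfer matrix over s, s' in {+1, -1}
T = A * np.exp(K * np.array([[1.0, -1.0], [-1.0, 1.0]]))
Z_classical = float(np.trace(np.linalg.matrix_power(T, N)))
```

Here the mapping is exact slice by slice; for the full transverse-field Ising chain (with a ##\sigma_z\sigma_z## coupling) the same construction carries a Trotter error that vanishes as ##N \to \infty## - the "quantum" model becomes a 2D classical Boltzmann ensemble.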
 
  • #9
Sure, but isn't this rather the (semi-)classical limit of (equilibrium) statistical mechanics, i.e., where you can approximate the Bose-Einstein/Fermi-Dirac distributions by the Boltzmann distribution?
 
  • #10
Bose-Einstein/Fermi-Dirac distributions can also be derived combinatorially: just choosing whether the number of particles per node can be any ##N## or is restricted to ##\{0,1\}##, with energy e.g. as in Bose-Hubbard: ##n(n-1)##.

The Wick-rotation difference is more subtle, e.g.:
- the minimal-energy-state approximation is natural for the Boltzmann distribution, while for Feynman this kind of approximation needs something more subtle, e.g. the saddle-point method,
- in the Boltzmann distribution we lose the phase - e.g. removing interference,
- in Boltzmann we get attraction to the lowest-energy state, while in Feynman the excited states are still stable (parabolic vs. hyperbolic evolution) - e.g. Euclidean path integrals are numerically used mainly to find the ground state, and in maximal entropy random walk excited states are metastable.

Here is a MERW ( https://en.wikipedia.org/wiki/Maximal_entropy_random_walk - Boltzmann path ensemble) evolution example in a defected lattice - it sees the excited states, but they are not stable as in Schrödinger: the density converges to the stationary density corresponding to the Schrödinger ground state:
[attached image: MERW evolution in a defected lattice]
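For readers who want to reproduce the flavor of this: the MERW construction itself is only a few lines. Given an adjacency matrix ##A## with dominant (Perron) eigenpair ##(\lambda, \psi)##, the transitions are ##P_{ij} = \frac{A_{ij}}{\lambda}\frac{\psi_j}{\psi_i}## and the stationary density is ##\rho_i \propto \psi_i^2## - the same form as a quantum ground-state density. A sketch on a 1D lattice with weakened "defect" links (sizes and weights are my illustrative choices):

```python
import numpy as np

# MERW on a 1D lattice of n nodes with two weakened "defect" links.
# Transition matrix: P[i, j] = A[i, j] / lam * psi[j] / psi[i],
# stationary density: rho_i = psi_i^2 / sum(psi^2).
n = 40
A = np.zeros((n, n))
for i in range(n - 1):
    A[i, i + 1] = A[i + 1, i] = 1.0
for d in (10, 30):                      # defect links, reduced weight
    A[d, d + 1] = A[d + 1, d] = 0.2

w, v = np.linalg.eigh(A)
lam, psi = w[-1], np.abs(v[:, -1])      # Perron eigenpair (psi > 0)

P = (A / lam) * (psi[None, :] / psi[:, None])   # MERW transitions
rho = psi**2 / np.sum(psi**2)                   # stationary density
# rho localizes in the longest defect-free segment (nodes 11..30),
# analogous to a ground-state density trapped between barriers
```

One can check that each row of `P` sums to 1 (since ##A\psi = \lambda\psi##) and that ##\rho## is exactly stationary under `P`.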
 
  • #12
Exactly, they focus on the "Euclidean gauge": there is no imaginary 'i' in the exponent - this is the Wick-rotated picture, focused on the lowest-energy ground state.
 
  • #13
Sure, lattice QCD only works in the Euclidean QFT formulation. Real-time quantities, e.g. transport coefficients, are a big problem there!
 
  • #14
Indeed, but mathematically this makes it just a ("classical") Boltzmann distribution - statistical mechanics on fields, neglecting the quantum phase - blurring the boundary between classical and quantum mechanics.

Or MERW - just a "classical" random walk chosen according to Jaynes' principle of maximal entropy ( https://en.wikipedia.org/wiki/Principle_of_maximum_entropy ) - mathematically leading to the same stationary probability distribution as the ground state of quantum models like Hubbard or Schrödinger.

This raises the question of understanding the classical-quantum boundary.
The step from the single lowest-energy state to the Boltzmann ensemble is analogous to the step from the single optimal-action trajectory to the Feynman ensemble.

The naive response is that phase is crucial for QM - true e.g. for interference, but many "quantum" phenomena are already there in the Boltzmann ensemble, like Anderson localization ( https://en.wikipedia.org/wiki/Anderson_localization ) already obtained in MERW ( https://journals.aps.org/prl/abstract/10.1103/PhysRevLett.102.160602 ).
 
  • #15
Jarek 31 said:
Wick rotation is a bit different - it relates, e.g., statistical field theory to quantum field theory.
Jarek 31 said:
"Wick rotating" the Feynman path ensemble, we get a Boltzmann path ensemble - e.g. as in Euclidean path integrals,
Feynman paths (resp. fields) form an ensemble only in imaginary-time QM (resp. Euclidean QFT), since only there does one have a meaningful (i.e., positive) measure. In real-time QM the Feynman measure is well-defined but complex, and one cannot associate stochastic processes (hence ensembles) to complex measures. In interacting Minkowski QFT, the Feynman measure is not even well-defined.
 
  • #16
Jarek 31 said:
Indeed, but mathematically this makes it just a ("classical") Boltzmann distribution - statistical mechanics on fields, neglecting the quantum phase - blurring the boundary between classical and quantum mechanics.

Or MERW - just a "classical" random walk chosen according to Jaynes' principle of maximal entropy ( https://en.wikipedia.org/wiki/Principle_of_maximum_entropy ) - mathematically leading to the same stationary probability distribution as the ground state of quantum models like Hubbard or Schrödinger.

This raises the question of understanding the classical-quantum boundary.
The step from the single lowest-energy state to the Boltzmann ensemble is analogous to the step from the single optimal-action trajectory to the Feynman ensemble.

The naive response is that phase is crucial for QM - true e.g. for interference, but many "quantum" phenomena are already there in the Boltzmann ensemble, like Anderson localization ( https://en.wikipedia.org/wiki/Anderson_localization ) already obtained in MERW ( https://journals.aps.org/prl/abstract/10.1103/PhysRevLett.102.160602 ).
No, the imaginary-time (Matsubara) formalism takes full quantum statistics into account. The ground state is of course a special case, namely ##T \rightarrow 0## (##\beta \rightarrow \infty##).
 
  • #17
Regarding "imaginary time", I haven't seen any reasonable interpretation (?) - in spacetime we have 4 real coordinates.
So maybe, instead of interpreting it as imaginary time, can't we just interpret it as a Boltzmann distribution - statistical physics (just the mathematically universal principle of maximal entropy)?

Regarding Feynman ensembles, indeed the phase makes things more complicated; classical mechanics is obtained in approximations like the saddle-point method or the van Vleck determinant.

Regarding excited states in the Euclidean/imaginary-time, statistical-physics interpretation: I have mentioned that they are indeed there - but as metastable, no longer stable.
The Feynman ensemble leads to hyperbolic PDEs with eigenvalues on the unit circle, while Boltzmann leads to parabolic PDEs with eigenvalues on the real axis.
As in the MERW evolution diagram above, starting close to an excited state, the density will first localize there, but finally it will "deexcite" to the ground state.
This differs from e.g. Schrödinger, where excited states are stable - an excited atom requires an external perturbation to deexcite.
Should an atom remain excited indefinitely without external perturbations?
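The stable-vs-metastable distinction can be checked directly on a discretized Hamiltonian: under unitary evolution ##e^{-iHt}## the weight of an excited eigenstate is conserved, while under normalized imaginary-time evolution ##e^{-H\tau}## it decays toward the ground state. A sketch of my own (grid, potential and times are arbitrary illustrative choices):

```python
import numpy as np

# Discretized 1D Hamiltonian H = -(1/2) d^2/dx^2 + x^2/2 (harmonic well).
n = 60
x = np.linspace(-5, 5, n)
dx = x[1] - x[0]
H = (np.diag(1.0 / dx**2 + 0.5 * x**2)
     - np.diag(np.full(n - 1, 0.5 / dx**2), 1)
     - np.diag(np.full(n - 1, 0.5 / dx**2), -1))
w, v = np.linalg.eigh(H)
ground, excited = v[:, 0], v[:, 1]

psi0 = excited + 0.05 * ground              # mostly-excited start
psi0 = psi0 / np.linalg.norm(psi0)

t = 50.0
U_real = v @ np.diag(np.exp(-1j * w * t)) @ v.T   # unitary exp(-iHt)
U_imag = v @ np.diag(np.exp(-w * t)) @ v.T        # exp(-H*tau)

psi_r = U_real @ psi0.astype(complex)
psi_r = psi_r / np.linalg.norm(psi_r)
psi_i = U_imag @ psi0
psi_i = psi_i / np.linalg.norm(psi_i)

p_exc_real = abs(excited @ psi_r)**2    # conserved excited weight
p_exc_imag = abs(excited @ psi_i)**2    # decays: "deexcitation"
p_gnd_imag = abs(ground @ psi_i)**2     # converges to the ground state
```

The unitary (hyperbolic) evolution only rotates phases of the eigenmodes, while the imaginary-time (parabolic) evolution damps each mode by ##e^{-E_n\tau}##, so after renormalization only the ground state survives.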
 
  • #18
As I said, "imaginary time" is just a mathematical trick to evaluate Feynman diagrams (both in vacuum and equilibrium QFT). For the fully general off-equilibrium case there's the Schwinger-Keldysh time-contour formalism, and here you are best off with the completely real time contour (but that's a matter of opinion, because there's also "thermo-field dynamics", where you first go along the real axis, then half-way down to ##t_{\text{end}}-\mathrm{i} \beta/2##, then back parallel to the real axis to ##t_{\text{ini}}-\mathrm{i} \beta/2##, and then again parallel to the imaginary axis to the final point ##t_{\text{ini}}-\mathrm{i} \beta## :-).
 
  • #19
Sure, "imaginary time" is rather just a mathematical trick.

But the Boltzmann distribution isn't - it is the effect of the principle of maximal entropy, e.g. while fixing the mean energy. From https://en.wikipedia.org/wiki/Principle_of_maximum_entropy :
The principle of maximum entropy states that the probability distribution which best represents the current state of knowledge about a system is the one with largest entropy, in the context of precisely stated prior data (such as a proposition that expresses testable information).
The principle of maximal entropy is just universal combinatorics - e.g. from the Stirling approximation:
##\binom{n}{pn} \sim 2^{n h(p)}##
for the Shannon entropy ##h(p) = -p \log_2 p - (1-p)\log_2(1-p)##, which has a single maximum, ##h(1/2)=1##.
Hence, among length-##n## sequences of 0s and 1s with ##pn## ones, the safest choice is the entropy-maximizing ##p=1/2##, as that subset asymptotically dominates any other assumption for ##p##.
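This asymptotic dominance is easy to verify numerically: the rate ##\log_2\binom{n}{pn}/n## approaches ##h(p)## and is maximal at ##p=1/2##. A quick check (the value of ##n## is an arbitrary illustrative choice):

```python
import math

# Check the Stirling-type asymptotics: log2(C(n, p*n)) / n -> h(p),
# with the Shannon entropy h(p) maximal at p = 1/2.
def h(p):
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

n = 10000                      # illustrative size
rates = {}
for p in (0.1, 0.25, 0.5):
    rates[p] = math.log2(math.comb(n, int(p * n))) / n
# rates[p] ~ h(p) up to an O(log(n)/n) correction; rates[0.5] is largest
```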

While the "imaginary time" interpretation is just a mathematical trick, the mathematically equivalent "Boltzmann distribution" interpretation is literally the safest choice from a universal mathematical perspective ...
So which interpretation is "better"?
 
  • #20
Indeed, the canonical or grand-canonical statistical operator derives from the maximum-entropy principle and can be argued for most convincingly by introducing entropy as a measure of lack of information, which must be maximized given the known constraints in order to find the statistical description with minimal prejudice. On the other hand, it also follows from kinetic theory, aka the Boltzmann(-Uehling-Uhlenbeck) equation.
 
  • #21
Such a Boltzmann perspective is more natural than "rotation to imaginary time" and can be well understood.
The difficulty of the Feynman ensemble is the added quantum phase, e.g. leading to interference.

For example, there are hydrodynamical analogs of the Casimir effect - e.g. two plates in a liquid get attracted if the tank is shaken.
The quantum Casimir effect, instead of such external shaking, needs some intrinsic energy source leading to the phase evolution ... like the de Broglie clock/zitterbewegung of a particle, confirmed experimentally for the electron ( https://link.springer.com/article/10.1007/s10701-008-9225-1 ).
 
  • #22
Jarek 31 said:
While the "imaginary time" interpretation is just a mathematical trick, the mathematically equivalent "Boltzmann distribution" interpretation is literally the safest choice from a universal mathematical perspective ...
So which interpretation is "better"?
Euclidean QFT is the analytic continuation of ordinary (Minkowski or Galilei) QFT to imaginary time. It changes the dynamics completely and is mathematically equivalent (Osterwalder-Schrader theorem) only by changing the meaning of all terms. A physical interpretation is possible only in the Minkowski version. In the Euclidean version all information obtained must be reinterpreted before it makes physical sense - e.g., Euclidean decay rates become Minkowski frequencies. Thus the only safe interpretation is the real-time = Minkowski version (Feynman pseudo-stochastic view), while the imaginary-time = Euclidean version (stochastic ensemble view) is only a computational proxy for some field theories (in practice only lattice QCD).

Except in equilibrium, where time does not matter and 4D QFT reduces to 3D statistical mechanics with its standard stochastic interpretation.
 
  • #23
But from the other side, the Boltzmann distribution is the mathematically universal application of the maximal-entropy principle - there are very strong mathematical motivations to use it.
Why can't we focus for a moment on this motivation - see the situation from just this perspective?
E.g. as an approximation before adding the quantum phase of the Feynman ensemble?

Also, a practical difference between them is whether excited states are stable (Feynman) or metastable (Boltzmann ensemble) - e.g. is an excited atom a stable or a metastable state?
In other words: should an excited atom in a completely empty and calm universe deexcite?

Ps. For classical objects with wave-particle duality, orbit quantization is also observed (e.g. https://www.nature.com/articles/ncomms4219 , https://journals.aps.org/prl/abstract/10.1103/PhysRevLett.113.104101 ), as well as quantum-like statistics (e.g. https://journals.aps.org/pre/abstract/10.1103/PhysRevE.88.011001 ).
 
  • #24
A. Neumaier said:
Except in equilibrium, where time does not matter and 4D QFT reduces to 3D statistical mechanics with its standard stochastic interpretation.
With the important qualification that in this case the fields must be periodic for bosons (antiperiodic for fermions) in imaginary time, with period ##\beta=1/T##.
 
  • #25
This equilibrium, in the intersection of these two worlds (quantum and statistical physics), is the ground state, and usually (e.g. for Schrödinger) it is non-degenerate and its wavefunction can be chosen real and nonnegative (a generalization of the Frobenius-Perron theorem) - the quantum phase disappears (e.g. Faris's articles such as https://projecteuclid.org/journals/duke-mathematical-journal/volume-42/issue-3/Degenerate-and-non-degenerate-ground-states-for-Schrödinger-operators/10.1215/S0012-7094-75-04251-9.short ).
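The Frobenius-Perron statement is easy to see numerically: for a discretized ##H = -\partial_x^2 + V(x)## the lowest eigenvector is nodeless and can be taken strictly positive, while the first excited state changes sign. A sketch of my own with an illustrative double-well potential:

```python
import numpy as np

# Discretized H = -d^2/dx^2 + V(x) with a double-well V (2m = 1 units).
n = 200
x = np.linspace(-3, 3, n)
dx = x[1] - x[0]
V = (x**2 - 2)**2 / 4
H = (np.diag(2.0 / dx**2 + V)
     - np.diag(np.full(n - 1, 1.0 / dx**2), 1)
     - np.diag(np.full(n - 1, 1.0 / dx**2), -1))
w, v = np.linalg.eigh(H)

psi0 = v[:, 0] * np.sign(np.sum(v[:, 0]))   # fix the overall sign
psi1 = v[:, 1]
# psi0 is strictly positive (nodeless ground state, Perron-Frobenius);
# psi1 has a node: it changes sign between the two wells
```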
 
  • #26
But the equilibrium case is very special. The max-entropy principle only holds for completely isolated systems in equilibrium. In some approximation it holds for systems at a fixed time in local equilibrium, but this requires singling out a time, hence it is not a good basis for relativistic quantum fields.
 
  • #27
The Schrödinger equation usually also represents an isolated system of a nucleus and one or a few electrons.
And from experiment we know that such an excited atom has a tendency to deexcite to the ground-state equilibrium - exactly as the Boltzmann path ensemble would predict.

In contrast, the Feynman path ensemble predicts that such an atom should stay excited forever ...
 
  • #28
Only if you neglect spontaneous emission due to the coupling of the charges to the quantized em. field.
 
  • #29
Sure, there is no doubt that the Feynman path ensemble can be extended to agree with experiment ... but Boltzmann here just agrees in this matter "out of the box".
 
  • #30
Jarek 31 said:
The Schrödinger equation usually also represents an isolated system of a nucleus and one or a few electrons.
And from experiment we know that such an excited atom has a tendency to deexcite to the ground-state equilibrium - exactly as the Boltzmann path ensemble would predict.
No - an excited state of an isolated system is completely stable. It decays to the ground state only when exposed to an external field. The quantitatively exact decay is not described by statistical mechanics but by perturbation theory (interaction with a continuous spectrum). See, e.g., the Wikipedia article on spontaneous emission; a simpler model is the Weisskopf-Wigner atom.
 
  • #31
The isolated system, however, is given by the nucleus, the electrons, and the (quantized) em field. That's why, even without an additional electromagnetic field, the atom gets deexcited by spontaneous emission of photons (as you write yourself).
 
  • #32
Sure, from the Feynman-ensemble perspective.
But simultaneously there is also the thermodynamical/statistical-physics perspective, e.g. the tendency to increase entropy, for example by spreading energy as in
"excited atom -> deexcited atom + photon"
... against QM unitarity maintaining the von Neumann entropy.

I think the main problem with understanding quantum mechanics is trying to see everything from a single perspective, while physics is not that simple - there are multiple perspectives complementing each other.
We have wave-particle duality: while the Feynman ensemble focuses on the wave part, the Boltzmann ensemble focuses more on the particle part of the duality.
 
  • #33
There is no wave-particle duality in modern quantum theory. The example with the atom is QFT at zero temperature, i.e., there is no Boltzmann ensemble.
 
  • #34
So why do we observe entropy growth in unitary quantum evolution?
 
  • #35
Entropy stays constant under unitary time evolution.
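This is straightforward to verify numerically: the von Neumann entropy ##S = -\mathrm{Tr}(\rho \ln \rho)## depends only on the eigenvalues of ##\rho##, which unitary conjugation leaves untouched. A quick check with a random mixed state (dimension, seed and evolution time are arbitrary illustrative choices):

```python
import numpy as np

# Von Neumann entropy is invariant under rho -> U rho U^dagger.
rng = np.random.default_rng(1)

def entropy(rho):
    p = np.linalg.eigvalsh(rho)
    p = p[p > 1e-12]                 # drop numerical zeros
    return float(-np.sum(p * np.log(p)))

# random mixed state: rho = M M^dagger, normalized to unit trace
M = rng.normal(size=(4, 4)) + 1j * rng.normal(size=(4, 4))
rho = M @ M.conj().T
rho = rho / np.trace(rho).real

# random unitary U = exp(-i*Hm*t) built from a random Hermitian Hm
B = rng.normal(size=(4, 4)) + 1j * rng.normal(size=(4, 4))
Hm = (B + B.conj().T) / 2
w, U0 = np.linalg.eigh(Hm)
U = U0 @ np.diag(np.exp(-1j * w * 0.7)) @ U0.conj().T

S_before = entropy(rho)
S_after = entropy(U @ rho @ U.conj().T)  # unchanged by the evolution
```

Observed entropy growth thus has to come from elsewhere, e.g. coarse-graining or tracing out part of the system, not from the unitary evolution itself.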
 
