Complex Numbers Not Necessary in QM: Explained

In summary, the conversation discusses the necessity of complex numbers in physics, particularly in quantum mechanics. While some argue that they are not needed and can be replaced with other mathematical tools, others point out that complex numbers have unique properties that are important in applications. The conversation also touches on the use of real numbers in physics and how they can be difficult to justify physically. Ultimately, the question is raised as to why complex numbers are singled out for removal in quantum mechanics, when other mathematical abstractions are accepted and used in physics.
  • #211
Tendex said:
I hope he can answer himself what exactly he meant.
By unitary evolution I am referring to the constant time evolution in the Schrodinger picture, exemplified by the constancy of the scalar product of two elements of the Hilbert space.

The holomorphicity I am referring to is that of the complex geometric nature underlying the entire projective Hilbert space itself, i.e. the fact that the theory of Riemann surfaces directly and intimately underlies the mathematical framework of QM, even directly tying into spin.

In contrast, seen as a direct application of this purely mathematical framework, measurements - i.e. orthogonal alternatives - are reflective, i.e. complex conjugate, and therefore precisely non-holomorphic, hence necessarily inconsistent with the very theory of Riemann surfaces underlying the entire projective Hilbert space description.
Tendex said:
On rereading my post #201 above, where I was referring to the following assertion by @Auto-Didact: "unitary evolution is a completely holomorphic notion"
To be clear, none of these are my own original assertions, I'm merely parroting old mathematical physics literature; doing QM using complex projective geometry is very much old hat. A few sources:
- https://arxiv.org/abs/gr-qc/9706069
- https://arxiv.org/abs/quant-ph/9906086
- Nielsen & Chuang
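The constancy of the scalar product under unitary evolution described above is easy to verify numerically; a minimal sketch assuming NumPy, with an arbitrary random Hermitian matrix standing in for the Hamiltonian:

```python
import numpy as np

rng = np.random.default_rng(0)

# Random Hermitian "Hamiltonian" on a 4-dimensional Hilbert space
A = rng.normal(size=(4, 4)) + 1j * rng.normal(size=(4, 4))
H = (A + A.conj().T) / 2

# Unitary time-evolution operator U = exp(-i H t), via eigendecomposition
t = 1.3
evals, V = np.linalg.eigh(H)
U = V @ np.diag(np.exp(-1j * evals * t)) @ V.conj().T

# Two arbitrary state vectors
psi = rng.normal(size=4) + 1j * rng.normal(size=4)
phi = rng.normal(size=4) + 1j * rng.normal(size=4)

# The scalar product before and after evolution agrees to machine precision
before = np.vdot(psi, phi)
after = np.vdot(U @ psi, U @ phi)
print(np.abs(before - after))  # ~ 0 (machine precision)
```

Note that `np.vdot` conjugates its first argument, so it implements the Hilbert-space inner product directly.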
 
  • #212
Auto-Didact said:
By unitary evolution I am referring to the constant time evolution in the Schrodinger picture, exemplified by the constancy of the scalar product of two elements of the Hilbert space.

The holomorphicity I am referring to is that of the complex geometric nature underlying the entire projective Hilbert space itself, i.e. the fact that the theory of Riemann surfaces directly and intimately underlies the mathematical framework of QM, even directly tying into spin.

In contrast, seen as a direct application of this purely mathematical framework, measurements - i.e. orthogonal alternatives - are reflective, i.e. complex conjugate, and therefore precisely non-holomorphic, hence necessarily inconsistent with the very theory of Riemann surfaces underlying the entire projective Hilbert space description.

To be clear, none of these are my own original assertions, I'm merely parroting old mathematical physics literature; doing QM using complex projective geometry is very much old hat. A few sources:
- https://arxiv.org/abs/gr-qc/9706069
- https://arxiv.org/abs/quant-ph/9906086
- Nielsen & Chuang
Thanks for clarifying that you were thinking of the projective Hilbert space when talking about holomorphicity. This is no longer the complex Hilbert vector space of states in which Schrodinger's linear time-evolution equation lives (the projective space is not even a vector space, so it is not a subset of Hilbert space but a different space altogether); rather, it is related to rays and physical measurements. It is important not to confuse the Hilbert space with the projective space, even though one can work with a projective representation that is in bijection with linear representations of the relevant covering groups of the Lie groups used in quantum theory. This matters at least for the mathematical incompatibility you were putting forward between the unitary part and the measurement part: by referring to the projective space you are mixing the two, and this invalidates the argument. The reason is that the physical equivalence relation that takes one from the vector space to the projective space already requires the notion of measurement, for which the Born rule gives the outcome probabilities for a system in a given state; the clean separation you draw between holomorphic evolution and non-holomorphic measurements is then no longer possible.
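The distinction between Hilbert-space vectors and rays of the projective space can be made concrete; a minimal sketch assuming NumPy, using the standard Fubini-Study distance ##d_{FS} = \arccos|\langle\psi|\phi\rangle|## for normalized states:

```python
import numpy as np

# Two Hilbert-space vectors that differ only by a global phase
psi = np.array([1.0, 1.0j]) / np.sqrt(2)
phi = np.exp(1j * 0.7) * psi

# As Hilbert-space vectors they are distinct...
print(np.allclose(psi, phi))  # False

# ...but they lie on the same ray: the Fubini-Study distance between the
# corresponding points of projective Hilbert space vanishes
overlap = np.abs(np.vdot(psi, phi))
d_fs = np.arccos(np.clip(overlap, -1.0, 1.0))
print(d_fs)  # ~ 0, i.e. the same physical state

# Born probabilities (measurement statistics) coincide as well
print(np.allclose(np.abs(psi) ** 2, np.abs(phi) ** 2))  # True
```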
 
  • #213
Tendex said:
at least for the purpose of the problem of mathematical incompatibility you were putting forward between the unitary part and the measurement part since by referring to the projective space you are mixing them and this invalidates the argument.
I'm not mixing them; I'm saying that Hilbert space is, conceptually speaking, secondary to the more primary notion of projective Hilbert space: given the projective Hilbert space, the Hilbert space itself practically becomes an obsolete concept.

In other words, I am saying that textbook QM (as well as orthodox QM) - which contains the mathematical inconsistency, reflected conceptually as the measurement problem - is merely an approximation of a deeper description, namely the geometric QM framework from mathematical physics. That framework resolves the mathematical inconsistency of the orthodox/textbook framework by going far beyond it mathematically, and in doing so naturally offers solutions to the measurement problem.

All of this is accomplished in geometric QM by incorporating the tools of projective geometry, differential geometry, symplectic geometry, Riemann surfaces, algebraic geometry, nonlinear dynamics and so on, treating them not as optional methodologies applied to an already completed framework of QM, but as generally unrecognized, necessary and unique aspects of the mathematical framework actually underlying QM.

This more sophisticated point of view exposes the orthodox foundation of QM based on operator-algebra axiomatics - i.e. the traditional foundation learned in school - as foundationally vacuous: it is a purely instrumentalist operationalization, an unwarranted attempt to justify QM as an empirical science. That attempt is moreover completely unnecessary, because physics itself, as a mathematical object/framework/method, already carries this justification through the original framework of first principles laid down by Newton.
Tendex said:
Basically because using the physical equivalence relation that allows going from vector space to projective space requires the notion of measurement of which the Born rule gives the probability for a system on the given state, and then the clean separation between holomorphic evolution and non-holomorphic measurements you draw is not possible anymore.
Going to projective Hilbert space doesn't require the notion of measurement per se: one could have started purely geometrically and literally defined measurement from there, from geometric first principles, instead of the other way around, i.e. basing the theory on experimental results as was done historically.

It is precisely the possibility of adopting this ahistorical view of what is mathematically more natural that defines what is actually necessary for a physical theory viewed as a mathematical framework or object. Incidentally, this is why it is a quite standard view of theory construction in mathematical physics, in (classical, i.e. non-formalist) pure mathematics and in modern applied mathematics, as opposed to modern theoretical physics, which instead puts undue emphasis on experiments. History teaches us that we must always attempt to go beyond experiment using the view of the former, not the latter.
 
  • #214
Tendex said:
and then the clean separation between holomorphic evolution and non-holomorphic measurements you draw is not possible anymore.
Incidentally, there is a theory/large research programme in mathematical physics which applies standard algebraic-geometry methods to projective geometric spaces precisely in order to retain full holomorphicity: complex conjugation is literally redefined as an analytic object through a unique algebraic duality, one completely isomorphic to the standard quantization procedure of replacing variables with operators.
 
  • #215
Auto-Didact said:
geometric QM framework from mathematical physics - which naturally resolves the mathematical inconsistency in the orthodox/textbook QM framework by mathematically going far beyond them and so even naturally offering solutions to the measurement problem.
How should this settle the measurement problem? It still suffers from the unique outcome problem.
Auto-Didact said:
one could've started purely geometrically and literally defined measurement from there, purely from geometric first principles
How? How then is Born's rule derived from such a definition of measurement?
 
  • #216
A. Neumaier said:
How should this settle the measurement problem? It still suffers from the unique outcome problem.
It reduces the entire scheme to a holomorphic one; this is already more mathematically consistent than the orthodox framework, which is a forced patchwork of two incompatible mathematical frameworks. Moreover, the geometric framework naturally suggests multiple pathways to unification with GR. This is addressed in Ashtekar 1997; I quote the abstract:
States of a quantum mechanical system are represented by rays in a complex Hilbert space. The space of rays has, naturally, the structure of a Kähler manifold. This leads to a geometrical formulation of the postulates of quantum mechanics which, although equivalent to the standard algebraic formulation, has a very different appearance. In particular, states are now represented by points of a symplectic manifold (which happens to have, in addition, a compatible Riemannian metric), observables are represented by certain real-valued functions on this space and the Schrödinger evolution is captured by the symplectic flow generated by a Hamiltonian function. There is thus a remarkable similarity with the standard symplectic formulation of classical mechanics. Features---such as uncertainties and state vector reductions---which are specific to quantum mechanics can also be formulated geometrically but now refer to the Riemannian metric---a structure which is absent in classical mechanics. The geometrical formulation sheds considerable light on a number of issues such as the second quantization procedure, the role of coherent states in semi-classical considerations and the WKB approximation. More importantly, it suggests generalizations of quantum mechanics. The simplest among these are equivalent to the dynamical generalizations that have appeared in the literature. The geometrical reformulation provides a unified framework to discuss these and to correct a misconception. Finally, it also suggests directions in which more radical generalizations may be found.
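For reference, the Kähler structure on the space of rays referred to in the abstract is the Fubini-Study geometry; its line element can be written in the following standard form (a textbook expression, up to normalization conventions, not a quotation from the paper):

```latex
% Fubini-Study line element on projective Hilbert space, written in
% terms of an (unnormalized) representative |psi> of a ray:
ds^2_{FS} \;=\;
  \frac{\langle d\psi \,|\, d\psi \rangle \, \langle \psi \,|\, \psi \rangle
        \;-\; \lvert \langle \psi \,|\, d\psi \rangle \rvert^{2}}
       {\langle \psi \,|\, \psi \rangle^{2}}
```

Its real part supplies the Riemannian metric and its imaginary part the symplectic form mentioned in the abstract.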

Unique outcomes aren't themselves problematic if the quantum phase space is treated stochastically a la Nelson. John Baez addresses such things at length:
https://johncarlosbaez.wordpress.com/2018/12/01/geometric-quantization-part-1/
A. Neumaier said:
How? How then is Born's rule derived from such a definition of measurement?
A completely straightforward pathway directly from mathematical physics based on the geometric framework is, for example, to use the Fubini-Study metric in conjunction with the geometric (hydrodynamic) Madelung transform to obtain a derivation of the Born rule directly from geometric first principles; this derivation can then proceed a la Durr & Teufel, yielding the Born rule as a theorem.
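For orientation, the Madelung transform invoked here is the polar decomposition of the wavefunction; a schematic sketch of the standard textbook form (not the specific derivation in Durr & Teufel):

```latex
% Madelung (polar) decomposition of the wavefunction
\psi(x,t) = \sqrt{\rho(x,t)}\, e^{\,i S(x,t)/\hbar}
% so the Born-rule density is the hydrodynamic density by construction:
|\psi|^2 = \rho
% The Schrodinger equation then splits into a continuity equation
\partial_t \rho + \nabla \cdot \Bigl( \rho \, \frac{\nabla S}{m} \Bigr) = 0
% plus a Hamilton-Jacobi equation with a quantum-potential term
\partial_t S + \frac{(\nabla S)^2}{2m} + V
  - \frac{\hbar^2}{2m} \frac{\nabla^2 \sqrt{\rho}}{\sqrt{\rho}} = 0
```

The point of such derivations is then to argue, from the geometry, that ##\rho = |\psi|^2## is the unique equivariant measure.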

Another suggested extension or alternate route of such a derivation based on the geometric framework was recently given by Lindgren & Liukkonen from an applied mathematics PDE theory viewpoint; they take as an ansatz the minimal expected action from (Nelsonian) stochastic control theory applied to the Telegrapher's equation, from which the Born rule naturally follows as a consequence.
 
  • #217
A. Neumaier said:
How should this settle the measurement problem? It still suffers from the unique outcome problem.
I completely forgot: deferring to a Nelsonian approach isn't even necessary here to resolve the problem with unique outcomes, because this specific problem has already been solved - even in the case of a single particle - using purely geometric methods:

The mathematical reason for unique measurement outcomes in single-particle wavefunctions is the non-local nature of the system, i.e. the presence of some cohomology element ##\eta##: for any sufficiently small open subregion ##G'## of a region ##G##, the cohomology element ##\eta## vanishes when restricted down to ##G'##. See this thread for elaboration and/or further discussion.
 
  • #218
Demystifier said:
Now after reading the excellent review http://de.arxiv.org/abs/0912.2560 I understand it much better. The Lagrangian density mentioned above really takes the form
$$ra\partial^{\mu}\bar{\psi} \partial_{\mu}\psi$$
where ##r## is a free dimensionless parameter. Through the loop corrections, this term generates a change of fermion mass of the order
$$\delta m\sim \frac{r}{a}$$
The problem is that this correction is big when ##a## is small, if ##r## takes a "natural" value ##r\sim 1##. To get right phenomenology one must take a much smaller value for ##r##, of the order of
$$r\sim ma$$
or less. But where does such a small number come from? This shows that the problem of chiral fermions on the lattice (with the Wilson term) is really a problem of naturalness, also known as a hierarchy problem. The Standard Model of elementary particles has naturalness/hierarchy problems even in the continuum limit (e.g. the scalar Higgs mass), and we see that lattice regularization with the Wilson term creates one additional problem of this sort.

But is naturalness really a problem? The principle of naturalness is at bottom a philosophical one, based on a vague notion of theoretical "beauty". Some physicists and philosophers argue that it is not really a problem at all:
https://www.amazon.com/dp/0465094252/?tag=pfamazon01-20
https://link.springer.com/article/10.1007/s10701-019-00249-z

So if one accepts the philosophy that parameters in the Lagrangian which are not of the order of unity are not a problem, then there is really no problem of chiral fermions on the lattice with the Wilson term. @atyy I would appreciate your comments.
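The size of the tuning described in the quoted post can be made concrete with a back-of-the-envelope sketch in natural units (the electron mass and the 1 TeV cutoff below are illustrative choices, not values taken from the post):

```python
# Natural units: hbar = c = 1, energies in GeV, lengths in GeV^-1
m = 0.000511          # electron mass in GeV (illustrative choice)
cutoff = 1000.0       # lattice cutoff 1/a = 1 TeV (illustrative choice)
a = 1.0 / cutoff      # lattice spacing in GeV^-1

# "Natural" choice r ~ 1 gives a huge mass correction delta_m ~ r/a
r_natural = 1.0
delta_m_natural = r_natural / a
print(delta_m_natural)   # ~1000 GeV, vastly larger than m

# Phenomenology instead forces r ~ m*a, an unexplained small number
r_required = m * a
print(r_required)        # ~5e-7
```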

In these comments by Juven Wang and Xiao-Gang Wen, it seems that problems arise in specific constructions that try to match experiment (elsewhere I've seen Wen say that it is not chiral fermions per se, but the coupling with the non-abelian gauge field):

https://arxiv.org/abs/1809.11171
"There were many previous unsuccessful attempts, such as lattice gauge approach, Ginsparg-Wilson fermion approach, Domain-wall fermion approach, and Overlap-fermion approach. In the Ginsparg-Wilson fermion approach the to-be-gauged symmetry is not an on-site symmetry, and cannot be gauged. In the Domain-wall fermion approach, we have an extra dimension, where the dynamical gauge fields can propagate, which is inconsistent with experiments. The overlap-fermion approach is a reformulation of domain-wall fermion approach and also encounter problems."

https://arxiv.org/abs/1807.05998
"There were many previous attempts for the gauge chiral fermion problem. Lattice gauge theory approach [8] fails since it cannot produce low energy gauged chiral fermions [9]. The Ginsparg-Wilson (GW) fermion approach [10] has problems since the chiral symmetry [11] is realized as a non-on-site symmetry [12–17] and thus is hard to gauge. Domain-wall fermion approach [18, 19] also has problems, since after coupling to gauge fields, the massless gauge bosons will propagate in one-higher dimension. The overlap-fermion approach [20–25] is a reformulation of domain-wall fermion approach and face also some problems in a chiral gauge theory.

In the lattice gauge theory approach, the fermion interactions (except the gauge interaction) are ignored. In the mirror fermion approach proposed in 1986 [26–30], one started with a lattice model containing chiral fermions and a chiral conjugated mirror sector. Then, one includes proper direct interaction or boson mediated Swift-Smit interactions [31, 32] trying to gap out the mirror sector completely, without breaking the gauge symmetry and without affecting the normal sector. One proposed condition to gap out the mirror sector is that there are symmetric mass terms among mirror fermions and composite mirror fermions to give all the (composite) mirror fermions a mass [26]. However, such a condition can be satisfied by U(1) anomalous 1+1D chiral mirror fermions which can never be fully gapped (see the arXiv version of Ref. 15). This means the [26]'s criteria is not sufficient enough to produce fully gapped mirror fermions. The follow-up work [33–36] failed to demonstrate that interactions can gap out the mirror sector without breaking the symmetry in some mirror fermion models. It was argued that "attempts to decouple lattice fermion doubles by the method of Swift and Smit cannot succeed [37]" and many people gave up the mirror fermion approach."
 
  • #219
atyy said:
In the Ginsparg-Wilson fermion approach the to-be-gauged symmetry is not an on-site symmetry, and cannot be gauged.
Note that the Ginsparg-Wilson approach is not the same as the Wilson approach. The Wilson approach (which I was referring to) does not have any gauging problems. The Ginsparg-Wilson approach is non-local, but the Wilson approach is local.
 
