Wilsonian viewpoint and wave function reality

  • #1
atyy
In the Wilsonian viewpoint, quantum electrodynamics is an effective theory, where the low energy predictions are obtained by coarse graining, e.g., some form of lattice QED where the lattice is taken to be very fine.

In the Copenhagen interpretation, we are agnostic as to whether the wave function is real, or for that matter whether observables are real. Only experimental results are real classical events, and theory only gives the probabilities of experimental results.

Does the Wilsonian viewpoint require taking the lattice to be real or classical, in the same way that Minkowski spacetime is considered real or classical in special relativistic quantum field theory?
 
  • #2
I don't think so, since the lattice regularization is only one possible regularization of QFT. You can as well use a momentum cutoff or a smooth cutoff function. So why should an arbitrary space-time lattice have any more "reality" than any other cutoff? In particular, QED is a Dyson-renormalizable theory, which means that for observables at low energies the high-energy scales don't play much of a role. That's why you can take the cutoff to infinity at the price of having to introduce a renormalization scale, upon which the renormalized coupling depends.
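As a minimal illustration of that scale dependence (only the one-loop, single-lepton, leading-log form, nothing specific to any particular regularization):
$$\alpha(Q^2)=\frac{\alpha(\mu^2)}{1-\dfrac{\alpha(\mu^2)}{3\pi}\,\ln\dfrac{Q^2}{\mu^2}},$$
i.e. the renormalized coupling only makes sense together with the scale ##\mu## at which it is specified, while the cutoff itself has dropped out of the low-energy prediction.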
 
  • #3
Removing the cutoff (in the lattice case taking the continuum limit) is essential to get the correct Poincare symmetry. The approximate theories are only construction tools, not the real thing.
 
  • #4
vanhees71 said:
I don't think so, since the lattice regularization is only one possible regularization of QFT. You can as well use a momentum cutoff or a smooth cutoff function. So why should an arbitrary space-time lattice have any more "reality" than any other cutoff? In particular, QED is a Dyson-renormalizable theory, which means that for observables at low energies the high-energy scales don't play much of a role. That's why you can take the cutoff to infinity at the price of having to introduce a renormalization scale, upon which the renormalized coupling depends.

I am not asking specifically about the lattice cutoff. I am asking about the idea of coarse graining. What is being coarse grained? Is the thing that is being coarse grained real?
 
  • #5
atyy said:
What is being coarse grained?
The stuff QFT is about: expectations of fields at a point and correlation functions, expectations of suitably (time- or normally) ordered products of fields at several points.

atyy said:
Is the thing that is being coarse grained real?
The sufficiently coarse-grained version is observable; nobody talks about what is real.

But there is no reason to suppose that this implies a lack of reality. Talk about reality is needed only when someone questions it.
 
Last edited:
  • Like
Likes vanhees71 and atyy
  • #6
Real is what is observable! So the answer is that the coarse-grained quantities that are really measured are real, what else?
 
  • #7
vanhees71 said:
Real is what is observable! So the answer is that the coarse-grained quantities that are really measured are real, what else?

But what is being coarse grained?

For example, in http://arxiv.org/abs/1502.05385 the renormalization gives a flow in the space of wave functions and Hamiltonians. Are the flows of the wave functions and Hamiltonians just tricks, and is it always more fundamental to conceive the flow in the space of correlation functions, as A. Neumaier says?
 
Last edited:
  • #8
What's being coarse-grained are the microscopic degrees of freedom, which we might not even know with our contemporary experimental abilities. That's the great point of the renormalization group in Wilson's interpretation of it: it doesn't matter what the "underlying" microscopic degrees of freedom really are; we can use QFT as an effective description of what's observable, or better said, resolvable with our detectors.

If you look at, say, a proton at very low energies, it is pretty well described as a heavy point particle or as a Coulomb field as far as the electrons surrounding it and building an atom are concerned. If you start scattering electrons on it, you'll find that it is in fact an extended object, and you can characterize it with a form factor, relating to the charge distribution of an extended object. If you increase the energy even further, you start to resolve the constituent valence quarks and then the sea quarks and gluons, etc. Who knows whether this is the final answer to what a proton really is, but at the energies (or resolutions) accessible so far we can describe the observations in terms of what we call a proton.

I'd not dare to say that we understand protons (or any other hadron) fully, but only within some "blurred" picture limited by the resolution of our probes and detectors, and that the observables are real. Anything else we might think about "what's coarse-grained" is speculation as long as it isn't observable, and thus it's good to stick to the rule that "real is what's observable". That may change when we get refined measurement devices; that's why the picture of what a proton is has changed a great deal from, say, Rutherford to today, and I guess it will also change a great deal in the future :-).
 
  • #9
vanhees71 said:
the coarse-grained quantities that are really measured are real, what else?
Are quarks real? Only cross sections of leptons and hadrons are measured...
 
  • #10
Quarks are real. The only question is what you call quarks. Quarks are observable, e.g., in deep inelastic scattering, demonstrating Bjorken scaling and its violation, which is well explained by QCD (the DGLAP equation etc.). For sure, what's not directly observable are the quanta of the quark fields occurring in the QCD Lagrangian, because these "current quarks" do not occur as "asymptotic free states" in our detectors. This holds also for their properties like the current-quark masses, which are not directly observable but inferred from theory as parameters entering the standard-model Lagrangian (in terms of Yukawa couplings of the quarks to the Higgs field).
 
  • #11
atyy said:
But what is being coarse grained?

For example, in http://arxiv.org/abs/1502.05385 the renormalization gives a flow in the space of wave functions and Hamiltonians. Are the flows of the wave functions and Hamiltonians just tricks, and is it always more fundamental to conceive the flow in the space of correlation functions, as A. Neumaier says?
Coarse-graining always means restricting the algebra of observables to a smaller one consisting of fewer and/or more averaged quantities. This is visible in the most general form of the projection operator formalism, which is the basis of all reduced descriptions derived with sufficient care. Instead of a big algebra of observables evolving in the Heisenberg picture on the fixed state of the universe, one only considers an algebra of relevant observables, and restricts the state of the universe to this effective algebra.

In the tensor network paper you linked to, the state of the ''universe of interest'' is the ground state of a spin system, and the effective algebra is the algebra of linear operators preserving a subspace of the Hilbert space, the reduced Hilbert space selected by the tensor network structure. The state of the universe restricted to this algebra is a state of this reduced Hilbert space - in general a mixed state of the reduced system, but approximated by a pure state (to be determined by a variational process).
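As a toy illustration of restricting a state to a smaller algebra (a generic two-qubit example, not the tensor-network construction of the linked paper; all numbers are purely illustrative), the state restricted to the observables of one subsystem is its reduced density matrix, obtained by a partial trace:

```python
import numpy as np

# A pure entangled state of two qubits, |psi> = (|00> + |11>)/sqrt(2)
psi = np.zeros(4)
psi[0] = psi[3] = 1 / np.sqrt(2)
rho = np.outer(psi, psi.conj())            # state of the "universe" (pure: rho^2 = rho)

# Keep only the observables of qubit A: partial trace over qubit B
rho_A = np.trace(rho.reshape(2, 2, 2, 2), axis1=1, axis2=3)

print(np.round(rho_A, 3))                  # 0.5 * identity
print(np.allclose(rho_A @ rho_A, rho_A))   # False: the restricted state is mixed
```

The restriction of an entangled pure state to one half comes out mixed, which is why the reduced description is in general a mixed state that one may then approximate variationally by a pure one.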

In Wilson's view, one has a parameterized family of algebras, labelled by a scale parameter ##\Lambda## inducing a labelling of all observables ##A(\Lambda)##. Each of these can be obtained by some explicit expression in the observables ##A(\Lambda')## for some or all ##\Lambda'>\Lambda##, which defines the embedding of the algebras in each other. Wilson renormalization is the fact that the temporal dynamics of the ##A(\Lambda)## can be approximately described by an effective Hamiltonian ##H(\Lambda)## for which one can derive a renormalization group equation.
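A concrete toy version of such a parameterized family of effective Hamiltonians is real-space decimation of the 1D Ising chain, where summing out every second spin gives an exact recursion for the nearest-neighbour coupling. A minimal sketch (the 1D chain is chosen only because the recursion is exact, not because it resembles the QFT case):

```python
import numpy as np

def decimate(K):
    """One decimation step for the zero-field 1D Ising chain:
    summing over every second spin yields exactly K' = (1/2) ln cosh(2K),
    where K = J/(k_B T) is the dimensionless nearest-neighbour coupling."""
    return 0.5 * np.log(np.cosh(2.0 * K))

K = 1.0                       # some microscopic coupling at the smallest scale
for step in range(8):
    print(step, round(K, 6))
    K = decimate(K)           # coupling of the effective Hamiltonian one scale up
```

Iterating the step drives ##K## to the trivial fixed point ##K^*=0##; the sequence of couplings is exactly the kind of scale-labelled family ##H(\Lambda)## described above, just in its simplest possible setting.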
 
  • Like
Likes atyy
  • #12
vanhees71 said:
Quarks are observable, e.g., in the deep inelastic scattering
On the surface, only deep inelastic scattering is observable, and it is explained by the unobservable quarks.
 
  • Like
Likes vanhees71
  • #13
True, but then you can also say the same for particles that exist as asymptotic free states like electrons. We never see "electrons" in this sense but only their manifestations in interacting with macroscopic matter, like in a discharge tube by the glow of the rest gas ionized by them or as dots on an old-fashioned TV screen or as pixel signals stored on a hard disk at CERN etc.
 
  • #14
vanhees71 said:
True, but then you can also say the same for particles that exist as asymptotic free states like electrons. We never see "electrons" in this sense but only their manifestations in interacting with macroscopic matter, like in a discharge tube by the glow of the rest gas ionized by them or as dots on an old-fashioned TV screen or as pixel signals stored on a hard disk at CERN etc.
Indeed. That's why the question of what exists is a nontrivial one. Some argue for the existence of virtual photons by saying that we see their manifestations. So one needs more definite criteria for what is real.

My proposal for defining ''being real'' is ''having a Heisenberg state'' - see my new Insight Article. Having a state means having expectations and correlation functions, just as required above.
 
  • #15
The Insight Article is great. Now we have it to link to whenever somebody asks about "virtual particles" :-)).

One should also note - and many professionals are not aware of this - that the only clear definition of a resonance's mass and width is as the pole of the S-matrix element of a specific process. Even an apparently "simple" resonance like the ##\rho## meson is only well defined when this is kept in mind, and that's why the Review of Particle Physics states very clearly what is used to determine its mass and width in the tables, namely ##\mathrm{e}^+ + \mathrm{e}^- \rightarrow \pi^+ + \pi^-## and (consistently) ##\tau \rightarrow 2\pi+\nu_{\tau}##. Already the Dalitz decays of the pseudoscalar mesons ##\eta## and ##\eta'## lead to differences if the proper dynamics is not taken into account (not to talk about the Dalitz decays of baryon resonances, where the ##\rho## meson appears in the vector-meson resonance model for the corresponding transition form factors, which turns out to be not too bad a model). But that's a bit off-topic here.
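For reference, the pole definition mentioned above is conventionally written as the position of the S-matrix (or propagator) pole in the complex ##s## plane,
$$\sqrt{s_{\text{pole}}}=M_{\text{pole}}-\frac{i}{2}\Gamma_{\text{pole}},$$
so the mass is the real part of the pole position and the width is minus twice its imaginary part, rather than parameters of any particular Breit-Wigner fit.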

I've not found an explanation/definition of what you mean by "Heisenberg state" in your Insight article. It may also be good to reformulate the definition of "state" to: "A quantum state is represented by a trace-class positive semidefinite self-adjoint operator with trace 1." That holds for both pure and mixed states. The pure states are exactly those represented by statistical operators that are projectors, i.e., obeying ##\hat{\rho}^2=\hat{\rho}##. Another equivalent definition is that a pure state is represented by a ray in Hilbert space, but this somewhat cumbersome definition is avoided by using statistical operators for both pure and mixed states.
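A quick numerical sanity check of that characterization (a minimal sketch with purely illustrative matrices):

```python
import numpy as np

# Pure state: a rank-1 projector |psi><psi| with trace 1
psi = np.array([1.0, 1.0]) / np.sqrt(2)
rho_pure = np.outer(psi, psi)

# Mixed state: an equal-weight convex combination of two orthogonal projectors
rho_mixed = 0.5 * np.diag([1.0, 0.0]) + 0.5 * np.diag([0.0, 1.0])

for name, rho in [("pure", rho_pure), ("mixed", rho_mixed)]:
    print(name,
          "trace =", round(np.trace(rho), 3),
          "projector:", np.allclose(rho @ rho, rho))
# pure : trace 1 and rho^2 = rho   -> projector onto a ray
# mixed: trace 1 but rho^2 != rho  -> positive operator, not a projector
```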
 
  • #16
vanhees71 said:
The Insight Article is great. Now we have it to link to whenever somebody asks about "virtual particles" :-)).
This and a version of the remainder should be posted in the other thread, not here.

vanhees71 said:
I've not found an explanation/definition of what you mean by "Heisenberg state" in your Insight article.
There it doesn't matter. In the present context, it is just a state in the Heisenberg picture, so that time correlations make sense.
vanhees71 said:
It may also be good to reformulate the definition of "state" to: "A quantum state is represented by a trace-class positive semidefinite self-adjoint operator with trace 1." That holds for both pure and mixed states. The pure states are exactly those represented by statistical operators that are projectors, i.e., obeying ##\hat{\rho}^2=\hat{\rho}##.
This would make it worse - not in the sense of incorrect but in the sense of adding technicalities that only detract from the real message.

Semidefinite already implies Hermitian, trace 1 already implies trace-class, and Hermitian trace-class operators are automatically self-adjoint. Thus there is no need to introduce these technical terms. Pure states are only those projectors ##\rho## that project onto a ray. Of course, the trace of a projector is the dimension of its target space, so your definition is OK. But these technicalities are also not needed for the Insight article, since I never do anything with the states.
 
  • #17
A. Neumaier said:
Removing the cutoff (in the lattice case taking the continuum limit) is essential to get the correct Poincare symmetry. The approximate theories are only construction tools, not the real thing.
How do you know that Poincare symmetry is the real thing? Because we observe it in the large-distance limit? So what; all those lattice theories have no problem recovering Poincare symmetry in the large-distance limit.

I would say that if Poincare symmetry were the real thing, one would be able to construct models without infinities which have full Poincare symmetry, and one would not need such non-Poincare-symmetric "construction tools".
 
  • Like
Likes king vitamin
  • #18
Ilja said:
if Poincare symmetry were the real thing, one would be able to construct models without infinities which have full Poincare symmetry, and one would not need such non-Poincare-symmetric "construction tools".
Your argument is very misinformed.

It is typical in mathematics that nice objects are first constructed in a messy way.
The real numbers have very nice properties, but to construct them one needs artifacts that make the numbers appear to be complicated sets (Dedekind cuts, equivalence classes of Cauchy sequences, etc.).
The exponential function has many nice properties, but to construct it one needs limits of simpler functions that do not have this property.
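A concrete instance of the second point, just as a reminder:
$$e^x=\lim_{n\to\infty}\left(1+\frac{x}{n}\right)^{n},$$
where each approximant ##\left(1+\frac{x}{n}\right)^{n}## fails the characteristic property ##f(x+y)=f(x)f(y)## that the limit satisfies exactly.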

Another reason why Poincare symmetry is the real thing is that it gives rise (via Noether's theorem) to the conservation laws on which all basic physics relies. Drop symmetries - and you have nothing left to guide your theory building.
 
  • Like
Likes vanhees71
  • #19
A. Neumaier - is your answer different from vanhees71's?

Your answer is that the correlation functions are coarse grained, whereas vanhees71 says the degrees of freedom are coarse grained.

Your answer is what I would expect within Copenhagen (expectations are real in Copenhagen), whereas vanhees71's isn't (quantum degrees of freedom are not real in Copenhagen, so presumably photons and electrons are not necessarily real in QED).
 
  • #20
A. Neumaier said:
It is typical in mathematics that nice objects are first constructed in a messy way.
But I was talking about physics. Your example is also not very impressive. Essentially, it is an example of a simplification reached by going to some infinite limit - so a mathematical simplification can be reached by going to limits. But what about reality?
A. Neumaier said:
Another reason why Poincare symmetry is the real thing is that it gives rise (via Noether's theorem) to the conservation laws on which all basic physics relies. Drop symmetries - and you have nothing left to guide your theory building.
Hm. Maybe you are an opponent of GR then, given that there you cannot get conservation laws via Noether's theorem, but have only some pseudotensors, which do not even allow a physical interpretation in agreement with the spacetime interpretation, which allows only tensors to have a physical meaning?
 
  • #21
Let's discuss the Poincare thing somewhere else - I don't think it is very relevant - and we are already discussing lattice renormalization.
 
  • #22
atyy said:
A. Neumaier - is your answer different from vanhees71's?

Your answer is that the correlation functions are coarse grained, whereas vanhees71 says the degrees of freedom are coarse grained.
Both statements are related, convenient intuitions but (like all intuitions) a bit imprecise.

In my second answer I was more precise and said that the system description itself (i.e., which observables are kept in the description) is coarse-grained. This covers both points of view and is the most general form of coarse-graining. It also covers coarse-graining by classical approximation of the quantum system (if one keeps only a commuting family of observables such as in the derivation of hydrodynamics from QFT).

All this is discussed very well in the book by Grabert on operator projection techniques.

The number of degrees of freedom - usually counted as the dimension of the phase space (the number of parameters needed for the relevant coherent states) - is an imperfect measure of coarse-graining. Truncating an oscillator to its lowest two levels is a form of coarse-graining (relevant for quantum optics), but there is no decrease in the number of degrees of freedom, which is 2 in both cases (the phase space is ##R^2## or ##C## for an oscillator, and the Bloch sphere for a 2-level system).
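A minimal numerical sketch of that two-level truncation (a finite number-basis approximation purely for illustration; the cutoff ##N## and variable names are arbitrary):

```python
import numpy as np

N = 20                                    # oscillator levels kept in the numerical basis
n = np.arange(N)
a = np.diag(np.sqrt(n[1:]), k=1)          # annihilation operator, a|n> = sqrt(n)|n-1>
H = a.conj().T @ a + 0.5 * np.eye(N)      # H = a^dag a + 1/2   (units: hbar*omega = 1)
x = (a + a.conj().T) / np.sqrt(2)         # dimensionless position operator

P = np.zeros((2, N))
P[0, 0] = P[1, 1] = 1.0                   # isometry onto the lowest two levels

H_eff = P @ H @ P.T                       # -> diag(1/2, 3/2)
x_eff = P @ x @ P.T                       # -> sigma_x / sqrt(2)

print(H_eff)
print(x_eff)
```

The truncated operators act on a two-dimensional space, so the restricted algebra is just that of 2x2 matrices; as noted above, this is a genuine coarse-graining even though the count of degrees of freedom does not go down.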

Coarse-graining in a field theory by smoothing also preserves the infinite number of degrees of freedom. Only in case of coarse-graining with a discontinuous cutoff can one truly say that the number of degrees of freedom has been reduced (since there is a meaningful basis of which the part with the highest momenta is discarded), though this number remains infinite.
 
Last edited:
  • Like
Likes atyy
  • #23
atyy said:
Let's discuss the Poincare thing somewhere else - I don't think it is very relevant - and we are already discussing lattice renormalization.
It is now here for further discussion.
 
  • #24
atyy said:
In the Wilsonian viewpoint, quantum electrodynamics is an effective theory, where the low energy predictions are obtained by coarse graining, e.g., some form of lattice QED where the lattice is taken to be very fine.

In the Copenhagen interpretation, we are agnostic as to whether the wave function is real, or for that matter whether observables are real. Only experimental results are real classical events, and theory only gives the probabilities of experimental results.

Does the Wilsonian viewpoint require taking the lattice to be real or classical, in the same way that Minkowski spacetime is considered real or classical in special relativistic quantum field theory?
Is it accurate to say the Wilsonian Lattice Ansatz/hypothesis (minimally) implies...
1) A unit or "step": the vertex and edge building blocks.
2) A recursion of that "unit" or step: The process by which a periodic lattice of some pattern of those units emerges.
?

Am I making sense when I ask if there is a relationship between a multi-fractal and an RG family of algebras?

Looking at Garrett Lisi's E8 graph - even if that turns out not to be right - won't whatever theory is more right also have to have strong, periodic, complex self-similarity under scaling?

Can anything except some kind of unit recursion (fractals etc) do that?
Doesn't a multi-fractal process also possibly explain the probabilistic but also deterministic nature of the fine-scale horizon?

Going one step further - doesn't recursion imply there has to be a unit (or units) with specific scale to recurse?
If so that seems like an assertion about reality that includes the horizon and whatever contains it o_O.
 
  • #25
https://www.fractalus.com/sylvie/lsmultifrace.htm

In case you thought I made the term "multi-fractal" up, this is from the website linked above. I believe the artist's name is Lee Skinner. I hope it is okay to post it for this purpose. I think it's pretty amazing. And I think it shows how multi-fractals can create worlds with apparent algebraic relationships from relatively much simpler rules - under recursion.

Doesn't this support a reasonable hypothesis that there is a fundamental reality that is lattice-like (a recursion of small, simple processes)? What else can do this with greater deference to Occam's Razor? And doesn't Occam's Razor require selection of this as more primary than the post-hoc application of complicated families of algebras relating emergent structures?

Some might say - well, it's just a simulation. But I don't see any justification for bias against a fundamental theory that is inherently non-linear - and so can only be observed as deterministic chaotic sequences. Frustrating, yes, because it defies generalization to some degree, but who said physical truth had to be comfortingly complicated?

Some might say, "what good is it?" I need to calculate some photons - and chaos is not at all helpful. Fair enough. Good thing we invented practical algebras. But I just don't see anything that has as much potential to describe simply and fully how the world got made - and in doing so illuminate a little bit, the properties involved. Plus, if we get good and analyzing these damn things who knows how that might become useful?

Is this just common knowledge and I'm foolish for getting excited thinking about it, or does it just not make any sense at all?
I was kind of hoping Manfred Schroeder's book on Fractals, Chaos, and Power Laws counted as substantial reading? I have loved that book. I'm stunned by it.

[Attached image: Many_Worlds.jpg]
 
  • #26
atyy said:
In the Wilsonian viewpoint, quantum electrodynamics is an effective theory, where the low energy predictions are obtained by coarse graining, e.g., some form of lattice QED where the lattice is taken to be very fine.

In the Copenhagen interpretation, we are agnostic as to whether the wave function is real, or for that matter whether observables are real. Only experimental results are real classical events, and theory only gives the probabilities of experimental results.

Does the Wilsonian viewpoint require taking the lattice to be real or classical, in the same way that Minkowski spacetime is considered real or classical in special relativistic quantum field theory?
The Wilsonian method is usually used in quantum physics, but the basic idea of the method does not depend on quantum physics. Therefore, to decouple the reality issues of the Wilsonian method from the (more difficult) reality issues of quantum physics, it is useful to think of the Wilsonian method in classical physics.

From the classical perspective the answer to your question should be conceptually clear, even if it is not easy to verbalize it. To make it even more vivid, let me use an example from biology (rather than pure boring physics). In population biology the smallest unit is one animal (say a wolf), in biological anatomy the smallest unit is one organ (say heart), in microbiology the smallest unit is one cell, in molecular biology the smallest unit is one organic molecule. These different viewpoints of biology are nothing but different "Wilsonian coarse grains". Are the corresponding "smallest biological units" real? Whatever your answer is, the same answer can also be applied to pure (and boring) physics.
 
  • Like
Likes Jimster41, ShayanJ and vanhees71
  • #27
Demystifier said:
The Wilsonian method is usually used in quantum physics, but the basic idea of the method does not depend on quantum physics. Therefore, to decouple the reality issues of the Wilsonian method from the (more difficult) reality issues of quantum physics, it is useful to think of the Wilsonian method in classical physics.

From the classical perspective the answer to your question should be conceptually clear, even if it is not easy to verbalize it. To make it even more vivid, let me use an example from biology (rather than pure boring physics). In population biology the smallest unit is one animal (say a wolf), in biological anatomy the smallest unit is one organ (say heart), in microbiology the smallest unit is one cell, in molecular biology the smallest unit is one organic molecule. These different viewpoints of biology are nothing but different "Wilsonian coarse grains". Are the corresponding "smallest biological units" real? Whatever your answer is, the same answer can also be applied to pure (and boring) physics.

In classical physics the smallest units are real. But in quantum mechanics, it is not so clear, is it? For example, are the degrees of freedom in QM real? Are observables in QM real? Or are only expectation values real?
 
  • #28
atyy said:
In classical physics the smallest units are real. But in quantum mechanics, it is not so clear, is it? For example, are the degrees of freedom in QM real? Are observables in QM real? Or are only expectation values real?
What is your definition of being real? Without such a definition your question cannot be answered, or the answer is arbitrary, depending on how one defines realness.
 
  • #30
I'd very pragmatically say that real is what's observable, and of course observables are observable. An observable is defined by an equivalence class of measurement procedures, and a state by an equivalence class of preparation procedures. A state of a given system can be determined by a sufficient set of measurements on an ensemble of equally prepared such systems (see Ballentine's book for details).

Now you can use various mathematical models to describe the relations between observables, which we call natural laws. These can be classical or quantum. The mathematical objects (phase-space coordinates and functions thereof in the classical case; Hilbert-space vectors and operators acting on them in the quantum case) are NOT the observables or states but a description thereof.

Defining observables in the operational sense given above needs theory, but the theoretical abstract objects are not the observables, and measuring observables means verifying the consistency of the definition in different situations.

With this pragmatic picture, a lot of the so-called interpretational problems of quantum theory become obsolete from a physicist's point of view.
 
  • #31
If we simplify different physical models by introducing the same common element, then this element is real. It's like with objects in our common experience: if we can perceive an object with different senses, it's real. I don't know if this definition is helpful in the current discussion.
 
  • #32
atyy said:
In classical physics the smallest units are real. But in quantum mechanics, it is not so clear, is it? For example, are the degrees of freedom in QM real? Are observables in QM real? Or are only expectation values real?
Sure but my point is that, concerning quantum reality, there is nothing special about Wilsonian objects. Once you choose your favored interpretation of QM, the reality status of Wilsonian objects in that interpretation becomes obvious.
 
  • #33
Demystifier said:
Sure but my point is that, concerning quantum reality, there is nothing special about Wilsonian objects. Once you choose your favored interpretation of QM, the reality status of Wilsonian objects in that interpretation becomes obvious.

But if one chooses to conceive of Wilsonian renormalization as coarse graining degrees of freedom, yet conceives of the degrees of freedom as not real, then why would we accept Wilson's explanation as a "physical" explanation for why renormalization works?

So it appears that if we use Copenhagen, we are not allowed to conceive of Wilsonian renormalization as coarse graining observables or degrees of freedom, which seems quite different from the Wilsonian spirit.
 
  • #34
atyy said:
But if one chooses to conceive of Wilsonian renormalization as coarse graining degrees of freedom, yet conceives of the degrees of freedom as not real, then why would we accept Wilson's explanation as a "physical" explanation for why renormalization works?

So it appears that if we use Copenhagen, we are not allowed to conceive of Wilsonian renormalization as coarse graining observables or degrees of freedom, which seems quite different from the Wilsonian spirit.
If you take Copenhagen seriously and conceive that degrees of freedom are not real, then Wilson coarse graining is also not real. It is just a calculation tool. A very useful tool. Which, for someone who takes Copenhagen seriously, should be enough.

Is the tool "physical"? Yes, if you are persistent in taking Copenhagen seriously. Bohr said that the task of physics is not to find out how nature is, but what we can say about nature. So if Wilson coarse graining helps you to say something about nature, then, according to Copenhagen, it's physical.

Is it consistent with the Wilsonian spirit? Probably not, for Wilson himself was probably not someone who took Copenhagen very seriously. But people can use the same tools even when they have different spirits. (For instance, using a computer not for computing but for discussions on a forum is not in the original spirit of the idea of the computer.)
 
Last edited:
  • #35
Dear atyy,

As you know, in 1926 Erwin Schroedinger formulated a partial differential equation that describes how the quantum state of a physical system changes with time. For it, in 1933, he received the Nobel Prize (together with Paul Dirac).

It contains the factor Ψ, referred to, somewhat improperly, as the "wave function". Its significance was not understood until Max Born interpreted it as defining the probability of finding a particle at a given position in space; he received the Nobel Prize for this interpretation in 1954. The probability can be represented by a Gaussian curve, with a maximum in the center and falling asymptotically to zero at the extremities. The mathematical formalism makes clear that at the instant the particle's location is determined, all the other probabilities disappear. Strangely, from that formulation to this day there have been numerous discussions about the significance of this disappearance, maintaining that there is something mysterious in it (the Copenhagen interpretation). Nevertheless, when we hold a die in hand before throwing it, the probability of each face landing up is one in six. The moment it falls on the table and comes to rest, it is clear to me that one can no longer speak of probabilities, as one of the faces has been determined. Obviously, there is nothing mysterious in it, as even Einstein and Niels Bohr concurred. A supposed "observer's influence" is therefore nonsense.

This is what happens when one imagines that physics must necessarily be described by mathematical formulas even when they are not needed, as is the case here. Out of this love of mystery, it is still common today to hear that the wave function means the particle is in all places at the same time, and that quantum theory would make possible a computer capable of performing infinitely many mathematical operations simultaneously, something that would be useful, for instance, for breaking encrypted texts.

Another common mistake with the same origin is the "many worlds interpretation", which erroneously asserts the objective reality of the universal wave function.

Fernando Arthur Tollendal Pacheco - fernandoarthur@gmail.com
Brasilia (DF) - Brazil
 
