Penrose's argument that quantum gravity can't remove the Big Bang singularity

In summary, the conversation discusses Penrose's observations on the Weyl curvature hypothesis and the evolution of the universe, the possibility that quantum gravity removes singularities, and the issue of the big bang singularity in loop quantum cosmology. The possibility of a creator is also brought up, but ultimately the conversation ends with the idea that something drastic must happen to space-time at the big bang, and that loop quantum cosmology may not be a viable approach.
  • #36
tom.stoer said:
... and I think there is another issue: the LQC models always have finitely many gravity and matter d.o.f. so they are always in a pure state and have entropy zero
That is irrelevant. Even if the number of dof's is infinite, the whole universe should be in a pure state, and therefore its von Neumann entropy is zero and does not change with time. But von Neumann entropy is not the only meaningful entropy in QM. The entropy one expects to increase with time necessarily involves some kind of COARSE GRAINING, corresponding to the inability to see all the details of a complex system. This means that some microscopically distinct states are viewed as macroscopically identical, so entropy will increase in the sense that the system will evolve towards macro-states that can be realized by a larger number of different micro-states.
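Demystifier's distinction is easy to see numerically. Tracing out part of a system is the crudest form of coarse graining; a minimal sketch (generic qubits, not any particular LQC model):

```python
import numpy as np

def von_neumann_entropy(rho):
    """S = -tr(rho ln rho), computed from the eigenvalues."""
    evals = np.linalg.eigvalsh(rho)
    evals = evals[evals > 1e-12]          # drop numerical zeros
    return float(-np.sum(evals * np.log(evals)))

# A random pure state of two qubits: the GLOBAL state.
rng = np.random.default_rng(0)
psi = rng.normal(size=4) + 1j * rng.normal(size=4)
psi /= np.linalg.norm(psi)
rho_global = np.outer(psi, psi.conj())

# Global von Neumann entropy is zero (pure state), and unitary
# evolution cannot change that.
print(von_neumann_entropy(rho_global))    # ~0

# The crudest coarse graining: ignore one of the two qubits
# (partial trace). The remaining description is generically mixed,
# with strictly positive entropy.
rho_red = np.trace(rho_global.reshape(2, 2, 2, 2), axis1=1, axis2=3)
print(von_neumann_entropy(rho_red))       # > 0 generically
```

The global entropy stays exactly zero under any unitary evolution; only the coarse-grained (reduced) description gains entropy.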
 
  • #37
Demystifier said:
For an explicit example of a numerical simulation (not really a bouncing universe, but a system with a minimal entropy at one particular time) see e.g. Fig. 4 in
http://arxiv.org/abs/1011.4173v5 [Found. Phys. 42, 1165-1185 (2012)]

Nice! But their discussion assumes an interaction, whereas we know that different regions of the universe are causally disconnected.

It seems like all we know fundamentally is that:

(1) If there is low entropy in some region, the most likely thing is that it's preceded and followed by higher entropy.
(2) Regions of spacetime that are causally connected should have arrows of time that agree.
(3) Our universe is nearly flat, so it's probably either spatially infinite or at least much larger than the observable universe.
(4) Our own past light cone appears to have had an arrow of time going back to at least the era of big bang nucleosynthesis. This requires extreme fine-tuning.
(5) Our own thermodynamic arrow of time currently points away from the big bang.

Based on these observations, it seems to me that the most general picture we can construct is a universe in which nearly all of spacetime is in thermal equilibrium, but there is one or more causally separated islands that are not at equilibrium. In each of these islands, there is some time of minimum entropy, which may or may not coincide with the big bang (or bounce). In our own island, this time seems to have been either at the big bang or before nucleosynthesis.

It probably doesn't make sense to explain our island, or any others that exist, as thermal fluctuations, because then it would be overwhelmingly more probable for us to be Boltzmann brains rather than real observers inhabiting a large, non-equilibrium region of spacetime. This means that extreme fine-tuning is required. We have no physical law or principle that explains this fine-tuning, so we can't say whether (1) there should be more than one island, (2) an island's minimum entropy should coincide with the big bang, or (3) our own island actually encompasses the whole universe. In principle we can test all three of these empirically, but 1 and 3 can only be tested by waiting for cosmological lengths of time and continuing to make observations. 2 can be tested in our own island by seeing whether cosmological models correctly explain the very early universe with the normal second law of thermodynamics.

I don't think a bounce really changes this picture very much. Penrose's argument seems to be based on the assumption that the second law is fundamental and universal, but that doesn't seem to me like a natural point of view.
 
  • #38
marcus said:
Black hole collapse has also been studied in the LQG context--that is very different. In a BH collapse, there is some MATTER that collapses, but the surrounding space does not. In a LQC cosmological collapse the whole of space collapses and rebounds. I'm sure you are well aware of the difference.

The difference has nothing to do with the presence of some surrounding space that doesn't collapse. The difference is in the symmetries and reduction to a finite number of degrees of freedom in LQC.
 
  • #39
Demystifier said:
...models for a bouncing Universe I have ever seen involve a rather SMALL number of the degrees of freedom. ...

Then please see for example the paper I just referred to in my last post (#34)---Agullo Ashtekar Nelson November 2012.
Infinite dimensional Hilbert space of states. Basically defines the new face of LQC.

Anyone at all interested in bounce cosmology (the bulk of that being by the LQC community) should probably memorize the arxiv number 1211.1354 and spend some time reading the paper. It's 50 pages. They have a followup/companion paper in prep.
"[2] I. Agullo, A. Ashtekar and W. Nelson, The pre-inflationary dynamics of loop quantum cosmology: Confronting quantum gravity with observations, (in preparation)"
 
  • #40
Finbar said:
I think we are missing the essential point here. The idea of LQC with a bounce is that the universe at some point collapsed under its own gravity, bounced, and formed a big bang. Is this right? On the other hand, when matter collapses from some generic initial conditions we expect it will form a black hole and ultimately matter will be compressed to Planckian densities. Even if at this point QG kicks in and the singularities are removed, it won't lead to a state anything close to the unique state needed to form a big bang.
...

But it DOES lead, in both extensive equation modeling and numerical simulation, to an adequate start of the big bang and sufficient inflation, with high probability and no fine-tuning. See Ashtekar's paper on the probability of inflation in the LQC context.

marcus said:
Well F. it sounds like your word against the equations and your word against the computer.

The LQC bounce has been both reduced to equations and simulated numerically many times with lots of variations---with anisotropy, with perturbations, with and without inflation. The upshot is that the "big crunch" collapse of a spatially finite classical universe typically DOES lead to big bang conditions and an expanding classical universe. The result is remarkably robust---the people who do the modeling do not find there is a need for fine-tuning.

This is not to say that Nature IS this way. What it says is that in this theoretical context with this version of quantum cosmology a big crunch tends to rebound in a big bang fairly robustly.
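For reference, the effective dynamics behind those simulations is usually summarized by the LQC-corrected Friedmann equation H^2 = (8πG/3) ρ (1 − ρ/ρ_c). For a flat universe sourced by a massless scalar field it has a well-known closed-form bouncing solution, which can be checked numerically (a sketch in illustrative units, G = ρ_c = 1):

```python
import numpy as np

G, rho_c = 1.0, 1.0        # illustrative units; rho_c is the critical density

# Exact solution of the effective LQC Friedmann equation
#   H^2 = (8 pi G / 3) * rho * (1 - rho / rho_c)
# for a flat FRW universe sourced by a massless scalar (rho ~ a^-6):
#   (a / a_bounce)^6 = 1 + 24 pi G rho_c t^2
t = np.linspace(-5.0, 5.0, 2001)
x = 1.0 + 24.0 * np.pi * G * rho_c * t**2
a = x**(1.0 / 6.0)                    # scale factor, a_bounce = 1
rho = rho_c / x                       # energy density
H = 8.0 * np.pi * G * rho_c * t / x   # Hubble rate adot/a for this solution

# The density never exceeds rho_c, the minimum of a sits at t = 0 (the
# bounce replaces the classical singularity), and the modified Friedmann
# equation holds at every grid point:
print(rho.max(), a.min())
print(np.allclose(H**2, (8 * np.pi * G / 3) * rho * (1 - rho / rho_c)))
```

Far from t = 0 the correction factor 1 − ρ/ρ_c is negligible and the classical contracting and expanding branches are recovered; the repulsion only matters near the critical density.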

Black hole collapse has also been studied in the LQG context--that is very different. In a BH collapse, there is some MATTER that collapses, but the surrounding space does not. In a LQC cosmological collapse the whole of space collapses and rebounds. I'm sure you are well aware of the difference.
...

My mention of the difference between BB and BH is just a footnote to my general rebuttal of what you said. I'm sure you are well aware of the difference. You seem knowledgeable---IMHO you have written knowledgeably about this stuff in the past---but now seem out of touch with current research.
Finbar said:
The difference has nothing to do with the presence of some surrounding space that doesn't collapse. The difference is in the symmetries and reduction to a finite number of degrees of freedom in LQC.

I'm not sure what the "difference" is that you are saying has nothing to do with this or that, but you are not up-to-date regarding "finite number of degrees of freedom". See 1211.1354
 
  • #41
Demystifier said:
That is irrelevant. Even if the number of dof's is infinite, the whole universe should be in a pure state, and therefore its von Neumann entropy is zero and does not change with time...

Demy, that is very interesting! You could be talking about the new LQC+Fock hybrid with infinite d.o.f., where the bounce is highly robust and does not require fine tuning. Why then do you say the U should be in a pure state?

I am used to seeing "squeezed" states in the LQC literature, it seems routine to employ mixed states in bounce cosmology analysis. What am I missing? Is there some PHILOSOPHICAL reason you have in mind for why the U should be in a pure state?

Correct me if I am wrong, but I think the U can ONLY be in a mixed state, simply because no observer can see all of it :biggrin: Cosmologists have a distance called the "particle horizon", estimated at 46 billion ly, which is the distance to the farthest matter from which we could in principle have received a signal. But the whole thing (assuming it is finite) is estimated to be at least several times larger.

You know the Bohr proverb about physical science: it's not about what IS but about what we can SAY about it. In that spirit, all we can say is a mixed state. And that therefore is the state.

So your statement that the U must be in a pure state is really interesting to me and I wish you would explain.
 
  • #42
Loop is transforming, and there's a short list of research fronts to watch, where the change is happening. Thanks Ben C, Demy, Tom, Finbar for starting and/or contributing to this discussion, which has underscored the importance of the hybrid LQC work by Agullo Ashtekar Nelson! I definitely have to add it to the watch list. Several of the following could turn out to be among the most important QG research papers of 2012:
hybrid LQC
An Extension of the Quantum Theory of Cosmological Perturbations to the Planck Era (1211.1354)
The pre-inflationary dynamics of loop quantum cosmology: Confronting quantum gravity with observations (in prep)
GR Thermo
General relativistic statistical mechanics (1209.0065)
Horizon entanglement entropy and universality of the graviton coupling (Bianchi's ILQGS talk and 1211.0522)
tensorialGFT (Carrozza's ILQGS talk and 1207.6734)
holonomySF (Hellmann's ILQGS talk and 1208.3388)
twistorLQG (Speziale's ILQGS talk and 1207.6348)
dust (Wise's ILQGS talk and 1210.0019)
 
  • #43
marcus said:
Correct me if I am wrong but I think the U can ONLY be in a mixed state simply because no observer can see all of it

Isn't it rather that the universe at the big bang can be viewed as being in a pure state, since all the matter was in causal contact? Then once the universe evolves we can only consider "observable" universes, which are subsystems of the whole system. These observable universes are described as mixed states obtained by tracing over the unobservable universe. So while it is correct to say that the current observable universe is in a mixed state, the universe at the big bang can be considered a pure state.

So maybe I'll do a u-turn and tentatively buy the "bounce" cosmology. We could think that the state of the universe at the bounce is a pure state. Then the universe evolves to the current day at which point each observer can only see a finite amount of the universe which will be described as a mixed state. Finally the universe then collapses at which point all the universe comes back together and a pure state is again recovered. Now the issue is why the final pure state would look anything like the initial pure state.
 
  • #44
Demystifier is right that a system in a pure state with finitely (or infinitely) many d.o.f. will remain in a pure state under (unitary) time-evolution and will therefore never have entropy > 0.

Please note that the underlined words are not well-defined or not known in general in the case of QG ;-)
 
  • #45
Finbar said:
...
So maybe I'll do a u-turn and tentatively buy the "bounce" cosmology. ... Now the issue is why the final pure state would look anything like the initial pure state.

There are certainly some extremely fascinating unresolved issues!
I can only speculate as a non-expert. My intuition says that a collapsing phase of the U filled with a gas of black holes would reach (assuming the LQC model where gravity becomes repulsive at high density) a stage where the horizons of all BH rapidly shrink, releasing Hawking radiation.

So what goes into the bounce is a universe full of Planck-scale gamma photons and Planck-scale black holes, which can interconvert.

The picture is somewhat analogous to a pair-instability supernova where the photons have become so energetic they become indistinguishable from electron-positron pairs. But that is only a crude analogy. I am groping for a picture of what it could be like when radiation and geometry interconvert the one form of energy into the other and back again, very rapidly, at Planck scale.

Just a speculative picture FWIW.

If I were Ivan Agullo or Eugenio Bianchi or Bill Nelson (who has taken a postdoc at Nijmegen in the Netherlands and will be giving talk(s) at Stockholm this month, I gather) I think I would be working towards a model of the bounce of a classical universe which collapses into a Planck soup of that kind of stuff (maybe :biggrin:) and bounces.

One thing Agullo Ashtekar Nelson say in 1211.1354 is that their followup paper is based on numerical simulations of the bounce (with perturbations) in a hybrid LQC-Fock picture. They stress the numerical simulations. That rings a bell with me. The results should be very very interesting even if one only tentatively semi-accepts the preliminary model.
 
  • #46
marcus said:
I am used to seeing "squeezed" states in the LQC literature, it seems routine to employ mixed states in bounce cosmology analysis. What am I missing?
From that sentence it looks as if you are missing the fact that a squeezed state is a pure state, not a mixed state.

marcus said:
Is there some PHILOSOPHICAL reason you have in mind for why the U should be in a pure state?
Strictly logically the Universe does not necessarily need to be in a pure state, but in the theory of decoherence it is usually assumed so. If you are not familiar with the basic ideas of decoherence, see e.g.
http://arxiv.org/abs/quant-ph/9803052
for a brief introduction.
 
  • #47
Demystifier said:
From that sentence it looks as if you are missing the fact that a squeezed state is a pure state, not a mixed state.


Strictly logically the Universe does not necessarily need to be in a pure state, but in the theory of decoherence it is usually assumed so. If you are not familiar with the basic ideas of decoherence, see e.g.
http://arxiv.org/abs/quant-ph/9803052
for a brief introduction.

Thanks for the correction! I'm also used to seeing peaked semiclassical states in LQG, are these also pure?
I'm glad that you find that it is not strictly logically necessary for the U to be in a pure state!
I will look at your link.

Ah, Claus Kiefer! He had an article just recently about decoherence in LQG, I'll get the link.
The quantum state of geometry decoheres through interaction with FERMIONS. The triad variable of LQG chooses an orientation, is forced by the presence of fermions to classicalize.
http://arxiv.org/abs/1210.0418
Interpretation of the triad orientations in loop quantum cosmology
Claus Kiefer, Christian Schell
(Submitted on 1 Oct 2012)
Loop quantum cosmology allows for arbitrary superpositions of the triad variable. We show here how these superpositions can become indistinguishable from a classical mixture by the interaction with fermions. We calculate the reduced density matrix for a locally rotationally symmetric Bianchi I model and show that the purity factor for the triads decreases by decoherence. In this way, the Universe assumes a definite orientation.
12 pages, 1 figure

I don't remember if you already commented on this paper (it came up in another thread.)
If you did I'd like very much to see your comment and would appreciate a link to your post about it. If you haven't yet I hope you will. It's interesting to see Kiefer focusing on one of the outstanding problems in LQG the orientation symmetry of the main variable (not present in theories based on the metric).

My take on it is that when Kiefer or others start with the U in a pure state and have it progressively decohere, this does not mean that in reality the U would necessarily have to start pure. The analysis just shows how it could start in a purER state and become LESS pure.
The analysis is a fortiori. It is just a convenient simplification to imagine that the system starts in a pure state, the important thing is progressive decoherence starting from whatever level of (im)purity or mixedness. I can imagine you might disagree.
 
  • #48
marcus said:
Thanks for the correction! I'm also used to seeing peaked semiclassical states in LQG, are these also pure?

Pure states are states that can be represented as state vectors in the Hilbert space.
 
  • #49
Finbar said:
Pure states are states that can be represented as state vectors in the Hilbert space.
I believe that's right, Finbar. And mixed states are probabilistic superpositions of pure states, are they not?
What I'm unsure about is the meaning of "squeezed" states. Do you have a good brief explanation, or a link for that?

If not, it probably is just a side issue, so no matter.

What interested me just now was Demy's mentioning decoherence and the work of Claus Kiefer, a longtime central figure in Quantum Gravity. Just last month Kiefer posted this paper on decoherence in LQG. I'd really appreciate your comments!

http://arxiv.org/abs/1210.0418
Interpretation of the triad orientations in loop quantum cosmology
Claus Kiefer, Christian Schell
(Submitted on 1 Oct 2012)
Loop quantum cosmology allows for arbitrary superpositions of the triad variable. We show here how these superpositions can become indistinguishable from a classical mixture by the interaction with fermions. We calculate the reduced density matrix for a locally rotationally symmetric Bianchi I model and show that the purity factor for the triads decreases by decoherence. In this way, the Universe assumes a definite orientation.
12 pages, 1 figure

It seems that purity and mixedness are not absolute properties but lie on a range. Maybe all states should be thought of as a density matrix ρ, with the degree of purity given by the trace of the square of ρ.
==quote page 7 Kiefer Schell==
A measure for the purity of the total state (15) is the trace of ρ_red², which is equal to one for a pure state and smaller than one for a mixed state; it is directly related to the linear entropy S_lin = 1 − tr ρ_red² [5]. One could also discuss the von Neumann entropy −k_B tr(ρ_red ln ρ_red), but for the present purpose it is sufficient to restrict to S_lin.
==endquote==
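The quantities in that quote are easy to compute for a generic qubit (a toy sketch, nothing LQC-specific):

```python
import numpy as np

def purity(rho):
    return float(np.trace(rho @ rho).real)

def linear_entropy(rho):
    # S_lin = 1 - tr(rho^2), the measure used in the quoted passage
    return 1.0 - purity(rho)

def vn_entropy(rho):
    # von Neumann entropy -tr(rho ln rho), via the eigenvalues
    ev = np.linalg.eigvalsh(rho)
    ev = ev[ev > 1e-12]
    return float(-np.sum(ev * np.log(ev)))

# Interpolate a single qubit from a pure state to maximal mixedness.
pure = np.diag([1.0, 0.0])
mixed = np.eye(2) / 2.0
for lam in (0.0, 0.5, 1.0):
    rho = (1 - lam) * pure + lam * mixed
    print(lam, purity(rho), linear_entropy(rho), vn_entropy(rho))
# purity falls from 1 to 1/2 while S_lin rises from 0 to 1/2 and the
# von Neumann entropy from 0 to ln 2: degrees of purity, not a dichotomy.
```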

This seems like a mathematically natural way to go. How does it strike you, Finbar? Does this appeal to you, or is perhaps how you already think of quantum states? On a continuum of mixedness?
 
  • #50
marcus said:
I believe that's right, Finbar. And mixed states are probabilistic superpositions of pure states, are they not?
What I'm unsure about is the meaning of "squeezed" states. Do you have a good brief explanation, or a link for that?

A mixed state is not a superposition, because any superposition of pure states is itself a pure state. A mixed state has an uncertainty beyond that of a quantum mechanical superposition: it requires a probability distribution over pure states. So one is formally doing quantum statistical physics once one deals with mixed states. One can also think of classical statistical states, where we have a probability distribution over classical states.
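Finbar's distinction can be made concrete in a two-level toy model; a sketch:

```python
import numpy as np

def purity(rho):
    return float(np.trace(rho @ rho).real)

# Two pure qubit states.
zero = np.array([1.0, 0.0])
one = np.array([0.0, 1.0])

# A quantum SUPERPOSITION of them is again a single ket -> still pure.
plus = (zero + one) / np.sqrt(2)
rho_sup = np.outer(plus, plus)
print(purity(rho_sup))        # 1.0

# A statistical MIXTURE (probability 1/2 for each ket) is not a ket at
# all; it is a density matrix with purity below 1.
rho_mix = 0.5 * np.outer(zero, zero) + 0.5 * np.outer(one, one)
print(purity(rho_mix))        # 0.5
```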

Not sure what a squeezed state is. Something to do with the saturated uncertainty relation?
 
  • #51
Finbar said:
Not sure what a squeezed state is. Something to do with the saturated uncertainty relation?

No idea! "Squeezed" was a new one on me when I encountered it in the LQG literature recently. What do you think of Kiefer's paper above, and the idea of treating ALL states as density matrices*, just having different degrees of purity (as measured by the trace of their squares)?

*trace class operators, to put it more generally
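For what it's worth, a squeezed state is a pure Gaussian state whose quadrature variances are rescaled so that one drops below the vacuum value while the uncertainty product stays saturated. A numerical sketch in a truncated Fock basis (the truncation size N and squeezing r are illustrative choices):

```python
import numpy as np
from scipy.linalg import expm

# Squeezed vacuum S(r)|0> in a truncated Fock basis, with the squeeze
# operator S(r) = exp( (r/2) * (a^2 - adag^2) ) for real r.
N, r = 60, 0.5
a = np.diag(np.sqrt(np.arange(1.0, N)), k=1)   # annihilation operator
adag = a.T
S = expm(0.5 * r * (a @ a - adag @ adag))

vac = np.zeros(N); vac[0] = 1.0
psi = S @ vac                                  # a single ket: a PURE state

x = (a + adag) / np.sqrt(2)                    # quadratures, hbar = 1
p = (a - adag) / (1j * np.sqrt(2))

def variance(op, state):
    mean = state.conj() @ op @ state
    return float((state.conj() @ op @ op @ state - mean**2).real)

vx, vp = variance(x, psi), variance(p, psi)
print(vx, vp, vx * vp)
# vx is squeezed below the vacuum value 1/2 (to ~exp(-2r)/2), vp is
# stretched to ~exp(2r)/2, and the product stays at the minimum 1/4.
```

This also makes Demystifier's point explicit: the squeezed state is a single ket, hence pure, however non-classical its statistics.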
 
  • #52
The Kiefer Schell paper (as it says in the abstract) is about superpositions gradually becoming indistinguishable from a classical mixture, so it should interest several of us here. There is a nice figure on page 8 which shows the purity factor of a state decline from 1 to zero over the course of "internal time". I'm not sure what internal time means here.

==quote from page 8 of Kiefer Schell==
The iteration at each time step starts with the calculation of s0 and s1, as described above. As a constraint on the numerical evolution, we normalize s0 and s1 such that tr ρ_red = 1 is always preserved. Since the initial state is unentangled, tr ρ_red² is initially equal to one. As the inner time variable increases, the total state becomes entangled, and the purity factor decreases---the gravitational variables are in a mixed state, and decoherence becomes more and more efficient. The result can be plotted as a function of the inner time variable φ, see Fig. 1.
==endquote==

Figure 1 is what I was just talking about.
When I say state (in this discussion, since thread concerns the entropy of quantum states) I mean trace class operator on the Hilbert space. IOW loosely speaking "density matrix".
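The behaviour described in that quote, purity decaying as entanglement with other degrees of freedom grows, shows up already in the smallest possible model; a sketch with one "system" qubit and one "environment" qubit standing in for the Kiefer-Schell variables:

```python
import numpy as np

# System qubit coupled to one environment qubit via an entangling
# interaction H = Z (x) Z. Start unentangled, evolve, and watch the
# purity tr(rho_red^2) of the system fall, the same qualitative
# behaviour as Fig. 1 of Kiefer-Schell.
Z = np.diag([1.0, -1.0])
H = np.kron(Z, Z)
plus = np.array([1.0, 1.0]) / np.sqrt(2)
psi0 = np.kron(plus, plus)                      # unentangled initial state

def purity_of_system(t):
    psi = np.exp(-1j * np.diag(H) * t) * psi0   # H is diagonal here
    rho = np.outer(psi, psi.conj())
    # partial trace over the environment qubit
    rho_red = np.trace(rho.reshape(2, 2, 2, 2), axis1=1, axis2=3)
    return float(np.trace(rho_red @ rho_red).real)

for t in (0.0, np.pi / 8, np.pi / 4):
    print(t, purity_of_system(t))
# purity: 1.0 at t = 0, then 0.75, then 0.5 (maximally mixed)
```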
 
  • #53
marcus said:
I'm also used to seeing peaked semiclassical states in LQG, are these also pure?
Not necessarily, but the specific peaked states you have seen probably are.

marcus said:
My take on it is that when Kiefer or others start with the U in a pure state and have it progressively decohere, this does not mean that in reality the U would necessarily have to start pure. The analysis just shows how it could start in a purER state and become LESS pure.
The analysis is a fortiori. It is just a convenient simplification to imagine that the system starts in a pure state, the important thing is progressive decoherence starting from whatever level of (im)purity or mixedness. I can imagine you might disagree.
Actually, I agree.
 
  • #55
Demystifier said:
marcus said:
...My take on it is that when Kiefer or others start with the U in a pure state and have it progressively decohere, this does not mean that in reality the U would necessarily have to start pure. The analysis just shows how it could start in a purER state and become LESS pure.
The analysis is a fortiori. It is just a convenient simplification to imagine that the system starts in a pure state, the important thing is progressive decoherence starting from whatever level of (im)purity or mixedness. I can imagine you might disagree.
Actually, I agree.

I'm glad we agree on that! Thanks for the pointer to the "quantum state" article. It's well-written and clears up some confusion on my part. The C* algebra approach seems (a bit abstract but) interesting. This paragraph was helpful (and might be to others besides myself):
==quote==
A pure quantum state is a state which can be described by a single ket vector, as described above. A mixed quantum state is a statistical ensemble of pure states (see quantum statistical mechanics). Equivalently, a mixed-quantum state on a given quantum system described by a Hilbert space H naturally arises as a pure quantum state (called a purification) on a larger bipartite system H ⊗ K, the other half of which is inaccessible to the observer.
A mixed state cannot be described as a ket vector. Instead, it is described by its associated density matrix (or density operator), usually denoted ρ. Note that density matrices can describe both mixed and pure states, treating them on the same footing.
...
A simple criterion for checking whether a density matrix is describing a pure or mixed state is that the trace of ρ² is equal to 1 if the state is pure, and less than 1 if the state is mixed.[4] Another, equivalent, criterion is that the von Neumann entropy is 0 for a pure state, and strictly positive for a mixed state.
==endquote==
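The purification mentioned in the quoted passage can be exhibited concretely (a generic two-level sketch, with K playing the role of the inaccessible half):

```python
import numpy as np

rng = np.random.default_rng(1)

# A mixed state rho on a 2-dimensional H: a convex mixture of two
# orthonormal kets u_0, u_1 with weights p_0, p_1.
p = np.array([0.7, 0.3])
U = np.linalg.qr(rng.normal(size=(2, 2)) + 1j * rng.normal(size=(2, 2)))[0]
rho = (U * p) @ U.conj().T                 # rho = sum_i p_i |u_i><u_i|

# tr(rho^2) = 0.7^2 + 0.3^2 = 0.58 < 1: mixed, so no single ket on H.
print(np.trace(rho @ rho).real)

# Purification: the single ket |Psi> = sum_i sqrt(p_i) |u_i> (x) |i>
# on the doubled space H (x) K.
e = np.eye(2)
Psi = sum(np.sqrt(p[i]) * np.kron(U[:, i], e[i]) for i in range(2))
rho_big = np.outer(Psi, Psi.conj())
print(np.trace(rho_big @ rho_big).real)    # pure on H (x) K

# Tracing out the "inaccessible" factor K recovers the original rho.
rho_back = np.trace(rho_big.reshape(2, 2, 2, 2), axis1=1, axis2=3)
print(np.allclose(rho_back, rho))          # True
```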

This seems to afford the right context in which to look at the issue of entropy in LQC bounce. I'll bring forward the Kiefer Schell details from a few posts back. It's not about bounce (but about geometry settling into an orientation) nevertheless I think it shows how one might set the problem up.

http://arxiv.org/abs/1210.0418
Interpretation of the triad orientations in loop quantum cosmology
Claus Kiefer, Christian Schell
(Submitted on 1 Oct 2012)
Loop quantum cosmology allows for arbitrary superpositions of the triad variable. We show here how these superpositions can become indistinguishable from a classical mixture by the interaction with fermions. We calculate the reduced density matrix for a locally rotationally symmetric Bianchi I model and show that the purity factor for the triads decreases by decoherence. In this way, the Universe assumes a definite orientation.
12 pages, 1 figure

[As the wikiP that Demy linked points out] purity and mixedness are not absolute properties but lie on a range. Maybe all states should be thought of as a density matrix ρ, with the degree of purity given by the trace of the square of ρ.
==quote page 7 Kiefer Schell==
A measure for the purity of the total state (15) is the trace of ρ_red², which is equal to one for a pure state and smaller than one for a mixed state; it is directly related to the linear entropy S_lin = 1 − tr ρ_red² [5]. One could also discuss the von Neumann entropy −k_B tr(ρ_red ln ρ_red), but for the present purpose it is sufficient to restrict to S_lin.
==endquote==
 
  • #56
Maybe during the (repulsive gravity) phase of the bounce all horizons are destroyed and all information becomes accessible to the observer. So the statistical quantum state of the prior classical phase is driven to purity. This could be a way of addressing the issue raised by Finbar.
Finbar said:
...
So maybe I'll do a u-turn and tentatively buy the "bounce" cosmology. We could think that the state of the universe at the bounce is a pure state. Then the universe evolves to the current day at which point each observer can only see a finite amount of the universe which will be described as a mixed state. Finally the universe then collapses at which point all the universe comes back together and a pure state is again recovered. Now the issue is why the final pure state would look anything like the initial pure state.

Just a note: Commonly in LQC modeling they include Lambda and there is no re-collapse--our classical phase just keeps expanding in the future. They do also study repeated bounce models (zero Lambda) but it isn't necessary. When you run the model back in time it bounces and expands (IOW you see a collapsing prior classical U). So we can rephrase your puzzle and it is just as puzzling put this way---with a single bounce.

I guess one still has to wonder what sort of thing that could be considered an observer could survive through a bounce, and maintain its integrity/identity. But let's set that question aside and assume everything is well-defined. The puzzle that won't go away is how a mixed state in the prior collapsing phase (where lots of information starts out being inaccessible to the observer) can become pure.

"In a moment, in the twinkling of an eye..." :biggrin:
 
  • #57
On reflection, I've concluded that the way people are going to understand these issues will likely be to go back to the June 1994 Connes-Rovelli paper.
http://arxiv.org/abs/gr-qc/9406019
Von Neumann Algebra Automorphisms and Time-Thermodynamics Relation in General Covariant Quantum Theories
A. Connes, C. Rovelli
(Submitted on 14 Jun 1994)
We consider the cluster of problems raised by the relation between the notion of time, gravitational theory, quantum theory and thermodynamics; in particular, we address the problem of relating the "timelessness" of the hypothetical fundamental general covariant quantum field theory with the "evidence" of the flow of time. By using the algebraic formulation of quantum theory, we propose a unifying perspective on these problems, based on the hypothesis that in a generally covariant quantum theory the physical time-flow is not a universal property of the mechanical theory, but rather it is determined by the thermodynamical state of the system ("thermal time hypothesis"). We implement this hypothesis by using a key structural property of von Neumann algebras: the Tomita-Takesaki theorem, which allows to derive a time-flow, namely a one-parameter group of automorphisms of the observable algebra, from a generic thermal physical state. We study this time-flow, its classical limit, and we relate it to various characteristic theoretical facts, as the Unruh temperature and the Hawking radiation. We also point out the existence of a state-independent notion of "time", given by the canonical one-parameter subgroup of outer automorphisms provided by the Cocycle Radon-Nikodym theorem.
25 pages

A unified framework for spacetime geometry, quantum theory, and thermodynamics seems to be needed. The vN-algebra approach seems to provide it. I had the luck to be exposed to C*-algebras in grad school around the time Dixmier's book first came out in English (1977). I guess we should say "von Neumann algebra". How is Loop gravity going to be rebuilt in vN-algebra terms? All states of the universe are mixed, with different degrees of purity. What equation drives the state to high levels of purity at or around the bounce?

It must have to do with the dissipation of horizons. They must shrink to nothing, or burst, during collapse to the extreme density. How does one formulate the concept of horizon in the vN-algebra setting? The "purification" bipartite factorization of the Hilbert space might be used: H⊗K, described in the article Demy pointed to, with K standing for information "inaccessible to the observer".
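In finite dimensions the Tomita-Takesaki machinery reduces to something one can check directly: the modular flow of a faithful state ρ is σ_t(A) = ρ^{it} A ρ^{-it}, and ρ is automatically thermal (KMS) at inverse temperature 1 with respect to that flow. A toy check (a finite matrix sketch, not the actual von Neumann algebra construction):

```python
import numpy as np

rng = np.random.default_rng(2)

def modular_flow(rho, A, t):
    """sigma_t(A) = rho^{it} A rho^{-it}: the modular (thermal-time)
    flow generated by the state rho itself, in finite dimensions."""
    evals, V = np.linalg.eigh(rho)
    evals = evals.astype(complex)
    rho_it = (V * evals**(1j * t)) @ V.conj().T
    rho_mit = (V * evals**(-1j * t)) @ V.conj().T
    return rho_it @ A @ rho_mit

# A faithful (full-rank) state and two Hermitian observables.
w = rng.random(4) + 0.1
rho = np.diag(w / w.sum())
A = rng.normal(size=(4, 4)); A = A + A.T
B = rng.normal(size=(4, 4)); B = B + B.T

# KMS condition at inverse temperature 1: continuing the flow to
# imaginary time t = i swaps the operator ordering under the state,
#   tr(rho sigma_i(A) B) = tr(rho B A).
lhs = np.trace(rho @ modular_flow(rho, A, 1j) @ B)
rhs = np.trace(rho @ B @ A)
print(np.allclose(lhs, rhs))   # True
```

The thermal time hypothesis runs this logic in reverse: instead of starting from a time evolution and finding its thermal states, start from the state and let its modular flow define physical time.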
 
