What were the major QG advancements in 2007?

In summary, the conversation was about the QG paper of the year for 2007. There were two clear choices: Marseille and Utrecht. Rovelli's team in Marseille made significant progress by fulfilling the initial spinfoam program goals, handling the Lorentz case, getting the Immirzi parameter to arise in spinfoam, and making spinfoam compatible with canonical LQG. On the other hand, Loll's team in Utrecht made advancements in getting small quantum universes with matter to emerge in computer simulations. They also found evidence for a fractal spacetime foam on Planckian distance scales.

Which will have the greatest impact on future QG research?

  • the Marseille paper on a new spinfoam vertex
  • Total voters: 6
  • #1
marcus
QG Paper of the Year---2007

In the poll please ignore the "(blue)" and "(green)" that got in by mistake. I tried but was unable to edit them out.

To me, QG paper of the year looks like a clear choice between Marseille and Utrecht. Both have made major progress this year, which can be summed up by pointing to one paper by Rovelli's team and two by Loll's.
I've color-coded for identification.

[EDIT in the poll choices, I stated the colors wrong at first and said blue/green by mistake. I can't edit the poll questions, so please ignore the error.]

Rovelli's (red) Marseille team essentially fulfilled the initial spinfoam program goals.
http://arxiv.org/abs/0711.0146 (8 pages)
LQG vertex with finite Immirzi parameter
1. they handled the Lorentz case (not just the simpler Euclidean case)
2. they got the Immirzi parameter to arise in spinfoam, which it hadn't till now
3. they got discrete area spectrum like in LQG
4. they got spinfoam to be compatible with canonical LQG.
There's still more to do, as usual! But the year saw a dramatic advance on this front.

Here's their abstract: "We extend the definition of the 'flipped' loop-quantum-gravity vertex to the case of a finite Immirzi parameter. We cover the Euclidean as well as the Lorentzian case. We show that the resulting dynamics is defined on a Hilbert space isomorphic to the one of loop quantum gravity, and that the area operator has the same discrete spectrum as in loop quantum gravity. This includes the correct dependence on the Immirzi parameter, and, remarkably, holds in the Lorentzian case as well. The ad hoc flip of the symplectic structure that was initially required to derive the flipped vertex is not anymore needed for finite Immirzi parameter. These results establish a bridge between canonical loop quantum gravity and the spinfoam formalism in four dimensions."
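For concreteness, the discrete area spectrum the abstract refers to has the standard LQG form for a single puncture of spin j: A_j = 8πγ ℓ_P² √(j(j+1)), where γ is the Immirzi parameter. A quick sketch (the value of γ below is just an illustrative choice, not one taken from the paper):

```python
import math

def area_eigenvalue(j, gamma, l_planck=1.0):
    """LQG area eigenvalue for a single puncture of spin j,
    A_j = 8 * pi * gamma * l_P^2 * sqrt(j(j+1)),
    returned in units of l_planck^2."""
    return 8 * math.pi * gamma * l_planck**2 * math.sqrt(j * (j + 1))

# gamma here is a commonly quoted value from black-hole entropy
# matching -- used only for illustration:
gamma = 0.2375
for k in range(1, 5):
    j = k / 2  # spins run over half-integers: 1/2, 1, 3/2, 2, ...
    print(f"j = {j}: A = {area_eigenvalue(j, gamma):.3f} l_P^2")
```

The key point the Marseille paper establishes is that this spectrum, with its γ-dependence intact, comes out of the new spinfoam vertex as well, not just out of canonical LQG.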

Loll's (blue) Utrecht team is getting small quantum universes, with matter, to emerge in computer QG simulations. They keep finding more out about these universes taking shape in the computer. It is pretty remarkable and I'd better just let you look at their abstracts and the papers themselves.

http://arxiv.org/abs/0712.2485 (10 pages)
Planckian Birth of the Quantum de Sitter Universe
http://arxiv.org/abs/0711.0273 (21 pages)
The Emergence of Spacetime, or, Quantum Gravity on Your Desktop
"We show that the quantum universe emerging from a nonperturbative, Lorentzian sum-over-geometries can be described with high accuracy by a four-dimensional de Sitter spacetime. By a scaling analysis involving Newton's constant, we establish that the linear size of the quantum universes under study is in between 17 and 28 Planck lengths. Somewhat surprisingly, the measured quantum fluctuations around the de Sitter universe in this regime are to good approximation still describable semiclassically. The numerical evidence presented comes from a regularization of quantum gravity in terms of causal dynamical triangulations."

"Is there an approach to quantum gravity which is conceptually simple, relies on very few fundamental physical principles and ingredients, emphasizes geometric (as opposed to algebraic) properties, comes with a definite numerical approximation scheme, and produces robust results, which go beyond showing mere internal consistency of the formalism? The answer is a resounding yes: it is the attempt to construct a nonperturbative theory of quantum gravity, valid on all scales, with the technique of so-called Causal Dynamical Triangulations. Despite its conceptual simplicity, the results obtained up to now are far from trivial. Most remarkable at this stage is perhaps the fully dynamical emergence of a classical background (and solution to the Einstein equations) from a nonperturbative sum over geometries, without putting in any preferred geometric background at the outset. In addition, there is concrete evidence for the presence of a fractal spacetime foam on Planckian distance scales. The availability of a computational framework provides built-in reality checks of the approach, whose importance can hardly be overestimated."
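The "fractal spacetime foam" claim rests on measuring a spectral dimension: release a random walker on the geometry and read the dimension off the return probability, which scales as P(t) ~ t^(-d_s/2). The toy sketch below is not CDT itself (there the walk runs on the dynamically generated triangulations); it only illustrates the measurement technique on a flat 2-D lattice, where the answer should come out close to 2:

```python
import math
import random

def return_probability(dim, steps, walkers, seed):
    """Fraction of simple random walks on a dim-dimensional lattice
    that sit at the origin again after `steps` steps."""
    rng = random.Random(seed)
    returned = 0
    for _ in range(walkers):
        pos = [0] * dim
        for _ in range(steps):
            axis = rng.randrange(dim)        # pick a random axis...
            pos[axis] += rng.choice((-1, 1)) # ...and hop one site along it
        if all(x == 0 for x in pos):
            returned += 1
    return returned / walkers

# From P(t) ~ t^(-d_s/2), two measurement times t1 < t2 (both even,
# so a return is possible) give the spectral dimension as a log-slope:
t1, t2 = 16, 64
p1 = return_probability(2, t1, 40_000, seed=1)
p2 = return_probability(2, t2, 40_000, seed=2)
d_s = -2 * (math.log(p2) - math.log(p1)) / (math.log(t2) - math.log(t1))
print(f"estimated spectral dimension: {d_s:.2f}")  # close to 2 for a flat 2-D lattice
```

In the CDT simulations the same diagnostic gives roughly 4 at large scales but drops toward 2 at Planckian scales, which is what "fractal spacetime foam" refers to.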
 
  • #2
Well so far FOUR have voted, other is leading and it's an even race between Utrecht and Marseille.
Actually I suspect that "GraviTEA" is from Utrecht and was just being nice to the Marseille side by voting for them.

I should highlight some things from the Utrecht papers to show why I think they stand out.

"Most remarkable at this stage is perhaps the fully dynamical emergence of a classical background (and solution to the Einstein equations) from a nonperturbative sum over geometries, without putting in any preferred geometric background at the outset."

1. they don't invent new hardware----no strings, branes, loops, extra dimensions or other junk

2. it's NOT perturbative----nonperturbative means no "gravitons"---they don't cheat by starting with a solution and making slight variations around it

3. it's a path integral, Feynman style----add up a regularized sum of all the ways space can evolve from shape A to shape B----weighted according to the Einstein-Hilbert-Regge idea of how space interacts with itself as it evolves: the path integral is a weighted average over all geometric trajectories.

4. nothing about spacetime is determined in advance---not even its dimension.
all you give is rules for how it interacts with itself (and matter when that is present) at arbitrarily small scale------micromicroscopic quantum dynamics.

You don't necessarily even GET a whole number for the dimension at very small scale.
Dimensionality is a scale-dependent quantum observable--which doesn't even need to be integer.

5. Smoothness is only an illusion appearing at large scale.

6. Spacetime is autonomous, each time it arises in the computer it is different.

7. There is no minimum distance. At each stage you fix a triangle size a, which in theory goes to zero. You can make it as small as you want by putting more triangle building blocks into the computer memory-----the only limits are computing power, CPU time, and memory.

8. they succeeded in CALCULATING what the Planck length is, in terms of their triangle size a. They calculate what Newton G is, in terms of that. They figure out what size cosmo-constant Lambda has to be, for things to work.

With their present computer resources they run sims with up to 1/3 of a million blocks which means the universes that pop into simulated existence have radius like 20 Planck lengths. And surprisingly that is already big enough for them to show classical behavior--approximate smoothness, predictable quantum fluctuations around the expected shape.
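Point 8 above amounts to a unit conversion, and the arithmetic can be sketched in a few lines. The numbers below are made-up placeholders, not the Utrecht group's fitted values: with ħ = c = 1, Newton's constant in four dimensions has dimension length², so ℓ_P = √G, and measuring G in units of the lattice spacing a gives ℓ_P/a directly.

```python
import math

# Placeholder inputs -- NOT the measured values from the simulations:
G_lattice = 0.25   # dimensionless Newton constant, in units of a^2 (assumed)
r_lattice = 10.0   # linear size of the simulated universe, in units of a (assumed)

# With hbar = c = 1, l_P = sqrt(G), so in lattice units:
l_p_over_a = math.sqrt(G_lattice)

# Size of the universe expressed in Planck lengths:
size_in_planck = r_lattice / l_p_over_a
print(size_in_planck)  # 20.0
```

This is how a universe of modest lattice size can nonetheless be quoted as "about 20 Planck lengths across": the conversion factor ℓ_P/a is itself an output of the simulation, not an input.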
 
  • #3
Dear Marcus,

What about Edward Witten's paper? Or are string theorists' papers and string-inspired gravity ideas (e.g. AdS/CFT) excluded at the outset? Anything Ed writes usually gets plenty of citations, hence influence on future research.

Edward Witten (submitted 22 Jun 2007), http://arxiv.org/abs/0706.3359
Abstract: "We consider the problem of identifying the CFT's that may be dual to pure gravity in three ..."

marcus said:
Well so far FOUR have voted, other is leading and it's an even race between Utrecht and Marseille.
...
 

FAQ: What were the major QG advancements in 2007?

What was the "QG Paper of the Year 2007" poll about?

The poll asked which 2007 quantum gravity paper would have the greatest impact on future QG research. The thread's two leading candidates were a paper from Rovelli's group in Marseille and a pair of papers from Loll's group in Utrecht.

What did the Marseille paper accomplish?

"LQG vertex with finite Immirzi parameter" (arXiv:0711.0146) extended the 'flipped' spinfoam vertex to a finite Immirzi parameter, covered the Lorentzian as well as the Euclidean case, recovered the discrete LQG area spectrum with the correct Immirzi dependence, and established a bridge between canonical loop quantum gravity and the spinfoam formalism in four dimensions.

What did the Utrecht papers accomplish?

"Planckian Birth of the Quantum de Sitter Universe" (arXiv:0712.2485) and "The Emergence of Spacetime, or, Quantum Gravity on Your Desktop" (arXiv:0711.0273) showed that the quantum universes produced by Causal Dynamical Triangulations simulations are described to high accuracy by four-dimensional de Sitter spacetime, with linear sizes between 17 and 28 Planck lengths, and presented evidence for a fractal spacetime foam on Planckian distance scales.

Why were these results considered significant?

The Marseille result essentially fulfilled the initial goals of the spinfoam program and connected spinfoams to canonical LQG. The Utrecht result demonstrated the fully dynamical emergence of a classical background, a solution of the Einstein equations, from a nonperturbative sum over geometries, without any preferred geometric background put in at the outset.
