- #36
marcus
I'm back; the stew is simmering in the oven. But it is too late to continue the other post, so I'll start a new one.
=============
My attitude toward the Utrecht model quantum universe is that it is one example out of a handful of quantum geometry models---one which has reached the break-out stage.
One where they have reached the stage of running computer simulations of the universe and having spacetime emerge as an epiphenomenon:
the appearance of classical smoothness (satisfying the Einstein equation) arising from microscopic quantum roughness and confusion.
Spacetime as a path integral---an average of all the different crazy ways of getting from this spatial geometry to that one. Maybe 4D spacetime doesn't exist; maybe it is always just a path from this space-state to that one.
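To make the "average over crazy paths" idea concrete, here is a toy sketch (my own illustration in Python, not the Utrecht group's code): a Euclidean path integral for a single free particle, sampled with the Metropolis algorithm. Each sampled path is jagged, but the average over many of them is the smooth classical straight line; that is the same moral as smooth geometry emerging from quantum roughness.

```python
import numpy as np

# Toy path integral (my illustration, not the actual Utrecht/CDT code):
# a free particle on a Euclidean time lattice, endpoints pinned at x=0
# and x=1, paths sampled with Metropolis.
rng = np.random.default_rng(0)

N = 50                          # time slices
dt = 0.1                        # lattice spacing
x = np.linspace(0.0, 1.0, N)    # starting path; endpoints stay fixed
eps = 0.4                       # proposal step size
n_sweeps, n_therm = 5000, 500

def local_action(x, i):
    # the pieces of S = sum_j (x[j+1]-x[j])^2 / (2 dt) that touch site i
    return ((x[i] - x[i-1])**2 + (x[i+1] - x[i])**2) / (2.0 * dt)

avg = np.zeros(N)
n_meas = 0
for sweep in range(n_sweeps):
    for i in range(1, N - 1):                  # interior sites only
        old, s_old = x[i], local_action(x, i)
        x[i] = old + eps * (2.0 * rng.random() - 1.0)
        if rng.random() >= np.exp(-(local_action(x, i) - s_old)):
            x[i] = old                         # reject the move
    if sweep >= n_therm:                       # measure after thermalizing
        avg += x
        n_meas += 1

avg /= n_meas
# individual paths are jagged; the average comes out close to the
# straight classical line from 0 to 1
print(np.round(avg[::10], 2))
```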
So my attitude is to watch for other approaches, like spinfoams, to get to a stage where they have a path integral and where they can run their models on a computer and get smooth classical geometry to emerge as a large-scale average.
And I want to see if there are OTHER approaches that, when they get to this stage, also predict fractal microstructure and a decline in dimensionality at small scales.
There have been hints of this kind of development. Freidel just posted a paper obtaining a path integral for spinfoams (making the spinfoam approach look more like the Utrecht model---evolutionary convergence), and two papers about this were delivered at the international QG conference last week in the UK.
Martin Reuter at the University of Mainz has an approach which actually preceded the Utrecht people in finding hints of a spacetime dimension around 2 at very small scales---hints of microscopic fractal structure. His is yet a different approach (not spinfoams, not simplexes), which in other respects is perhaps not as satisfactory, but it at least confirms this point coming from a different direction.
These are just straws in the wind.
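In case it helps to see what "dimension around 2" even means operationally, here is a sketch of the spectral-dimension diagnostic (my paraphrase of the method, run on an ordinary flat 2D lattice rather than on a simulated quantum geometry): let a random walker diffuse for sigma steps, record the probability P(sigma) of returning to the start, and read the dimension off the scaling P(sigma) ~ sigma^(-d_s/2).

```python
import numpy as np

# Spectral-dimension diagnostic (my sketch of the method, on a flat 2D
# square lattice where the answer should come out close to 2, not on a
# CDT geometry): P(sigma) ~ sigma^(-d_s/2), so the slope of ln P versus
# ln sigma gives -d_s/2.
rng = np.random.default_rng(1)
moves = np.array([[1, 0], [-1, 0], [0, 1], [0, -1]])
sigmas = np.array([8, 16, 32, 64, 128])   # even diffusion times only
n_walkers = 100_000

ret_prob = []
for sigma in sigmas:
    pos = np.zeros((n_walkers, 2), dtype=np.int64)
    for _ in range(sigma):                 # one step for every walker
        pos += moves[rng.integers(0, 4, size=n_walkers)]
    ret_prob.append(np.all(pos == 0, axis=1).mean())

slope = np.polyfit(np.log(sigmas), np.log(ret_prob), 1)[0]
print("estimated spectral dimension:", -2.0 * slope)   # ~2 here
```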
oldman said:
In suggesting that the number of dimensions may change with scale, from 2 to 4, the authors seem to imply that some 'dimensions' emerge? unfold? or are born? out of quantum chaos. If so, it seems possible that the relation between our time and space dimensions, as measured by the ratio of their GR metric coefficients, could change. I've often wondered if this ratio is eternal. If it were to change...
If I understand you correctly, you mean an evolving speed of light. In their model they do have a shape parameter: the ratio of the timelike edges to the spacelike edges of the simplex. The simplex does not need to be equilateral, but all simplexes are identical. They have sometimes played around with this shape-ratio parameter, but it stays constant for all simplexes for the duration of the computer run. As far as I know it has always been treated as a constant; I don't know the very latest.
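For reference, in the published CDT papers this shape ratio is written as an asymmetry parameter alpha; the sketch below is my paraphrase of their convention, so check the papers for the exact signs.

```latex
% My paraphrase of the standard CDT convention: spacelike and timelike
% edges differ by one fixed asymmetry parameter \alpha for the whole run.
\ell_{\text{space}}^2 = a^2, \qquad
\ell_{\text{time}}^2 = -\alpha\, a^2, \qquad
\alpha > 0 \ \text{held fixed for every simplex}
```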
It doesn't matter what my hunch is, of course, because I could so easily be wrong. My hunch is that a variable speed of light will never come out of the Utrecht model; the model is too simple. As the SciAm article says, it is hard to imagine a more minimalist way to treat quantum gravity: minimal paraphernalia, minimal assumptions, simple rules.
It might fall to some other computer-modeling approach farther down the road to try a variable speed of light, not this one---but I could easily be mistaken.