Maximum value of Hubble parameter (near start of expansion)

In summary: It seems that the force is due to the quantum geometry corrections, which create a "repulsive" force. This force is negligible under normal conditions, but becomes dominant as the curvature approaches the Planck scale. This would halt the collapse that would classically lead to a singularity.
  • #1
marcus
An estimate of 0.94/tPlanck for the maximum reached by the Hubble parameter was given by Ashtekar and Sloan here (based on Loop qc):

http://arxiv.org/abs/0912.4093
Loop quantum cosmology and slow roll inflation
Abhay Ashtekar, David Sloan
(Submitted on 21 Dec 2009)
"In loop quantum cosmology (LQC) the big bang is replaced by a quantum bounce which is followed by a robust phase of super-inflation. Rather than growing unboundedly in the past, the Hubble parameter vanishes at the bounce and attains a finite universal maximum at the end of super-inflation. These novel features lead to an unforeseen implication: in presence of suitable potentials all LQC dynamical trajectories are funneled to conditions which virtually guarantee slow roll inflation with more than 68 e-foldings, {without any input from the pre-big bang regime}. This is in striking contrast to certain results in general relativity, where it is argued that the a priori probability of obtaining a slow roll with 68 or more e-foldings is suppressed by a factor e^-204."

See also page 12 of a short review written for cosmologists, which includes the above results along with others:
http://arxiv.org/abs/1005.5491
The Big Bang and the Quantum

The finite universal maximum of the Hubble parameter is estimated to be between 60 and 61 orders of magnitude greater than today's value.

This is not the "initial" value of H. According to the prevailing LQC model, the value of H is zero at the precise start of expansion.

It increases quickly, however, due to quantum corrections that make gravity repulsive at near-Planckian density, and reaches its maximum within a few Planck times. Thereafter it declines.
 
  • #2
With quantum corrections (which depend on the density) the Friedmann equation becomes

(a'/a)^2 = H^2 = (8πG/3) ρ (1 - ρ/ρ_crit)

where the critical density is ρ_crit = sqrt(3)/(32 π^2 γ^3) times the usual Planck density, which is c^5/(hbar G^2)

This number sqrt(3)/(32 π^2 γ^3) turns out to be 0.41 if you plug in a commonly used figure of γ = 0.237 for the Immirzi parameter.

So the critical density, which is reached right at the end of contraction and beginning of expansion, is commonly given as 0.41 ρPlanck, in other words 41 percent of Planck density.
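To make the corrected Friedmann equation concrete, here is a minimal Python sketch (the function names are mine), working in Planck units where G = 1, ρ is in Planck densities and H is in units of 1/tPlanck. It shows that H vanishes at ρ = ρ_crit (the bounce) and, since H^2 is quadratic in ρ, peaks at ρ = ρ_crit/2:

```python
import math

def critical_density(gamma):
    # rho_crit as a fraction of the Planck density: sqrt(3)/(32 pi^2 gamma^3)
    return math.sqrt(3) / (32 * math.pi**2 * gamma**3)

def hubble(rho, rho_c):
    # Quantum-corrected Friedmann equation: H^2 = (8 pi/3) rho (1 - rho/rho_c)
    return math.sqrt(8 * math.pi / 3 * rho * (1 - rho / rho_c))

rho_c = critical_density(0.237)
print(round(rho_c, 2))                      # 0.41 -- 41% of Planck density
print(hubble(rho_c, rho_c))                 # 0.0 -- H vanishes at the bounce
print(round(hubble(rho_c / 2, rho_c), 2))   # 0.93 -- maximum, at rho = rho_c/2
```

The quadratic factor ρ(1 - ρ/ρ_c) is maximized at ρ_c/2, which is why the maximum Hubble rate is reached partway through super-inflation rather than at the bounce itself.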

The maximum value attained by the Hubble parameter can be solved for analytically and is, in Planck terms:
1/(2γ sqrt(8πγ sqrt(3)/2))

Plugging in a conventional value for the Immirzi gives
1/(2*.237*sqrt(8*pi*.237*sqrt(3)/2)) or approximately 0.93.
Ashtekar's paper gives 0.94 which simply means he used more decimal places in writing 0.236... and I used the rounded 0.237.
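As a consistency check (a quick sketch; the variable names are mine), the closed-form value agrees with the maximum of the corrected Friedmann equation H^2 = (8π/3)ρ(1 - ρ/ρ_crit), which occurs at ρ = ρ_crit/2:

```python
import math

gamma = 0.237
rho_c = math.sqrt(3) / (32 * math.pi**2 * gamma**3)   # ~0.41 of Planck density
# H evaluated at rho = rho_c/2, the maximum of H^2 = (8 pi/3) rho (1 - rho/rho_c)
h_from_friedmann = math.sqrt(8 * math.pi / 3 * (rho_c / 2) * 0.5)
# The closed-form expression quoted above
h_closed = 1 / (2 * gamma * math.sqrt(8 * math.pi * gamma * math.sqrt(3) / 2))
print(round(h_closed, 2))                        # 0.93
print(abs(h_from_friedmann - h_closed) < 1e-12)  # True -- the same number
```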
 
  • #3
It's natural to ask why cosmologists are interested in non-singular early universe models and, in particular, the LQC bounce model.
The answer seems to be that it has observable consequences that one can test.

To find out more about this look up papers by phenomenologists like Julien Grain, Wen Zhao, Aurelien Barrau, Jakub Mielczarek. The key to effective testing would be a spacecraft mapping CMB polarization somewhat more precisely than is possible with the ESA's currently operating Planck instrument.

I was curious about the dependence of the Hubble maximum (and the maximum density) on which value is chosen for the Immirzi parameter. There are two in use.
Ashtekar's group uses 0.237 and Rovelli's group uses 0.274.

So Penn State cosmology papers give the max density as 41% of Planck, while Marseille cosmology papers just leave things expressed in terms of gamma---but if they plugged in their preferred value they would get a max density of 27% (0.2666, rounded to two places).

I gave the formula for the maximum or critical density at the bounce earlier; in terms of the usual Planck density it is sqrt(3)/(32 π^2 γ^3). You just have to plug in for gamma and you get either 0.41 or 0.27.

The Planck unit for the Hubble parameter is 1/tP, the reciprocal of the Planck time.

The formula for the maximum Hubble attained, expressed in terms of this Planck unit, is

1/(4 γ^1.5 sqrt(π sqrt(3)))

Again, plugging in γ = 0.274, which the Marseille people prefer, one gets 0.75.

If you happen to use the google calculator that means typing this into the search window:
1/(4*.274^1.5*sqrt(pi*sqrt(3)))
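The same arithmetic can be done in Python (a sketch; the function names are mine), evaluating both quantities for the two values of the Immirzi parameter in use:

```python
import math

def rho_crit(gamma):
    # Critical (bounce) density as a fraction of the Planck density
    return math.sqrt(3) / (32 * math.pi**2 * gamma**3)

def h_max(gamma):
    # Maximum Hubble parameter in units of 1/t_Planck
    return 1 / (4 * gamma**1.5 * math.sqrt(math.pi * math.sqrt(3)))

print(round(rho_crit(0.274), 2))  # 0.27 -- Marseille value of gamma
print(round(h_max(0.274), 2))     # 0.75
print(round(rho_crit(0.237), 2))  # 0.41 -- Penn State value
print(round(h_max(0.237), 2))     # 0.93
```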

If anyone is interested in calculating or knowing the current Hubble time in Planck units, you can get this easily with the google calculator. Just put this in the window:
sqrt(c^5/(hbar*G))/(71 km/s per Mpc)
The calculator will give you 8.06 × 10^60, which is the Hubble time (the reciprocal of the Hubble parameter) expressed in Planck time units.
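The same number can be reproduced in plain Python instead of the google calculator (a sketch; the SI constants below are standard values I have plugged in, not from the thread):

```python
import math

c = 2.99792458e8        # speed of light, m/s
hbar = 1.054571817e-34  # reduced Planck constant, J s
G = 6.674e-11           # Newton's constant, m^3 kg^-1 s^-2
Mpc = 3.0857e22         # one megaparsec in metres

inv_t_planck = math.sqrt(c**5 / (hbar * G))  # 1/t_Planck in s^-1
H0 = 71e3 / Mpc                              # 71 km/s per Mpc, in s^-1

hubble_time_planck_units = inv_t_planck / H0
print(f"{hubble_time_planck_units:.2e}")     # 8.06e+60
```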
 
  • #4
marcus said:
Loop quantum cosmology and slow roll inflation
Abhay Ashtekar, David Sloan
(Submitted on 21 Dec 2009).

One key point that did not seem to be explained within the paper (and for which no references were given) was the nature of the quantum geometry corrections.

the non-perturbative quantum geometry corrections create a ‘repulsive’ force. While this force is negligible under normal conditions, it dominates when curvature approaches the Planck scale and can halt the collapse that would classically have led to a singularity.

Can you explain in simple terms what kinds of corrections these are, and how they would create a repulsive force?
 
  • #5
Still trying to track down a simple explanation of these corrections in LQG. It sounds like they are a result of holonomy~flux being employed as the basic conjugate pair - an area version of the more usual location~momentum pairing? So the extra dimensionality giving different answers on what happens on entering the quantum scale?

Still no idea why this should turn repulsive at the scale of the bounce.
 
  • #6
apeiron said:
...Still no idea why this should turn repulsive at the scale of the bounce.

I just saw your post. I've several vague-ish ideas about that. One is to recall the HUP that matter doesn't like to be pinned down. If you try to nail the position then the momentum goes wild. At a certain point (quantum) geometry resists definition too. It would seem.

Another thought is that in Loop Gravity the area operator turns out to have discrete spectrum: there is a minimum nonzero area, so something weird happens if you try to make area too small.
This isn't really kosher, but the dimension of curvature IS after all reciprocal area. That suggests something weird happens if you try to make curvature too big.

these are unofficial intuitions and do not reflect the views of Ashtekar and Bojowald.

The bounce was discovered by Bojowald in 2001. He was then a postdoc who had just gotten his PhD. You could look up the original 2001 paper on arxiv: "The absence of a singularity..."
Until 2006 Bojowald was the main Loop Cosmology guy.

Then Ashtekar revised Loop Cosmology in 2006 with what he and co-authors called "improved dynamics."
Probably you have been looking at some Ashtekar papers from 2006-2010. I would like to know which pedagogical review by him is actually the best. He has written several.
Do you have an opinion?

Anything work especially well for you, or better than something else?
 
  • #7
marcus said:
I just saw your post. I've several vague-ish ideas about that. One is to recall the HUP that matter doesn't like to be pinned down. If you try to nail the position then the momentum goes wild. At a certain point (quantum) geometry resists definition too. It would seem.

That would be a generic effect for any QM story, and not specific to the area-based operators of LQG I would have thought?

(And it is what I mean when I say the QM scale is vague - push hard in one direction and you find yourself instead going in the orthogonal one o:))

Another thought is that in Loop Gravity the area operator turns out to have discrete spectrum: there is a minimum nonzero area, so something weird happens if you try to make area too small.
This isn't really kosher, but the dimension of curvature IS after all reciprocal area. That suggests something weird happens if you try to make curvature too big.

So pushing towards and through the Planck scale warps space so much that the little bits of space can no longer connect up. But this does not seem "weird". Shrinking the area would increase the energy density/curvature and so enforce that breaking up. This would seem to be just re-stating HUP limits, rather than being something puzzling. Am I missing something here?

Probably you have been looking at some Ashtekar papers from 2006-2010. I would like to know which pedagogical review by him is actually the best. He has written several. Do you have an opinion?

The one you linked to - The Big Bang and the Quantum - was impressively simple. But then as I say, it suddenly failed to explain or source these geometric corrections. So I am still looking for explanations.

There is this on p4 where Bojowald lists them...

Quantum geometry and quantum dynamics at the Planck scale
Martin Bojowald
http://arxiv.org/PS_cache/arxiv/pdf/0910/0910.2936v1.pdf

Three types of corrections, in general equally important:
- Entire states evolve, which spread and deform. Quantum fluctuations, correlations and higher moments are independent variables back-reacting on expectation values.
- In loop quantum gravity, holonomies h_e(A) = P exp(∫_e A^i_a τ_i dt) as non-local, non-linear functions imply higher order corrections.
- In loop quantum gravity, fluxes quantizing the metric have discrete spectra containing zero. Inverse metric components receive corrections for discrete (lattice-like) states with small elementary areas.

I'm still trying to make sense of what it means. It seems to say you have the standard back-reactions plus the holonomy~flux ones. The latter are perhaps extra because they arise from a pre-geometrical state of nature - a sea of discrete area fluctuations rather than the usual fluctuations of a continuous energy field.
 
  • #8
apeiron said:
The one you linked to - The Big Bang and the Quantum - was impressively simple. But then as I say, it suddenly failed to explain or source these geometric corrections. So I am still looking for explanations.
...

I think I owe you one. Your post #7 set me a good example and I went back and printed off that article so I can spend some quiet time with it. It is well and clearly written. It turns out I had only skimmed parts before. As you suggest, it gives only the easy parts of the derivations; for the harder parts, the results just get quoted and a reference given.

To be sure I'm getting a consistent story, I'm inclined to stick with the writings of Ashtekar and his collaborators (Corichi, Singh, Sloan, Wilson-Ewing, Henderson...). Bojowald's merit notwithstanding, I'm apprehensive that if I read his explanations it will introduce contradictions into my head. So for the time being I will not follow up on the Bojo reference you gave.

This "BB and Q" paper really is very good! I'll give the link again in case anyone else is reading the thread.
http://arxiv.org/abs/1005.5491
 
  • #9
Working through these two papers is giving more of the necessary grounding...

QUANTUM NATURE OF THE BIG BANG IN LOOP QUANTUM COSMOLOGY
Abhay Ashtekar
http://igpg.gravity.psu.edu/people/Ashtekar/articles/solvaynet.pdf

and then providing the supporting details of the numerical simulation...

Quantum Nature of the Big Bang
Abhay Ashtekar, Tomasz Pawlowski, and Parampreet Singh
http://arxiv.org/PS_cache/gr-qc/pdf/0602/0602086v2.pdf

My impression so far is that the bounce is just an artifact of the model.

At the Planck scale, we would expect to see an unoriented quantum foam. This would be a state without either positive or negative gravity (attractive or repulsive) because there would be no "connected up geometry" possible at this thermal scale. The sign of gravity would be vague (totally symmetric in terms of its "direction").

The bounce cosmology instead seeks to avoid this fate by retaining a memory of connectedness (the holonomy operator that is preserved as an effective constraint and passes unaffected through everything to bridge the universe from a crunch to a bang).

So it seems that gravity becomes repulsive because it has (briefly) an unoriented strength, yet there still exists - according to the model - a holonomy operator to reorientate it back towards a now reverse classical fate.

To me, this is "getting rid of the singularity" much like we might describe the trajectory of a ball tossed in the air. It decelerates to a halt then accelerates back down. There is still a singularity in fact - that instant where the ball is halted and could now head off in any direction surely? But whoops, there was still in fact a constraint on the ball that bridged it across the tricky moment - gravity was orienting the motion of the ball all along. :smile:

I think the general machinery of the loop approach is attractive. But this bounce cosmology appears to be an artifact of the model so far. I preferred it when loop approaches seemed to believe that the quantum foam was the ultimate ground - the naked potential out of which a crisp dimensionality self-organised.

In the quantum foam view, the holonomy operator would be part of what self-organises, not something that floats above the fray in a way that it can pass through the Planck scale unaffected.

In particular here, I am focusing on the statement in the first ref on p3...

In this representation, quantum states depend on gravitational spin-connections and matter fields, but the dependence on connections is only through holonomies h. Hence there are well-defined holonomy operators ĥ, but it turns out that there is no operator Â corresponding to the connection itself.

So h gets preserved through the bounce, while A is allowed to flip into a generally reversed state (rather than dissolving into the quantum foam as would be the case if there were no h to act as the structure-preserving, top-down acting, context).

Or as Ashtekar puts it...

Thus, in the Planck regime, although there are significant quantum fluctuations, the state has retained the `memory' that it came from a semi-classical state. We do not have a quantum foam on the other side. Rather, there is a quantum bounce. Thus, quantum geometry in the Planck regime serves as a bridge between two large classical universes.
 
  • #10
apeiron said:
...
My impression so far is that the bounce is just an artifact of the model.
...

It is certainly a robust result of the model.

Perhaps by "artifact" you mean a genuine mathematical result. Then I would agree. The bounce is remarkably robust in the sense that you keep getting it no matter what version of Loop Gravity you work with, or what parameters you plug in.

You get it using spinfoam (instead of the usual LQC Hamiltonian dynamics). You get it even with the assumption of isotropy and homogeneity relaxed. You get it using solvable models that approximate the primary model. You get it when you run the model by crunching numbers in a computer. You get it in different cases when you vary the parameters.

That's what "robust" means in this context, in case anyone is reading this who is unfamiliar with that use of the word. It's used a lot to describe Loop's bounce result.

So bounce is a robust prediction of Loop Gravity and one that has an expected signature in the CMB so that it makes Loop falsifiable.

Many of the topcited Loop papers from 2009 and 2010 are phenomenology papers about this. You can check Spires:
"dk quantum cosmology, loop space and date=2010" selecting order by cite count.

Ap, perhaps you would allow that this is what a theory in a mathematical science like cosmo is supposed to do. It is supposed to produce an experimentally checkable result. Ideally the result should be robust, so that if it is not found there is a substantial loss of credibility.

You can, of course, refer to the result as just an "artifact". Theories are artificial constructs so that seems etymologically correct---any prediction derived from a theory (esp. if it seems unavoidable, and appeared like this one in 2001 as a surprise) must be an "artifact".

Heh heh, it just sounds a little silly to call it that. :biggrin:
 
  • #11
I think the bounce cannot be traced to holonomy operators, not in any simple way.
You have been using a paper by Bojowald, write-up of a talk he gave in 2008? or 2009?
It considers a particular formulation and in that specific context he traces the bounce to
holonomies. But there are formulations without holonomies which exhibit the b.
E.g. a 2010 paper by Battisti and Marciano. I have to turn in, will think about this tomorrow.
 
  • #12
marcus said:
You can, of course, refer to the result as just an "artifact". Theories are artificial constructs so that seems etymologically correct---any prediction derived from a theory (esp. if it seems unavoidable, and appeared like this one in 2001 as a surprise) must be an "artifact".

Heh heh, it just sounds a little silly to call it that. :biggrin:

At one stage, even LQCers were admitting it was an open question. Are you saying the question is now closed?

http://www.matmor.unam.mx/~corichi/PRL-Recall.pdf

The detailed study of isotropic solutions in a symmetry reduced quantum theory of gravity known as loop quantum cosmology (LQC) [2] has shown that for a universe filled with a massless scalar field, with or without a cosmological constant, the singularity gets replaced by a quantum bounce [3]. Whether singularity resolution is an artifact of the symmetries or a generic feature of quantum cosmology is an open question.
 
  • #13
You said artifact of the model. You mentioned features of the model like holonomies as if the mathematical tools somehow caused the bounce :biggrin:

Corichi and Singh said "symmetries". There is a subtle difference. Symmetries are simplifying assumptions that one can gradually relax and hope to dispense with altogether as one generalizes the results.

The Corichi Singh paper (actually posted in 2007) is:
http://arxiv.org/abs/0710.4543
It made explicit that it was talking about a version of LQC which is symmetry-reduced. By assuming the same things as the Friedmann model (homogeneity and isotropy), the many degrees of freedom are reduced to a few. In the Friedmann model the whole universe boils down to a couple of numbers like the scale factor and the density. This used to be the case in LQC.

LQC is no longer restricted to that perfectly symmetric uniform universe!

If you will take the trouble to read the post-2008 literature you will see that strict homogeneity and isotropy have been relaxed in many LQC papers. Do you want links?

The opposite of being limited by special assumptions is robust. It is a matter of degree--there can still be room for improvement.
Already the bounce is seen to be quite a robust result of LQC. And further removal of conditions, if successful, will make it more robust. I mentioned this before and gave examples of how it persisted when assumptions/parameters were varied. And also persists whether one uses analytical or numerical methods, or spinfoam models with their different mathematical formulation.

Maybe material in my previous posts on this thread suffices, logically, to qualify what was said in that instance in 2008?
 
  • #14
marcus said:
If you will take the trouble to read the post-2008 literature you will see that string homogeneity and isotropy have been relaxed in many LQC papers. Do you want links?

The opposite of being limited by special assumptions is robust. It is a matter of degree--there can still be room for improvement.
Already the bounce is seen to be quite a robust result of LQC. And further removal of conditions, if successful, will make it more robust.

Yes, I've skimmed the more recent papers, and it can become like the shell game. A blur of action so you can't keep track of where they might be hiding the artifacts this time :cool:.

Whether it is the holonomies, symmetries or indeed the massless scalar field that acts as a clock for the "background independent approach", LQC seems to have lots of places it could be concealing artifacts. That is smuggling in the inputs that are then celebrated as the outputs.

Bounce LQC could end up being the kosher ToE. An eternally bouncing cosmology might prove to be the truth of reality, no matter how unappealing that seems now. For this reason, I am trying to follow the logic of what is being claimed.

I was just rather taken aback that when I asked point-blank about what carries the equations through the "fly pass" of the near-planck scale without the information being scrambled, and what produced the repulsive gravity that kicks the equations out the other side, you could not put a finger on the mechanisms.

I am sure there are some simple explanations of the motivating ideas that will remove the sense of artifactual results.

For example, Bojowald says the singularity is avoided and the bounce enforced because discrete space provides limited energy storage.

But I can see a situation where the equations work up to the point that a volume operator still operates, yet the bounce is simply an artifact of pushing the operator into a realm where instead of asymmetrically breaking down as it would naturally (become one with the quantum foam), the time symmetry built into the equations allows you to smoothly reverse out the other side.

It is just the same as any mechanics which is locally reversible (due to the symmetry built into the equations) yet nature proves to be globally irreversible (so necessitating a supplementary body of global laws, like the statistical mechanics underpinning thermodynamics).

The bounce would seem to have the same status as time travel in GR. The equations may allow something, and so even seem to demand it. But it is indeed an artifact of the local symmetries embedded in the equations. That does not stop GR being a terrifically useful model, and QG may be as well. But I would be very cautious about the status of a generic result of a bounce cosmology (even though as yet I cannot spot where the artifacts are hidden in the current blur of a shell game).

Rather than being irritated, why not respond to this as an honest question? What actually is being claimed about the nature of the memory that carries things through the bounce and the repulsive energy that kicks them out the other side?

Is it still hand-waving (we don't actually have something to fill the blanks yet, this is what we are working towards identifying)? Or has LQC instead settled on a consensus ontology here?
 
  • #15
apeiron said:
I was just rather taken aback that when I asked point-blank about what carries the equations through the "fly pass" of the near-planck scale without the information being scrambled, and what produced the repulsive gravity that kicks the equations out the other side, you could not put a finger on the mechanisms.

Part of your post is a perfectly fair criticism of me. I cannot point to a single "mechanism" in the theory that causes the bounce. There are several equivalent versions of the theory, as I said, using different machinery. I think we are apt to fool ourselves if we look at just one cog in just one of the machines.

What I did reply with was my suspicion that something like the HUP (Heisenberg principle) applies to energy density. There is this max density estimated at 41% of Planck. Just a vague notion.

Incidentally, that was one of the new things that came in with Ashtekar's 2006 "revolution."

Before with the Bojo 2001-2005 version the bounce could occur at various different densities depending on other factors. A serious fault was found with that Bojo version of LQC. I forget what it was.

Anyway I freely admit to not understanding why there is this robust result. They vary all sorts of things including going all the way to spinfoam models (a big jump) and they still get bounce. They vary cosmological parameters (finite space, infinite space, cyclic, non-cyclic with only a single bounce, inflaton field, no inflaton...) and still get bounce.

There must be some underlying principle. Something like "Conventionally we say that quantum applies to the small, when you go down to microscopic nature the laws become quantum, well this is true for going to high density, not only for going very small."

And this principle is somehow embodied in LQG including the spinfoam models, and it may be wrong and it may be right. Nature can like it or not like it.

But I cannot formulate that principle in a clear rigorous way. So it remains vague for me.
That's all there is to it.

Have you looked at the Battisti Marciano article that claims to show that the bounce result extends to spinfoam LQG? That is a critical point, and they are relatively unknown. There might be some gap in the proof that invalidates it, which would be nice to know if true.
I invite you to have a look at Battisti Marciano spinfoam bounce article.
===================

About "memory". There was this squabble between Bojo and two of Ashtekar's group (Corichi Singh). Ashtekar did not take part. I don't read Bojo LQC as a rule. I try to keep my head clear by focusing on what Ashtekar says. And I try to avoid reading stuff that was written during a squabble as part of a nasty controversy. That is all 2006 stuff and water under the bridge.
 
  • #16
The very high density state of existence (geometry and matter in some puzzling way indistinguishable, the same thing) is what Ashtekar calls "the quantum regime" and we do not know if it has finite spatial extent or infinite. The bounce is of short duration (to the extent that duration is meaningful) in other words the "quantum regime" is brief. It should not be pictured as a point or small volume object since we do not know that. The only volume we can estimate is that of the observable portion of the universe---that indeed would come from a small volume (at such high density) if the theory is right.
But the whole "quantum regime" shebang would likely be far more extensive.

Am I remembering right? Does Ashtekar refer to this strange brief state of existence (the stuff of creation really) as the "quantum regime" or does he call it something else?

I checked and found where he calls it the "deep Planck regime". Surely there can be no one today who understands that brief state of existence (if indeed a bounce occurred.) It is something we can look forward to. Something fundamentally new to understand.

Loop geometry/gravity says that at this high density the stuff of existence has a tendency to expand (repellent gravity) rather than contract, and predicts that it experiences a brief episode of "superinflation" even more rapid than ordinary inflation.
Intuitively, with repellent gravity, the stuff would tend to uniformize itself, eliminating over/under densities as much as it can given the very short duration.

This superinflation episode is able to prime the scalar field that drives ordinary inflation so that it will last long enough to get 60 e-folds or whatever the required amount is.
That too is a recent result, from 2009 I think. I would not read anything in LQC before 2008, I guess, unless a current review article required it as source material.

Just some random thoughts. Here's a pointer to the Battisti Marciano:
http://arxiv.org/abs/1010.1258
Big Bounce in Dipole Cosmology
Marco Valerio Battisti, Antonino Marciano
(Submitted on 6 Oct 2010)
"We derive the cosmological Big Bounce scenario from the dipole approximation of Loop Quantum Gravity. We show that a non-singular evolution takes place for any matter field and that, by considering a massless scalar field as a relational clock for the dynamics, the semi-classical proprieties of an initial state are preserved on the other side of the bounce. This model thus enhances the relation between Loop Quantum Cosmology and the full theory."
 
  • #17
marcus said:
What I did reply with was my suspicion that something like the HUP (Heisenberg principle) applies to energy density. There is this max density estimated at 41% of Planck. Just a vague notion.

I did try to follow Battisti/Marciano and I also read up on the Corichi Singh cosmic recall controversy. Which probably explains any sense of confusion and shell games I might have. :cry:

There must be some underlying principle. Something like "Conventionally we say that quantum applies to the small, when you go down to microscopic nature the laws become quantum, well this is true for going to high density, not only for going very small."

This comment produced a double take. Surely it is a fact taken as read? The hot and the small are yoked together by HUP. Shrinking the scale factor raises the energy density. In a Planck space, even a single wavelength fluctuation has the Planck energy.

Anyway, what about Bojowald's attempt at a simple explanation in Scientific American 2008?

http://libserver.wlsh.tyc.edu.tw/sa/pdf.file/en/e081/e081p036.pdf

The way he explains repulsive gravity is...

Loop gravity suggests that the atomic structure of spacetime changes the nature of gravity at very high energy densities, making it repulsive. Imagine space as a sponge and mass and energy as water. The porous sponge can store water but only up to a certain amount. Fully soaked, it can absorb no more and instead repels water. Similarly, an atomic quantum space is porous and has a finite amount of storage space for energy. When energy densities become too large, repulsive forces come into play. The continuous space of general relativity, in contrast, can store a limitless amount of energy.

So gravity is curling space up, but there is a grain that is set above the Planck scale. The grain absorbs all the gravitational curvature it can and then for some reason tries to absorb even more (because it continues to want to contract to the Planck scale?) and this excess gets shunted sideways into hyperbolic or repulsive curvature. At this point we are in the inflation zone.

Sounds hokey, so perhaps you can supply a better interpretation than mine?

Then Bojowald makes some statements about the bounce...he seems uncertain that LQC actually demands it after all.

He admits the Planck scale probably results in a quantum foam that scrambles the memory of "the past" universe. So instead of a bounce being a certain prediction of loop cosmology...

Alternatively, before the big bounce the universe may have been in an almost unimaginable quantum state, not yet spacelike, when something triggered the big bounce and the formation of the atoms of spacetime. Which of these two alternatives occurred depends on further details that physicists are still working

...an alternative that I much prefer as it is my own view of the big bang. That is, a universe self-organising its way out of a vaguer, formless state.

This also avoids any singularities. Infinities are replaced neatly by maximal uncertainties.

So adopting the sponge analogy, the difference is that the sponge dissolves into a disconnected, unoriented, foam of fluctuations at below the scale of the atoms of volume. The curvature is so extreme due to the energy scale that it breaks up the holonomies or gravitational connections.

As best I can make out, the argument for instead a bounce could boil down to the use of difference equations. This would allow the hop across the face of the singularity at the minimum volume scale - a discrete jump to get to the other side where things turn inside out according to the equations?

So many unanswered questions still...
 

