Mike2 — post #36:
SpaceTiger said:
Nah, the big rip won't occur with a cosmological constant, it only occurs with phantom energy, as hellfire said. Basically, it happens because the dark energy density is increasing with time (rather than being constant, as with [itex]\Lambda[/itex]), eventually becoming larger than the binding energy of the universe's constituents (galaxies, atoms, etc.) and tearing them apart.
This issue is a stumbling block for me. I must be missing something fundamental, and I can't proceed until it's resolved. I don't seem to be making my question very clear, so let me try again.
Even before the Friedmann-Robertson-Walker metric or the application of GR, the expansion rate was measured by the redshift of distant galaxies. Eventually, this recession rate was pinned down to H = 72 km/(sec*Mpc) - the result of direct measurement. This means that objects 1 megaparsec away are receding at 72 km/sec, and objects 2 Mpc away are receding at 144 km/sec. Ultimately there is a distance at which the recession speed equals the speed of light - the Hubble sphere. Am I right so far?
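The arithmetic above can be sketched in a few lines. This is only an illustrative back-of-the-envelope check, assuming the H = 72 km/(sec*Mpc) value quoted in the post; the function and variable names are mine, not from any particular cosmology library.

```python
# Illustrative sketch of the linear Hubble law v = H0 * d,
# using the H0 = 72 km/s/Mpc value quoted in the post.

C_KM_S = 299_792.458   # speed of light in km/s
H0 = 72.0              # assumed Hubble constant, km/s per Mpc

def recession_velocity(distance_mpc):
    """Recession velocity in km/s for a proper distance in Mpc (v = H0 * d)."""
    return H0 * distance_mpc

# Hubble sphere: the distance at which the recession speed equals c,
# i.e. d = c / H0.
hubble_radius_mpc = C_KM_S / H0

print(recession_velocity(1.0))   # 72.0 km/s at 1 Mpc
print(recession_velocity(2.0))   # 144.0 km/s at 2 Mpc
print(round(hubble_radius_mpc))  # roughly 4164 Mpc
```

With this value of H0 the Hubble sphere sits at roughly 4200 Mpc, which is the sense in which "there is a distance receding at the speed of light."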
Then came the observation that the rate of expansion is not constant: more distant supernovae were dimmer than expected because they had receded faster than a linear Hubble law would predict. This means that the recession rate of a given point in space changes over time. Is this right so far?
OK, then doesn't that mean the Hubble rate of 72 km/(sec*Mpc) is not constant but has changed?