jeffbarrington
I have marked this as high school level, although I am studying an undergraduate general relativity course, and just want to get some basics right.
Whenever I look for a 'cheap and dirty' calculation of the age of the universe, assuming a Hubble constant that does not change with time, I am met with:
t = d/v
where I am meant to believe that t is the age of the universe, d is the separation between two galaxies at some great distance apart, and v is their current speed of separation, given by v = H_0 d. However, they haven't always been moving apart at this speed: up until now, their separation was smaller, so they were receding from each other at a speed less than H_0 d. Why do these 'derivations' fail to point out this flaw, and what is the workaround? For concreteness, the estimate I keep seeing is sketched below.
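Here is a minimal sketch of that estimate, assuming H_0 = 70 km/s/Mpc (a commonly quoted round value; the exact figure is not the point). It shows why the answer is independent of which pair of galaxies you pick: d cancels, leaving t = 1/H_0, the Hubble time.

```python
# Minimal sketch of the 'cheap and dirty' age estimate t = d/v,
# assuming H_0 = 70 km/s/Mpc (a commonly quoted round value).

km_per_Mpc = 3.0857e19   # kilometres in one megaparsec
s_per_Gyr = 3.1557e16    # seconds in one gigayear

H0 = 70.0 / km_per_Mpc   # Hubble constant in s^-1

# For any separation d, v = H0 * d, so t = d / v = 1 / H0.
# The separation d cancels, which is why the estimate does not
# depend on which pair of galaxies you choose.
t_hubble = 1.0 / H0      # Hubble time in seconds

print(f"Hubble time: {t_hubble / s_per_Gyr:.1f} Gyr")  # ~14.0 Gyr
```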
Thanks