wabbit
Gold Member
This is a follow-up to another thread (https://www.physicsforums.com/threads/something-about-calculating-the-age-of-the-universe.807250/), but I post it as a separate thread since it is not clear to me (a) whether this is actually correct, and (b) assuming it is, whether it actually provides a meaningful result when applied to the data - these are my two questions here.
Models based on GR, in combination with observations, are what is actually used for serious estimates of the age of the universe, but I was wondering: can we also estimate, with less precision of course, the age of the universe from observational data alone, without referring to GR or FRW? The purpose isn't really to do without a model as such; it is more to do it with only elementary tools, so no ODEs either, and the integral below might perhaps be replaced with a finite sum too.
It seems to me that, barring egregious errors below, we can - or at least we can estimate how long ago the light we see from ancient galaxies was emitted.
If we assume that the Hubble law is due to expansion, and that recession velocities are due to that alone, the observational data from supernova studies give us a relation between:
- velocities, or more precisely redshifts, which directly give the scale factor at emission ## a(t)=\frac{\lambda_{emitted}}{\lambda_{observed}} ## (t being an unknown)
- distances derived from the apparent luminosity ## L ## of standard candles, ## d=\sqrt{\frac{L_{standard}}{L_{observed}}} ## (a toy numerical version of this conversion is sketched just below)
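As a toy illustration of those two bullet points (Python with invented numbers, not real supernova data, and ignoring any overall calibration constant in the distance), the conversion from raw observables to ##(a, d)## pairs would look roughly like this:

```python
import numpy as np

# Invented observations for a few standard candles: the observed wavelength
# of a line emitted at 656.3 nm, and the observed luminosity as a fraction
# of the candle's known intrinsic (standard) luminosity.
lambda_emitted = 656.3                                    # nm, rest frame
lambda_observed = np.array([690.0, 750.0, 850.0, 940.0])  # nm, made up
lum_ratio = np.array([4.0e-2, 1.1e-2, 3.0e-3, 1.3e-3])    # L_observed / L_standard, made up

a = lambda_emitted / lambda_observed   # scale factor at emission, with a(now) = 1
d = np.sqrt(1.0 / lum_ratio)           # distance, up to a calibration constant

for ai, di in zip(a, d):
    print(f"a = {ai:.3f}   d = {di:6.1f} (arbitrary units)")
```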
The distance measured this way is, if I am not mistaken, $$ d=c\int_{t_{emitted}}^{t_{now}}\frac{dt}{a(t)}$$ so that, after smoothing the observed scale-distance relation, we have approximately, over a "small" interval,
$$ \Delta d=\frac{c}{a} \Delta t, \text{ i.e. } \Delta t=\frac{a}{c}\Delta d $$
This gives us a "model-free" estimate of the time elapsed between different redshifts, and summing these intervals gives the time elapsed since the light left the most distant galaxy in which a standard candle can be seen.
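A minimal numerical sketch of that sum, assuming the ##(a, d)## pairs have already been smoothed, sorted from nearby to distant, and calibrated so that ##d## is in Mpc (all the numbers below are invented, so the output is only illustrative):

```python
import numpy as np

C_MPC_PER_GYR = 306.6  # speed of light, roughly, in Mpc per Gyr

# Invented, smoothed data: scale factor at emission (a = 1 today) and
# distance in Mpc, ordered from nearby to far away.
a = np.array([1.00, 0.95, 0.90, 0.85, 0.80, 0.75, 0.70])
d = np.array([0.0, 210.0, 440.0, 690.0, 960.0, 1260.0, 1590.0])

# Delta t = (a/c) * Delta d on each interval, using the midpoint value of a.
a_mid = 0.5 * (a[:-1] + a[1:])
dt = a_mid * np.diff(d) / C_MPC_PER_GYR   # Gyr per interval

lookback = np.cumsum(dt)                  # elapsed time back to each bin
print(f"time back to the most distant bin: {lookback[-1]:.2f} Gyr")
```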
We can also extrapolate the relation between distance and redshift beyond the observed range, and if we extrapolate back to zero scale factor, the resulting sum is an estimate of the age of the universe.
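And a sketch of that extrapolation, on the same invented data: fit ##d(a)## over the observed range, extend the fit down to ##a = 0##, and accumulate ##(a/c)\,\Delta d## along the extrapolated curve. The choice of fitting function is arbitrary, which is exactly the unreliability mentioned just below.

```python
import numpy as np

C_MPC_PER_GYR = 306.6  # speed of light, roughly, in Mpc per Gyr

# Same invented, smoothed data as above.
a_obs = np.array([1.00, 0.95, 0.90, 0.85, 0.80, 0.75, 0.70])
d_obs = np.array([0.0, 210.0, 440.0, 690.0, 960.0, 1260.0, 1590.0])

# Fit d as a low-order polynomial in a (an arbitrary choice) and
# extrapolate the fit over the full range 0 <= a <= 1.
coeffs = np.polyfit(a_obs, d_obs, deg=2)
a_grid = np.linspace(1.0, 0.0, 2001)
d_grid = np.polyval(coeffs, a_grid)

# "Age" estimate: sum of (a/c) * Delta d from a = 1 all the way back to a = 0.
a_mid = 0.5 * (a_grid[:-1] + a_grid[1:])
age = np.sum(a_mid * np.diff(d_grid)) / C_MPC_PER_GYR
print(f"extrapolated age: {age:.1f} Gyr (meaningless here, since the data are invented)")
```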
Ignoring GR isn't very smart of course, and extrapolation is notoriously unreliable. I am obviously not claiming this as a substitute for better, more sophisticated approaches. Still, I find it interesting to see what can be obtained in this way. However, I do not have the actual data, so I don't know what the result might be.
So, does this work?

Edit: the post above originally used S(t); I realized it is better here to forget about stretch factors and use a(t) instead.