NanakiXIII
It seems quite common to more or less equate gravitational redshift to gravitational time dilation. For example, the two might be explained as follows.
We have two observers: A standing on the surface of the Earth, B at the top of a tower. A sends light up to B at some frequency; B receives the light at a lower frequency - the light has been redshifted, supposedly because it loses energy climbing out of the gravitational field. Now B, after comparing the frequencies (say he has an identical light source), concludes that his units of time (derived from the frequency of his own source) are shorter than those of A (derived from the frequency of the light coming up). Thus there is time dilation.
This seems like a silly argument to me. The redshift of the light feels more like a Doppler effect than like something inherent to spacetime. If, classically, someone produces a sound while moving away from me, I hear it at a lower frequency than the one they emit it at. That doesn't mean I think their clock must be running slow; the sound is just Doppler-shifted. In fact, the very fact that you know the signal has been Doppler-shifted means it is not equivalent to time dilation: if your units of time really were adjusted by that frequency ratio, you would measure the same frequency as the person sending out the signal.
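To make that rescaling point concrete, here is a minimal sketch of the classical Doppler shift for a receding sound source. The speeds and frequency are illustrative assumptions, not values from the post:

```python
# Classical Doppler shift for a source receding from a stationary observer.
# All numbers below are assumed for illustration.
v_sound = 343.0    # m/s, speed of sound in air
v_source = 30.0    # m/s, source receding from the observer
f_emitted = 440.0  # Hz, frequency at the source

# Stationary observer, source moving away:
f_received = f_emitted * v_sound / (v_sound + v_source)

# If the observer rescaled their time unit by the received/emitted
# frequency ratio, they would "recover" the emitted frequency - the
# shift lives in the signal, not in anyone's clock.
ratio = f_received / f_emitted
f_rescaled = f_received / ratio

print(f_received)  # lower than 440 Hz
print(f_rescaled)  # back to 440 Hz
```

The point of the last two lines is exactly the argument above: a shift you can undo by rescaling your time unit carries no information about clock rates.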
The gravitational redshift seems like the same kind of thing. Someone sends out a signal at some frequency, and on its way to you the frequency is lowered. Then you receive it and measure the lowered frequency. It has nothing to do with time dilation; the light simply lost energy. The only way you could call this time dilation is if you somehow demanded that the signal was not changed along the way.
This is all in contrast to time dilation in SR, where it follows from the fundamental assumption that light moves at the same speed for all observers - which has nothing to do with a signal being altered along the way; the Doppler effect is a separate concept with a separate effect on things.
All right, so the above is up for criticism. If I'm wrong about any of it, I'd be happy to hear it. However, it seems pretty clear cut, and it is not what is puzzling me at the moment. The problem I'm having is how to reconcile all that with the fact that there does seem to be a time dilation equation in General Relativity, which follows directly from the Schwarzschild metric. You can simply compute the interval between two events at the same spatial location in a gravitational field, and the metric relates the proper time interval to the time interval as seen by an observer at a different distance from the mass:
[tex]\tau = t \sqrt{1-\frac{r_s}{r}}[/tex]
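Here is a rough numerical sketch of that factor for the two observers in the example above. The Earth values and the 100 m tower height are assumptions for illustration:

```python
import math

# Schwarzschild time-dilation factor sqrt(1 - r_s/r), evaluated near
# the Earth. Constants are approximate; the tower height is assumed.
G = 6.674e-11          # m^3 kg^-1 s^-2, gravitational constant
c = 2.998e8            # m/s, speed of light
M = 5.972e24           # kg, mass of the Earth
r_s = 2 * G * M / c**2 # Schwarzschild radius of the Earth, ~9 mm

def dilation_factor(r):
    return math.sqrt(1 - r_s / r)

r_ground = 6.371e6        # m, Earth's surface (observer A)
r_tower = r_ground + 100  # m, 100 m tower (observer B, assumed height)

# Ratio of A's clock rate to B's clock rate:
rate_ratio = dilation_factor(r_ground) / dilation_factor(r_tower)
print(rate_ratio)  # slightly less than 1: the lower clock ticks slower
```

The effect is of order 10^-14 for a 100 m height difference, which is why it takes experiments like Pound-Rebka to see it at all.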
This suggests that the time dilation must also apply to things other than light. For example, if your signal consists of light pulses emitted at some interval, then that pulse rate would need to be red-shifted as well, even though the repetition rate has nothing to do with the energy carried by the signal. This suggests a true time dilation.
But we can't have it both ways. If there is this true time dilation, then light would have to be redshifted both by this effect and by the energy it loses. In that case the redshift equation commonly used (which is just the dilation equation above cast into a different form) would be wrong - it would be missing the loss-of-energy part.
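For what it's worth, a quick numerical check (same assumed Earth and tower values as before) shows that the exact metric factor and the first-order "photon loses energy g*h" estimate give the same number, rather than stacking as two separate effects:

```python
import math

# Compare the exact Schwarzschild redshift factor with the first-order
# "energy loss" estimate 1 - g*h/c^2. Values are approximate; the 100 m
# tower height is an assumption for illustration.
G, c, M = 6.674e-11, 2.998e8, 5.972e24
r_s = 2 * G * M / c**2
r_A = 6.371e6   # m, ground observer A (assumed)
h = 100.0       # m, tower height (assumed)
r_B = r_A + h   # m, tower observer B

# Exact ratio of received to emitted frequency, from the metric:
exact = math.sqrt((1 - r_s / r_A) / (1 - r_s / r_B))

# First-order estimate from the photon "losing energy" over height h:
g = G * M / r_A**2
approx = 1 - g * h / c**2

print(exact, approx)  # both slightly below 1, agreeing to first order
```

The agreement to first order is at least consistent with the "cast into a different form" remark above: the two descriptions seem to be the same shift counted once, not two shifts added together.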
So now I'm stuck. I don't know how to reconcile all of this, or where the error in my thinking is. I welcome any insight you could offer.