Mu naught
Probably not, but I'm too dumb to figure it out. I'll describe the scenario as clearly and concisely as I can:
Suppose a space probe is sent out into space moving away from Earth at a relatively high velocity, but not high enough that there are any significant time dilation effects.
Without worrying about the details, suppose on this probe there is a video camera which is focused on a clock that is set to keep Earth time. The probe sends back an uninterrupted video feed of this clock, so someone on Earth can continually observe the time the clock reads on board the space probe.
As the probe travels farther out into space, the radio signal it transmits back to Earth takes longer and longer to arrive, causing a greater and greater delay between the probe and Earth. After traveling for a long time, suppose the probe is 4 light-hours away from Earth.
The observer on Earth must see the clock as running 4 hours behind, because that is the signal delay between him and the probe. Yet suppose he has been watching the clock the whole time the probe has been traveling out into space. When it began its journey the clock read the correct time, but now he sees it running 4 hours behind.
How can this be? If the clock were slowing down on the spacecraft, he would see it ticking more slowly in the video feed. But the clock isn't slowing down; it's keeping perfect time with Earth. And the video feed is transmitted at a constant rate, say 30 transmissions per second. Where does the time delay come from?
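A quick numerical sketch of where the lag accumulates (the probe speed here is an assumed, purely illustrative value): because the probe is receding, each frame travels a slightly longer path than the one before it, so frames arrive spaced a little more than 1/30 s apart. Those tiny per-frame delays add up to exactly the light-travel distance divided by c.

```python
C = 299_792_458.0   # speed of light, m/s
v = 0.001 * C       # assumed probe speed (non-relativistic, illustrative)
fps = 30.0          # frame transmission rate

# A frame emitted at probe time t (probe at distance v*t) arrives on
# Earth at t + v*t/C. So consecutive frames arrive spaced
# (1 + v/C)/fps apart: slightly more slowly than they were sent
# (the classical Doppler effect).
emit_interval = 1.0 / fps
arrive_interval = (1.0 + v / C) * emit_interval

# By the time the probe is 4 light-hours out, the probe has been
# traveling for t_emit = distance / v, and the accumulated lag is
# t_emit * (v/C) = distance / C = 4 hours.
four_light_hours = 4 * 3600.0 * C          # distance in metres
t_emit = four_light_hours / v              # probe time when that frame is sent
lag = t_emit * (arrive_interval / emit_interval - 1.0)

print(lag / 3600.0)  # -> 4.0
```

So the feed never skips or slows visibly; the received frame rate is just a hair under 30 fps, and the shortfall compounds into the full 4-hour offset.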