Troodon Roar
I am confused about time dilation. I understand that a common pedagogical device is a light clock, in which a pulse of light bounces back and forth between two mirrors. I understand that when a vehicle carrying such a contraption moves relative to an external observer, the light must, from that observer's perspective, traverse a longer, diagonal path; since it cannot exceed speed c, each tick of the clock takes longer. I understand all of that.
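For concreteness, the light-clock geometry described above can be written out. This is the standard textbook derivation, assuming a clock arm of length L oriented perpendicular to the motion and a vehicle speed v:

```latex
% Tick of the clock in its own rest frame (pulse travels 2L at speed c):
t_0 = \frac{2L}{c}

% Seen externally, during half a tick t/2 the pulse traverses the hypotenuse
% of a right triangle with legs L (across) and vt/2 (along), still at speed c:
\left(\frac{ct}{2}\right)^2 = L^2 + \left(\frac{vt}{2}\right)^2

% Solving for t gives the dilated tick:
t = \frac{2L/c}{\sqrt{1 - v^2/c^2}} = \gamma\, t_0,
\qquad
\gamma = \frac{1}{\sqrt{1 - v^2/c^2}} \geq 1
```

So, as stated above, the external observer measures each tick of the moving clock to take γ times longer than a tick of an identical clock at rest.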
The issue I'm having trouble comprehending is that, somehow, taking more time between ticks would result in less time elapsing for the moving observer than for the stationary observer. And this has nothing to do with the principle of relativity or the twins paradox or anything like that, as I know that the acceleration of the moving twin furnishes the answer to that particular paradox.
What I am confused about is that, since the moving observer's clock has more time between ticks, wouldn't MORE time, not less, elapse for the moving observer than for the stationary one? I cannot wrap my mind around how taking a *longer time* to tick could possibly translate into less time, overall, elapsed. Is the number of ticks so drastically reduced that it overcompensates, perhaps? Or is ticking in one direction a lot more time-consuming than ticking in the opposite direction, and the shorter half of the tick overcompensates? Or something else? What is it that I am missing here?
Basically, to sum up, it seems to me that the reverse of what is said to happen ought to happen in time dilation: the stationary observer ought to have less time elapse for them, and the moving observer ought to have more time elapse for them. Any explanations? Thank you.