gatztopher
- TL;DR: The speed of light looks the same no matter how fast you're going, so the closer you get to the speed of light, the slower time passes for you, keeping the light you see moving at the speed it's supposed to.
I thought of this description recently and I think it's pretty intuitive, but I've gotten some side eye when telling it to friends and family (maybe because relativity is screwy, maybe because I'm confused, maybe both?), so I want some confirmation that it's reasonable. Here goes:
If you're in a car and another car is going faster than you, its speed from your point of view is its speed minus yours. So if you're going 20 mph and it's going 25 mph, it looks like it's only going 5 mph. The speed of light, however, looks the same no matter how fast you're going. So what happens if you're going the speed of light minus 5 mph? Instead of the light looking like it's going 5 mph, the passage of time slows down for you until you see the light going the speed it's supposed to. That's time dilation.
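To put rough numbers on this, here's a small sketch (my own illustration, not part of the question itself) using the standard relativistic velocity-addition formula, which is what replaces the simple "subtract the speeds" rule, plus the usual Lorentz time-dilation factor:

```python
import math

C = 299_792_458.0   # speed of light, m/s
MPH = 0.44704       # one mile per hour, in m/s

def relative_speed(u, v):
    """Speed of an object moving at u, as seen by an observer moving at v
    (both along the same line). Replaces the naive u - v."""
    return (u - v) / (1 - u * v / C**2)

def gamma(v):
    """Time-dilation factor: a clock moving at v ticks 1/gamma as fast."""
    return 1 / math.sqrt(1 - (v / C)**2)

# Car speeds: the relativistic correction is absurdly tiny,
# so this prints essentially the familiar 5 mph.
print(relative_speed(25 * MPH, 20 * MPH) / MPH)

# A light beam seen from a ship doing c minus 5 mph: still essentially c,
# not 5 mph.
print(relative_speed(C, C - 5 * MPH))

# At c minus 5 mph, your clock runs thousands of times slower
# than a stationary one.
print(gamma(C - 5 * MPH))
```

The first two calls show both halves of the story: at everyday speeds the formula collapses to ordinary subtraction, while for light the result is the speed of light again no matter what the observer is doing.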
What do you all think, does this explanation track?