Some time ago there was a similar thread:
https://www.physicsforums.com/threa...me-dilation-and-equivalence-principle.929838/
What I want to discuss is similar but not the same, and I would like to frame my question so that it hopefully won't go sideways the way the cited thread did.
So we have two labs.
The first lab is on the surface of the Earth; its time dilation corresponds to the gravitational potential at the Earth's surface, and gravity is 1 g.
The second lab is on a spaceship, say 1 light year away, with no star or planet anywhere nearby. The ship accelerates with a proper acceleration of 1 g, and its speed relative to Earth is quite low, say 0.00001 % of the speed of light, so the time dilation relative to Earth caused by its speed is close to zero. As I understand relativity, acceleration itself causes no time dilation, so all together there is no time dilation on the spaceship.
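Just to quantify what I mean by "close to zero", here is a rough back-of-the-envelope check (my own numbers, using the standard weak-field formulas; the exact constants don't matter much):

```python
# Rough sketch: compare the velocity-based time dilation at v = 0.00001 % of c
# with the gravitational time dilation at the Earth's surface (weak-field approximation).

G = 6.674e-11      # gravitational constant, m^3 kg^-1 s^-2
M = 5.972e24       # mass of the Earth, kg
R = 6.371e6        # radius of the Earth, m
c = 2.998e8        # speed of light, m/s

v = 1e-7 * c       # 0.00001 % of c, about 30 m/s

velocity_effect = v**2 / (2 * c**2)      # fractional slowdown from speed, ~5e-15
gravity_effect = G * M / (R * c**2)      # fractional slowdown from Earth's potential, ~7e-10

print(f"velocity effect:      {velocity_effect:.1e}")
print(f"gravitational effect: {gravity_effect:.1e}")
```

So the speed contribution really is about five orders of magnitude smaller than the gravitational one, which is why I ignore it below.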
Now both labs have equipment (an accelerator?) that can produce muons. Muons are created in both labs, and they decay in time t1 in the first lab (on Earth) and in time t2 in the second lab (on the spaceship). Based on my understanding of relativity, gravitational time dilation on Earth very slightly slows down the decay of the muons on Earth, so t1 is larger than t2 on the spaceship, where there is no time dilation and the muons decay faster than on Earth.
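To give a feel for how small the difference I am talking about would be, here is a rough estimate using the standard muon mean lifetime of about 2.2 microseconds (again just my own back-of-the-envelope numbers, reusing the gravitational factor from the sketch above):

```python
# Sketch of the size of the difference the argument above would imply.

muon_lifetime = 2.197e-6     # mean muon lifetime in its rest frame, s
gravity_effect = 7e-10       # fractional slowdown from the previous sketch

difference = muon_lifetime * gravity_effect
print(f"implied difference between t1 and t2: {difference:.1e} s")   # ~1.5e-15 s
```

Tiny, but in principle a difference is a difference, which is what my question is about.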
Just to make the example precise: the muons don't jump up and down in the ship, so I will simply ignore the front-to-rear potential (clock rate) difference in the spaceship.
So finally, based on the equivalence principle there should be no measurable difference between proper acceleration and gravity, but in the described example a scientist who knows how long a muon takes to decay on Earth can tell whether the lab is on Earth or inside the spaceship.
What is wrong with my claim that you can use muons to measure the difference between acceleration and gravity?