I was watching QI ( http://www.youtube.com/watch?v=gU7McFm_cKQ&t=7m09s ), and Prof. Brian Cox said that time runs roughly 38,000 ns per day faster on GPS satellites than on the ground.
From that he concluded that, since light travels roughly 1 foot per nanosecond, GPS would accumulate a positional error of about 38,000 feet per day if relativistic effects weren't compensated for.
The same conclusion is reached here: http://www.astronomy.ohio-state.edu/~pogge/Ast162/Unit5/gps.html
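For reference, here is the arithmetic behind that claim as a quick Python sketch. The 38,000 ns/day figure is the one quoted above; the conversion just multiplies it by the speed of light:

```python
# Back-of-the-envelope check of the "naive" error estimate:
# an uncorrected clock offset of ~38,000 ns/day, converted to
# distance at the speed of light.

C = 299_792_458            # speed of light, m/s
DRIFT_NS_PER_DAY = 38_000  # relativistic clock drift quoted above

drift_seconds = DRIFT_NS_PER_DAY * 1e-9
error_m = C * drift_seconds            # range error after one day
error_ft = error_m / 0.3048

print(f"{error_m / 1000:.1f} km per day (~{error_ft:,.0f} ft)")
# -> roughly 11.4 km per day, i.e. ~37,400 ft, consistent with the
#    "1 foot per nanosecond" rule of thumb.
```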
***
This kind of reasoning seems incorrect to me, because the GPS satellites all sit in a very similar gravitational field, and the receiver's ground clock is constantly reset to follow the more accurate time signals from the satellites ( http://electronics.howstuffworks.com/gadgets/travel/gps3.htm ), so the absolute time on the ground should not matter.
As a result, the positional error of GPS would be much lower than 38,000 feet per day. What do you think?
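To make my objection concrete, here is the same arithmetic applied only to the interval between receiver clock resets rather than to a whole day. The one-second resync interval is just an assumption for illustration, not something stated in the linked article:

```python
# Sketch of the counterargument: if the receiver clock is resynchronized
# to the satellite signal every `resync_s` seconds (an assumed interval,
# not taken from the sources above), only the drift accumulated *between*
# resyncs would show up as a ranging error.

C_FT_PER_NS = 0.983        # light travels ~1 foot per nanosecond
DRIFT_NS_PER_DAY = 38_000
SECONDS_PER_DAY = 86_400

resync_s = 1.0             # assumed resync interval: 1 second
drift_ns = DRIFT_NS_PER_DAY * resync_s / SECONDS_PER_DAY
error_ft = drift_ns * C_FT_PER_NS

print(f"drift between resyncs: {drift_ns:.2f} ns -> ~{error_ft:.2f} ft")
# -> about 0.44 ns, i.e. well under a foot, rather than 38,000 ft/day.
```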