How does a GPS receiver calculate time delay?

In summary, the GPS receiver uses signals from multiple satellites to calculate its own location and time. It does this by comparing the time encoded in each satellite's signal to its own internal clock, and adjusting that clock until the ranges implied by all the signals intersect at a single point. The more signals available, the more accurate the fix will be.
  • #1
aadarsh9
The GPS satellite has an atomic clock which synchronizes with the receiver. But when the receiver gets the time, it is delayed because the signal had to travel. So when it's 08h 00min 05s on the satellite, the receiver gets maybe 08h 00min 03s. The time of travel of the signal is 2 s, but how will the receiver calculate this time difference without knowing the real time?
 
  • #2
1st: The travel time is much less than 2 s.
2nd: The receiver can figure out where it is (that's the whole point of the GPS system, isn't it?) and can correct for the delay.
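
As a rough sanity check (these numbers are my own assumptions, not from the thread: a nominal GPS orbit altitude of about 20,200 km, a mean Earth radius of 6,371 km, and straight-line propagation at the speed of light), the one-way travel time comes out at tens of milliseconds:

```python
# Rough one-way signal travel time from a GPS satellite (assumed values).
C = 299_792_458.0          # speed of light, m/s
ALTITUDE_M = 20_200e3      # nominal GPS orbit altitude above Earth's surface, m
EARTH_RADIUS_M = 6_371e3   # mean Earth radius, m

# Satellite directly overhead: range equals the altitude.
t_zenith = ALTITUDE_M / C

# Satellite near the horizon: range is roughly the slant distance
# sqrt((R + h)^2 - R^2) along a tangent line from the receiver.
slant = ((EARTH_RADIUS_M + ALTITUDE_M) ** 2 - EARTH_RADIUS_M ** 2) ** 0.5
t_horizon = slant / C

print(f"overhead: {t_zenith * 1e3:.0f} ms, near horizon: {t_horizon * 1e3:.0f} ms")
# prints roughly: overhead: 67 ms, near horizon: 86 ms
```

So the delay due to travel is roughly 70–90 ms, not seconds.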
 
  • #3
Well, to figure out where you are requires the time of travel of the signal first. If the GPS receiver worked two-way, I would understand how it could work: the satellite could bounce the receiver's signal back, like sonar does, to calculate the time difference. But the GPS unit is only a receiver, and I can't figure out how it can find the travel time of the signal. The only thing I can say is that whoever invented it is a genius. Can anyone help me understand it, please?
 
  • #4
The GPS satellites have their own clocks that they reference when they send a signal. This signal includes the time it was sent and the position of the satellite. The receiver uses signals from multiple satellites to calculate where it is; it does not communicate back to the satellite. (A rough sketch of what one broadcast carries follows below.)

See here: http://www.physics.org/article-questions.asp?id=55
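
As a hedged illustration of the idea (the names SatelliteBroadcast and pseudorange are made up for this sketch; the real navigation message has a much richer format):

```python
from dataclasses import dataclass

C = 299_792_458.0  # speed of light, m/s

@dataclass
class SatelliteBroadcast:
    """Stripped-down, hypothetical view of what one satellite signal tells the receiver."""
    sent_at: float                         # satellite clock reading at transmission (s)
    position: tuple[float, float, float]   # where the satellite was when it transmitted (m)

def pseudorange(broadcast: SatelliteBroadcast, received_at: float) -> float:
    """Apparent distance based on the receiver's (possibly wrong) clock reading.
    It is a *pseudo*range because the receiver's own clock error is still mixed in."""
    return C * (received_at - broadcast.sent_at)

# Example (made-up numbers): a signal stamped 10.000 s that arrives at 10.072 s
# on the receiver's own clock looks like it travelled about 21,600 km.
b = SatelliteBroadcast(sent_at=10.000, position=(15e6, 10e6, 20e6))
print(pseudorange(b, received_at=10.072) / 1e3, "km")
```

With several such broadcasts, the receiver has enough equations to solve for both its position and its own clock error, as the later posts explain.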
 
  • #5
@Drakkith
I knew that, but how does it calculate the time of travel of the signal if the GPS receiver only receives signals?
 
  • #7
In a nutshell, the receiver looks at incoming signals from four or more satellites and gauges its own inaccuracy. In other words, there is only one value for the "current time" that the receiver can use. The correct time value will cause all of the signals that the receiver is receiving to align at a single point in space. That time value is the time value held by the atomic clocks in all of the satellites. So the receiver sets its clock to that time value, and it then has the same time value that all the atomic clocks in all of the satellites have.

I still haven't grasped the concept... I've read a lot of material on the internet, and it's either too complicated or too childish!
 
  • #8
Going over the article as a whole may help, especially the 2nd page, where it explains trilateration. Read through the whole article a few times, and if you still have problems, come back.
 
  • #9
The delay cannot be calculated if the satellite is not perfectly synchronised with the receiver. The article states that both start transmitting the PRN at midnight, but their time needs to be synchronised first. The time is sent via radio waves, so there will be a delay and they will not be synchronised. Actually, they will never be synchronised, because there will be a delay on any information the satellite sends.
 
  • #10
That description is incorrect. The "long, digital pattern" mentioned there is the P/Y signal used only by military GPS receivers; this sequence is secret and cannot be used by civilian receivers. There is also the C/A signal, which uses a very short digital sequence that repeats every millisecond. It is not secret and it is used by civilian receivers. Because it is so short, a receiver can "lock into" it fairly quickly. As soon as lock-in is achieved, the receiver can read the navigation message from the signal. The message is composed of a number of frames, and each frame contains the satellite's local time at the moment the frame is sent. The difference between the receiver's local time and the satellite's local time is the delay.

This delay, however, carries an unknown error, which includes the error in the receiver's local time. The receiver obtains the position and time signals from multiple satellites and solves a system of equations, which reveals the local time error so the local clock can be adjusted. The process is then repeated many, many times, which minimizes the error in time (and position).
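
A minimal sketch of the kind of equation-solving described above, assuming four satellites, simulated pseudoranges, and a hand-rolled Gauss-Newton iteration (real receivers also correct for satellite clock errors, atmospheric delays, relativity, Earth rotation, and more):

```python
import numpy as np

C = 299_792_458.0  # speed of light, m/s

def solve_position_and_clock(sat_pos, pseudoranges, iters=10):
    """Solve rho_i = ||sat_i - r|| + c*b for receiver position r and clock bias b.

    sat_pos: (N, 3) satellite positions in metres; pseudoranges: (N,) in metres.
    Returns (position, clock_bias_seconds). Needs N >= 4.
    """
    x = np.zeros(4)                          # start at Earth's centre, zero bias
    for _ in range(iters):
        diff = sat_pos - x[:3]               # receiver-to-satellite vectors
        ranges = np.linalg.norm(diff, axis=1)
        predicted = ranges + C * x[3]        # modelled pseudoranges
        residual = pseudoranges - predicted
        # Jacobian: d(rho)/d(position) = -(unit vector), d(rho)/d(bias) = c
        J = np.hstack([-diff / ranges[:, None], np.full((len(ranges), 1), C)])
        x += np.linalg.lstsq(J, residual, rcond=None)[0]
    return x[:3], x[3]

# Made-up test case: a receiver whose clock is 1 ms fast.
sats = np.array([[15e6, 10e6, 20e6], [-12e6, 18e6, 15e6],
                 [20e6, -5e6, 16e6], [5e6, -20e6, 18e6]])
true_pos = np.array([1_113e3, -4_842e3, 3_985e3])
rho = np.linalg.norm(sats - true_pos, axis=1) + C * 1e-3
pos, bias = solve_position_and_clock(sats, rho)   # pos ≈ true_pos, bias ≈ 1e-3 s
```

The example uses exactly four satellites; with more satellites the same least-squares step simply becomes over-determined, which is what lets the receiver keep refining both its position and its clock.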
 
  • #11
The receiver compares the time encoded in the satellite's signal to its own internal clock to compute a distance to the satellite.

If the receiver gets a signal from 1 satellite, it can determine that it is somewhere on the surface of a sphere of a given radius. It also knows that this measurement comes with an error of unknown magnitude, so it's pretty useless.

If the receiver gets 2 signals, it can compute 2 spheres which will intersect in a circle. It now knows it is somewhere on that circle but still knows nothing about the error of the measurement.

When 3 signals are present, the location can be narrowed to a pair of points. One of these 2 points is usually not near the surface of the Earth and so is discarded as not plausible. If the receiver had a perfect clock, that would be all that was necessary.

A 4th signal will disagree with the other 3 unless the receiver's clock and the satellites' clocks are perfectly synchronized. The receiver can now adjust its own clock until all 4 spheres intersect at one point. This gives the receiver not only its exact location but also the exact time.

When 5 or more signals are available, the receiver can compute its location from every possible combination of 4 of them. The variance between the computed locations gives an indication of the accuracy of the measurement.
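
A rough sketch of that last paragraph (all values are made up: six simulated satellites, a receiver clock 1 ms fast, and about 5 m of noise on each pseudorange; SciPy's generic least-squares routine stands in for a real navigation filter):

```python
from itertools import combinations

import numpy as np
from scipy.optimize import least_squares

C = 299_792_458.0                             # speed of light, m/s
rng = np.random.default_rng(0)

# Simulated scene: six satellites, a true receiver position, clock 1 ms fast.
sats = np.array([[15e6, 10e6, 20e6], [-12e6, 18e6, 15e6], [20e6, -5e6, 16e6],
                 [5e6, -20e6, 18e6], [-18e6, -8e6, 17e6], [8e6, 22e6, 14e6]])
true_pos = np.array([1_113e3, -4_842e3, 3_985e3])
true_bias_m = C * 1e-3                        # receiver clock error, in metres
rho = (np.linalg.norm(sats - true_pos, axis=1) + true_bias_m
       + rng.normal(0.0, 5.0, len(sats)))     # noisy pseudoranges

def residuals(state, sat_subset, rho_subset):
    """Pseudorange residuals; state = (x, y, z, clock_bias_in_metres)."""
    return np.linalg.norm(sat_subset - state[:3], axis=1) + state[3] - rho_subset

def jacobian(state, sat_subset, rho_subset):
    """Analytic Jacobian: unit vectors from each satellite toward the receiver, plus 1 for the bias."""
    diff = state[:3] - sat_subset
    units = diff / np.linalg.norm(diff, axis=1, keepdims=True)
    return np.hstack([units, np.ones((len(sat_subset), 1))])

# Solve once for every combination of four satellites, then compare the fixes.
fixes = []
for idx in combinations(range(len(sats)), 4):
    idx = list(idx)
    sol = least_squares(residuals, x0=np.zeros(4), jac=jacobian,
                        args=(sats[idx], rho[idx]))
    fixes.append(sol.x[:3])
fixes = np.array(fixes)

print("mean fix (m):   ", fixes.mean(axis=0))   # lands close to true_pos
print("scatter (m, 1σ):", fixes.std(axis=0))    # rough indication of accuracy
```

The spread between the 15 individual fixes is what the post means by "the variance between the computed locations".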
 

FAQ: How does a GPS receiver calculate time delay?

How does a GPS receiver calculate time delay?

A GPS receiver calculates time delay by measuring the time it takes for a signal to travel from a satellite to the receiver. The receiver compares the time the signal was sent from the satellite to the time it was received, and uses this information to calculate the distance between the satellite and the receiver. This distance, along with the known speed of the signal, is used to calculate the time delay.
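
As a tiny worked example (the 70 ms delay is a made-up but realistic value):

```python
C = 299_792_458.0      # speed of light in vacuum, m/s
delay_s = 0.070        # hypothetical measured travel time: 70 ms
distance_m = C * delay_s
print(f"{distance_m / 1000:.0f} km")   # about 20985 km
```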

What is the role of atomic clocks in calculating time delay in GPS receivers?

Atomic clocks are used in GPS satellites to provide extremely accurate and precise time measurements. The GPS receiver uses these time measurements to determine the exact time the signal was sent from the satellite, which is crucial in calculating the time delay between the satellite and the receiver.

How many satellites are needed for a GPS receiver to calculate time delay?

A GPS receiver needs signals from at least four satellites to accurately calculate time delay. This is because three satellites are needed to determine the receiver's position in three-dimensional space, and the fourth satellite is used to synchronize the receiver's clock with the atomic clocks in the satellites.
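
Written as equations (a standard textbook formulation, not quoted from the thread), each measured pseudorange ρᵢ mixes the true geometric range to satellite i at (xᵢ, yᵢ, zᵢ) with the one unknown receiver clock bias b:

$$\rho_i = \sqrt{(x_i - x)^2 + (y_i - y)^2 + (z_i - z)^2} + c\,b, \qquad i = 1, \dots, 4$$

Four unknowns (x, y, z, b) therefore need at least four such equations, i.e. four satellites.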

Can atmospheric conditions affect the accuracy of time delay calculations in GPS receivers?

Yes, atmospheric conditions such as ionospheric and tropospheric delays can affect the speed of the signal and therefore impact the accuracy of time delay calculations in GPS receivers. However, GPS receivers have algorithms and techniques in place to compensate for these effects and maintain accurate time delay calculations.

How does the time delay calculated by a GPS receiver help determine location?

The time delay calculated by a GPS receiver, in combination with the known positions of the satellites, is used to determine the distance between the receiver and each satellite. By knowing the distances to at least four satellites, the receiver can use trilateration to pinpoint its exact location on Earth's surface.
