Atomic clocks

An atomic clock is a clock that measures time by monitoring the resonant frequency of atoms. It is based on atoms having different energy levels. Electron states in an atom are associated with different energy levels, and in transitions between such states atoms absorb or emit electromagnetic radiation at a very specific frequency. This phenomenon serves as the basis for the International System of Units' (SI) definition of a second:

The second, symbol s, is the SI unit of time. It is defined by taking the fixed numerical value of the caesium frequency Δν_Cs, the unperturbed ground-state hyperfine transition frequency of the caesium-133 atom, to be 9192631770 when expressed in the unit Hz, which is equal to s−1.
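The definition above can be restated as a unit conversion: counting exactly 9192631770 cycles of the caesium hyperfine radiation marks exactly one second. A minimal illustrative sketch (not actual metrology code; the function name is a hypothetical helper):

```python
# The caesium hyperfine transition frequency, fixed by the SI definition.
DELTA_NU_CS = 9_192_631_770  # Hz

def cycles_to_seconds(cycles: int) -> float:
    """Convert a count of caesium-133 hyperfine cycles to elapsed seconds."""
    return cycles / DELTA_NU_CS

print(cycles_to_seconds(9_192_631_770))       # one full count -> 1.0 second
print(cycles_to_seconds(9_192_631_770 * 60))  # sixty counts -> 60.0 seconds
```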

This definition is the basis for the system of International Atomic Time (TAI), which is maintained by an ensemble of atomic clocks around the world. The system of Coordinated Universal Time (UTC) that is the basis of civil time implements leap seconds to keep clock time within one second of Earth's rotation while remaining based on the atomic definition of the second, though leap seconds are scheduled to be phased out in 2035.
The accurate timekeeping capabilities of atomic clocks are also used for navigation by satellite networks such as the European Union's Galileo Programme and the United States' GPS. The timekeeping accuracy of the involved atomic clocks is important because the smaller the error in time measurement, the smaller the error in the distance obtained by multiplying the time by the speed of light: a timing error of one nanosecond (10−9 s) translates into a positional error of almost 30 centimetres (11.8 in).
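The timing-to-distance relationship above is a single multiplication, sketched here (the function name is an illustrative assumption, not a GPS API):

```python
# Distance error = timing error x speed of light.
SPEED_OF_LIGHT = 299_792_458  # m/s, exact by definition

def distance_error_m(timing_error_s: float) -> float:
    """Positional error in metres resulting from a clock error in seconds."""
    return timing_error_s * SPEED_OF_LIGHT

# A one-nanosecond clock error is almost 30 cm of positional error.
print(distance_error_m(1e-9))  # ~0.2998 m
```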
The main variety of atomic clock uses caesium atoms cooled to temperatures that approach absolute zero. The primary standard for the United States, the National Institute of Standards and Technology (NIST)'s caesium fountain clock named NIST-F2, measures time with an uncertainty of 1 second in 300 million years (relative uncertainty 10−16). NIST-F2 was brought online on 3 April 2014.
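The two figures quoted for NIST-F2 are consistent with each other, as a quick back-of-the-envelope check shows (using the Julian year of 365.25 days as an assumption):

```python
# How many years until a clock with a given relative uncertainty
# accumulates one second of error.
SECONDS_PER_YEAR = 365.25 * 86_400  # Julian year

def years_per_second_error(relative_uncertainty: float) -> float:
    """Years elapsed before accumulated error reaches one second."""
    return 1.0 / (relative_uncertainty * SECONDS_PER_YEAR)

# A relative uncertainty of 1e-16 gives roughly 300 million years.
print(f"{years_per_second_error(1e-16):.2e}")
```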
