The new time standard isn't your grandfather's atomic clock. Image: Duncan McNeil/Flickr
Throw out that lame old atomic clock that's only accurate to a few tens of quadrillionths of a second. The U.S. has introduced a new atomic clock that is three times more accurate than previous devices.
Atomic clocks are responsible for synchronizing time for much of our technology, including electric power grids, GPS, and the clock on your iPhone. On Apr. 3, the National Institute of Standards and Technology (NIST) in Boulder, Colorado, officially launched its newest standard for measuring time using the NIST-F2 atomic clock, which has been under development for more than a decade.
"NIST-F2 is accurate to one second in 300 million years," said Thomas O'Brian, who heads NIST's time and frequency division, during a press conference Apr. 3. The clock was recently certified by theInternational Bureau of Weights and Measures as the world's most accurate time standard.
The advancement is more than just a feather in the cap for metrology nerds. Precise timekeeping underpins much of our modern world. GPS, for instance, needs accuracy of about a billionth of a second in order to keep you from getting lost. GPS satellites rely on highly precise timing from atomic clocks at the U.S. Naval Observatory (which maintains U.S. military time standards). GPS, in turn, is used to synchronize digital infrastructure such as cell phone networks and the NTP servers that provide the backbone of the internet.
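To see why a billionth of a second matters for navigation, remember that a GPS receiver turns timing into distance at the speed of light. A quick sketch (the one-nanosecond figure is the round number above, not a value from a GPS spec sheet):

```python
# A timing error becomes a ranging error at the speed of light.
C = 299_792_458.0        # speed of light, meters per second
timing_error = 1e-9      # one billionth of a second
range_error = C * timing_error
print(f"1 ns of clock error ~ {range_error:.2f} m of position error")  # ~0.3 m
```

A microsecond of clock error would put you hundreds of meters off; a millisecond would put you in the wrong city.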
Your smartphone doesn't display the time to the sixteenth decimal place, but it still relies on the frequency standards coming from NIST's clocks, which make their measurements in a tightly controlled lab environment. Real-world clocks must operate under harsher conditions, such as temperature swings, significant vibration, and changing magnetic fields, all of which degrade their accuracy. It's important, then, that the ultimate reference standard perform much better than the real-world technologies that depend on it.
Both NIST-F2 and the standard it replaces, NIST-F1, are known as cesium-based atomic fountain clocks. This means they determine the length of a second by measuring a natural vibration inside a cesium atom. Within the clock, lasers push together a ball of 10 million cesium atoms and cool them to near absolute zero (which helps reduce noise). The ball is tossed up through a 3-foot chamber, passing through a microwave beam along the way. The microwaves kick some of the cesium atoms into a higher energy state, which causes them to emit light.
The cesium ball is tossed up and down many times, with the frequency of the microwave beam tuned slightly on each pass. Engineers do this to search for a particular frequency, and they know they've found it when the microwaves excite the greatest number of atoms, producing the maximum amount of light. That frequency, 9,192,631,770 Hz, is a natural resonance of cesium, and it defines the length of the second in our modern world.
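The search procedure can be pictured as a simple peak-finding loop. Here's a toy sketch in Python (purely illustrative: the line shape, linewidth, and scan range are made up, and NIST's real control loop is far more sophisticated):

```python
import numpy as np

CS_RESONANCE_HZ = 9_192_631_770.0   # cesium's natural resonance frequency
LINEWIDTH_HZ = 1.0                  # made-up width for the toy line shape

def fluorescence(freq_hz):
    """Fake detector signal: brightest when the microwaves hit the resonance."""
    detuning = freq_hz - CS_RESONANCE_HZ
    return 1.0 / (1.0 + (detuning / LINEWIDTH_HZ) ** 2)   # Lorentzian peak

# Sweep the microwave frequency across a narrow band and keep the value
# that makes the atoms glow the most.
freqs = np.linspace(CS_RESONANCE_HZ - 10, CS_RESONANCE_HZ + 10, 2001)
signals = np.array([fluorescence(f) for f in freqs])
best = freqs[np.argmax(signals)]
print(f"brightest response at {best:.1f} Hz")   # ~9,192,631,770 Hz
```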
The previous generation of atomic clock was already quite good at figuring out the length of a second, but it had a few small sources of error. NIST-F1 operates at room temperature, so the walls of the chamber in which the cesium ball is tossed emit a small amount of thermal radiation. This radiation interferes with the atoms, shifting their energy levels ever so slightly. Cooled with liquid nitrogen to about -316 degrees Fahrenheit, NIST-F2 virtually eliminates this excess radiation, reducing the shift 100-fold.
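As a quick unit check (illustration only), that Fahrenheit figure lands right around liquid nitrogen's boiling point of roughly 77 kelvin:

```python
# Convert the quoted operating temperature to kelvin.
temp_f = -316.0
temp_k = (temp_f - 32.0) * 5.0 / 9.0 + 273.15
print(f"{temp_f} F ~ {temp_k:.0f} K")   # ~80 K, near liquid nitrogen's 77 K
```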
After steady improvement since atomic clocks were invented in the 1950s, researchers think they are reaching the limit of accuracy with the technology. Any clock that is more precise would begin to feel subtle effects explained by Einstein's theory of relativity. Massive objects warp time: the Earth, an extremely massive object, causes clocks closer to its surface to run slower relative to those above it. Cesium atoms in fountain clocks actually experience time differently at the top of the 3-foot chamber than at the bottom. An extremely precise measurement device would be confused by this slight difference, making its timekeeping inaccurate.
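The size of that effect is easy to estimate with the standard weak-field approximation, in which the fractional frequency shift between two heights is roughly g·h/c². A hedged sketch (the 0.9-meter height is just the 3-foot chamber converted to metric):

```python
# Gravitational time dilation over the fountain's height: shift ~ g*h/c^2.
G_ACCEL = 9.81             # m/s^2, gravity at Earth's surface
C = 299_792_458.0          # m/s, speed of light
height = 0.9               # meters, roughly the 3-foot toss height

shift = G_ACCEL * height / C**2
print(f"fractional time difference ~ {shift:.1e}")   # ~1e-16
```

That is about one part in ten quadrillion over the height of the chamber, the same order as NIST-F2's stated accuracy, which is why a much more precise clock would start to feel its own position in Earth's gravity.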
"In the not to distant future, we will end up redefining the second," said Steve Jefferts, who led the NIST project to develop the new atomic clock.
In 1967, scientists got together and defined one second as the time it takes a cesium atom to oscillate 9,192,631,770 times between two particular energy levels. If engineers want to get more accurate, they will have to find some other natural process that can be used to measure time, and that "will require a whole bunch of people agreeing for that to happen," said Jefferts. There is currently a large international push to find a method of even better timekeeping, though exactly which one will prevail remains to be seen.
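Read as arithmetic, the 1967 definition simply fixes the second as exactly 9,192,631,770 of those oscillations, so a single oscillation lasts the reciprocal of that count (a trivial illustration):

```python
# One second is defined as exactly 9,192,631,770 cesium oscillations,
# so a single oscillation lasts the reciprocal of that count.
CS_OSCILLATIONS_PER_SECOND = 9_192_631_770
period_s = 1.0 / CS_OSCILLATIONS_PER_SECOND
print(f"one oscillation lasts ~ {period_s:.3e} s")   # ~1.09e-10 seconds
```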
What will we do once we reach the ability to break down time into super-tiny, hyper-accurate units? Nobody knows. But both O'Brian and Jefferts point out that the technological applications of today's atomic clocks weren't apparent when they were first invented. Tomorrow's time measurements could similarly bring a whole slew of new technologies with them.
For further explanations and speculations on time measurement from the USNO, check out the great video below from The Atlantic, posted a couple of months ago.