A brief, 20,000-year history of timekeeping
As clocks got more accurate, we had to redefine the second.
Over millennia, humankind’s time-tracking has grown increasingly precise. Sundials divided days into hours. Clocks broke hours into quarters and minutes, and finally minutes into seconds. As timepieces evolved, so did scientists’ need for ever-more-exact tickers. They developed devices that relied not on Earth’s wobbly rotation, but on microscopic atomic movements. At the heart of it all is an ever-advancing appreciation for our smallest temporal unit, the second. Modern systems like GPS and cellphones rely on keeping this interval consistent, which makes defining and refining it, well, of the essence.
A hash-marked bone found in the Semliki Valley in the Democratic Republic of the Congo might be the earliest human attempt to count the days. Ten thousand years later, in what’s now Scotland, humans dug moon-shaped pits to track the lunar cycle.
Humans cut days into smaller units by tracking the sun with shadow-casting obelisks and rods. Nearly 2,000 years later, Egyptians refined that method into the earliest known sundial. Babylonian, Greek, Chinese, and Mesoamerican versions followed.
By slowly flowing water from one vessel into another and measuring the liquid level against marked intervals, Egyptians could see how much time had passed—without using sunlight. Similar methods relied on sand, burnt incense, or scored candles.
To help astronomers track stars, Egyptian mathematician Ptolemy mapped the sky onto a globe. He divided each degree of longitude (360 in total) into 60 segments called minutes, and each of those into a further 60 smaller slivers: seconds.
The first known mechanical clock was invented by Chinese monk Yi Xing and scholar Liang Lingzan. As flowing water spun a wheel, an interlocking system of rods and levers marked the time with a drumbeat every quarter hour and a bell every full hour.
By the 13th century, the equinox was 11 days out of sync with the Julian calendar. To rectify the error, English philosopher Roger Bacon used the slivers on Ptolemy’s subdivided globe as units of time. Now one second meant 1/86,400 of a solar day.
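Bacon’s fraction falls out of simple arithmetic: the familiar split of a day into 24 hours, each of 60 minutes, each of 60 seconds. A quick sketch (the constant names here are ours, not the article’s):

```python
# The Babylonian-style division of a solar day.
HOURS_PER_DAY = 24
MINUTES_PER_HOUR = 60
SECONDS_PER_MINUTE = 60

seconds_per_day = HOURS_PER_DAY * MINUTES_PER_HOUR * SECONDS_PER_MINUTE
print(seconds_per_day)  # 86400 -- so one second is 1/86,400 of a solar day
```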
A 15th-century French duke may have owned the first clock to drive its gears with a spring instead of water or weights. The design allowed for compact timepieces like pocket watches, and boosted accuracy. Later versions lost only four minutes a day.
As springs unwound, they became inaccurate, causing problems for precision-craving astronomers like Galileo. So 17th-century Dutch scientist Christiaan Huygens built a pendulum clock. Its 3-foot-long swinging weight lost only one minute per day.
Variations in gravity can throw pendulums off, but researchers at Bell Laboratories found that an electrified quartz crystal vibrates more consistently. Early models erred by one-third of a second each year, allowing for precision measurements like tracking gravity at sea.
Atoms resonate even more reliably than quartz. Using microwaves to track these oscillations, the National Bureau of Standards made a timer accurate to one second in eight months. Today’s most advanced cesium clock loses a second per 300 million years.
Atomic clocks made a more-precise second possible, though it took nearly two decades for researchers to agree on a standard. Finally, they matched a second to the precise frequency of energy a cesium atom releases when its electrons jump.
The Persistence of Memory, by Salvador Dalí
Visible light, which lets us detect faster vibrations than microwaves do, led to optical clocks that err just a second every 140 million years. Too fragile to run longer than a few days, these tickers could eventually cause a redefinition of the second. Again.
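The standard researchers settled on in 1967 pegs the second to 9,192,631,770 oscillations of cesium-133’s hyperfine transition, and a drift of one second per 300 million years reduces to a tiny fractional error. A rough sketch (the 365.25-day year is our simplifying assumption):

```python
# The 1967 SI definition: one second = 9,192,631,770 cycles of the radiation
# from cesium-133's ground-state hyperfine transition.
CESIUM_HZ = 9_192_631_770

# "Loses a second per 300 million years," expressed as a fractional error.
SECONDS_PER_YEAR = 365.25 * 86_400  # assumes a 365.25-day year
error = 1 / (300e6 * SECONDS_PER_YEAR)
print(f"{error:.1e}")  # roughly one part in 10**16
```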
This article was originally published in the September/October 2017 Mysteries of Time and Space issue of Popular Science.