Think of the most fussy science teacher you ever had. The one who docked your grade if the sixth decimal place in your answer was rounded incorrectly. Now imagine work that even that teacher would hate for being too anal-retentive. That's the kind of work that goes into defining the meter, the second, and other international standard units of measure.
Here's a look at some ridiculously precise standards and the role that elements have played in defining, redefining, and re-redefining them over the ages.
There are seven base units in the international metric system, and over the past century, metrologists (people for whom measurement isn't the start of science—it is science) have gotten increasingly picky about defining these seven quantities. And it turns out that some of the best tools metrologists have to make measurements are elements on the periodic table. Unlike even the top measuring instruments, elements are exactly the same everywhere, allowing for perfectly reproducible results. And the sheer variety of the table ensures that, no matter what obscure task you have in mind, there's probably an element for that.
The first definition of the meter wasn't bad, for the 1790s: Exactly one ten-millionth of the distance between the Equator and the North Pole, as measured through Paris. Unfortunately, scientists botched the measurement, and the length of the meter that came into common use was later found to be 0.2 mm off the supposed definition, an intolerable gap.
So in 1889 scientists replaced the meridian definition with a long bar made of the elements platinum and iridium. Someone made a scratch near one end of the bar, then made a scratch near the other, and the distance between the scratches became, from then on, 1.000000... meter, to as many decimal places as you like.
But defining a meter this way only invited more questions. Like what temperature are we talking about? Things expand when they heat up, after all. And what's the geometry here? A rod of that length will droop if not supported properly, and will droop differently depending on where it's supported. To head off any ambiguities, scientists decided the rod had to be measured at 0°C and standard atmospheric pressure, and supported on two cylinders one centimeter in diameter, with each of the cylinders in the same horizontal plane and 571 millimeters apart.
Naturally, this definition was no good, either. For one, it's questionable to use centimeters and millimeters to define a meter. For another, on a microscopic scale the scratches have their own width—where does the measurement start? Even worse, metrologists hated that the definition relied on an artifact, a man-made object, since this was supposed to be a universal unit, not the property of one country. (Indeed, the fact that scientists from other countries sometimes had to hike it to Paris and cool things down to 0°C and make their own scratches on an identical rod and bring it back home was a hindrance to spreading the standard.)
What metrologists coveted was an "operational" definition—they wanted to discover a physical process that would produce something with a magnitude of exactly one meter every time. To put it more colloquially, and anachronistically, scientists were after an "e-mailable" definition—a purely verbal set of instructions that could be sent around the world, and that would allow scientists anywhere to perform an experiment and reproduce the same meter.
Scientists finally achieved this goal in the 1960s, with the noble gas krypton. All noble gases (think of "neon" lights) emit strong, colored light when excited, and krypton happens to emit a real beauty, a sharp beacon of orange light that's easy to measure. So a meter became 1,650,763.73 wavelengths of this orange light from a krypton-86 atom. That's an e-mailable definition, since all krypton atoms are identical, and any scientist could just pick up a krypton discharge tube if needed. Scientists had finally relegated the platinum-iridium rod to the velvet casket of a museum.
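As a quick sanity check of my own (not from the article), inverting that wavelength count recovers the wavelength of the krypton-86 line, which lands squarely in the orange part of the visible spectrum:

```python
# 1960 definition: 1 meter = 1,650,763.73 wavelengths of krypton-86's orange line.
WAVELENGTHS_PER_METER = 1_650_763.73

wavelength_m = 1 / WAVELENGTHS_PER_METER   # wavelength of that orange light, in meters
wavelength_nm = wavelength_m * 1e9         # same length in nanometers

print(f"{wavelength_nm:.2f} nm")  # 605.78 nm; orange light sits roughly at 590-620 nm
```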
Never satisfied, though, metrologists redefined the meter again in 1983, getting rid of even the krypton atom. A meter is now the distance light travels in a vacuum in 1/299,792,458th of a second.
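Since 1983 the speed of light has been an exact, defined constant, so the definition is pure arithmetic. A minimal sketch of my own (not from the article), using exact rational arithmetic to avoid floating-point noise:

```python
from fractions import Fraction

C = 299_792_458  # speed of light in a vacuum, in m/s; exact by definition since 1983

def light_distance_m(seconds):
    """Distance (in meters) light travels in a vacuum in the given time."""
    return C * seconds

# Light traveling for 1/299,792,458th of a second covers exactly one meter.
print(light_distance_m(Fraction(1, C)))  # 1
```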
Of course, that definition assumes you know how long a second really is...
Tune in tomorrow for the next installment of our exploration of the standards that make science tick. The series is written by Sam Kean, author of The Disappearing Spoon—a collection of funny and peculiar stories hidden throughout the periodic table.
Of course we know how long a second really is. A second is simply the amount of time it takes light in a vacuum to travel 299,792,458 meters.
I love and hate things that are that precise. I hate them because I had to suffer through Analytical Chemistry, and there is nothing I hate more than trying to obtain experimental results to the 4th decimal. However, I have a lot of respect for the people who dedicate their lives to ensuring that the standards remain that way. I was reading about the kilogram standard not too long ago, and the process that goes into verifying that other standards are still standard (newsflash! the original kilogram, which all others are checked against, is losing weight!). I can't imagine the work and dedication that goes into ensuring that those are always correct. Keep up the good work, standard keepers! The scientific world depends on it!
Cool segment, look forward to the rest of the set
PopSci, thank you from the bottom of my heart for finally doing an article on the metric system. It has been thirty-five years since Congress made a half-assed attempt to move America to the measurement system. Then Reagan was our president and everything went downhill from there.....
In time, the definitions are becoming technology-oriented.
What if a post-WWIII scientist wants to measure a meter? Will he have equipment to measure how far light travels in 1/299,792,458th of a second, or for that matter, 299,792,458 meters in 1 second?
And apart from that, all standards depend on one or more other standards, as in the case above, where the definition depends on environmental conditions, the second, etc.
Why not a definition that does not require technology, or any other standard, to define? Wouldn't that be the best and most standard way to go?
For an interesting book on the periodic table: Eric Scerri, The Periodic Table, Oxford University Press, 2007. Available from Amazon and the other usual places.
Don't forget about errors in analytical chemistry: for every single measurement and answer, you have to know how wrong you might be, to x decimal places.
No disrespect, but I think these guys need a job. I wonder how much change would even occur with temperature, because when we solve problems with these things, the answers turn out to be in millimeters!
It is a pity that the meter is introduced as just an arbitrary length, made more so by simply enumerating its various definitions over time.
It would have been a far better introduction to state the goals of the whole system of measures, namely:
a) A written description should be enough for anyone with suitable instruments to reproduce the unit. This was a breakthrough: the idea that the unit of length would not be the size of a certain object but information (the basis of our technological culture) that could be transmitted and reproduced anywhere. Hence the selection of a 'universal' measure such as the size of the Earth, which could, in principle, be measured anywhere.
Unfortunately, the same technological developments that were one of the main reasons to push for common standards ensured that anything measured with today's instruments could be measured more precisely with tomorrow's, and thus shown to be inexact.
b) Use of decimal fractions. Gone would be the arbitrary relationships between successive units: 12 inches to a foot, 3 feet to a yard, 1,760 yards to a mile. Conversion between different units would only require moving the decimal point back and forth.
The British pound underwent this conversion when Britain dropped shillings and old pennies and went to plain pence, a hundredth of a pound. Decimalization, as the process is called, has no relation to the kind of unit being decimalized; you can decimalize units of length, weight, or currency. It is an independent goal.
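Point (b) is easy to see side by side. A throwaway sketch of my own (not part of the comment): converting a mile to inches takes three memorized, unrelated factors, while the metric conversion is just a power of ten:

```python
# Imperial: each step in the chain uses a different arbitrary factor.
INCHES_PER_FOOT = 12
FEET_PER_YARD = 3
YARDS_PER_MILE = 1760
inches_per_mile = INCHES_PER_FOOT * FEET_PER_YARD * YARDS_PER_MILE
print(inches_per_mile)  # 63360

# Metric: kilometers to millimeters is 10^6; just slide the decimal point.
mm_per_km = 10 ** 6
print(mm_per_km)  # 1000000
```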
c) Standard names: each unit would have a base name, and a series of prefixes would scale it up and down. The same prefixes would be used for all units; thus kilo- is the thousand prefix for both length (kilometer) and weight (kilogram).
The standard defined prefixes for each power of ten, which proved to be overkill. In the end, only the prefixes for powers of a thousand see wide use: kilo-, mega-, giga- on the higher ranges; milli-, micro-, nano-, and so on for the smaller sizes.
Some non-thousands prefixes survive: centimeters are used quite frequently, since a unit in the range of an inch is handy. A city block is about a square hectometer, or a hectare ('hecto-' being the prefix for a hundred), roughly two football fields.
d) Practicality: base units had to be of practical use. The unit of length had to be about a yard long, since units of roughly that size are in use everywhere, simply because it is a handy size to have. Here they missed with the unit of weight, the gram, which turned out to be too small. It would have been wiser to take the weight of a kilogram as the base unit, make the gram a thousandth of that base unit, and the ton a thousand times it.
The pursuit of these goals was not limited to the units that survive in common use to this day. Angles were to be measured in grads, of which there are a hundred to a right angle instead of ninety degrees. Had this prospered, the meter would not seem so arbitrary. A nautical mile is defined as a minute of a degree of latitude, in contrast with the statute mile, which is defined by anecdote (a thousand paces of a Roman legion). A kilometer would then turn out to be a centesimal minute of latitude (a hundred minutes to a grad).
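The grad arithmetic checks out against the original meridian definition of the meter (Equator to pole = 10,000 km). A quick verification of my own, not part of the comment:

```python
QUADRANT_KM = 10_000         # Equator to pole, per the original 1790s meter definition
GRADS_PER_RIGHT_ANGLE = 100  # the quadrant spans one right angle = 100 grads

km_per_grad = QUADRANT_KM / GRADS_PER_RIGHT_ANGLE  # 100 km of latitude per grad
km_per_centesimal_minute = km_per_grad / 100       # a centesimal minute is 1/100 grad
print(km_per_centesimal_minute)  # 1.0 -> one kilometer per centesimal minute of latitude

# For comparison, the sexagesimal version: 90 degrees x 60 minutes per quadrant.
km_per_arc_minute = QUADRANT_KM / (90 * 60)
print(round(km_per_arc_minute, 3))  # 1.852 -> the familiar nautical mile
```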
It is unfortunate that the USA remains, along with Myanmar and Liberia, one of the only countries to stick to an obsolete system of measurement. Articles such as this are not particularly helpful, because they fail to explain the rationale behind the metric system and present it as something just as arbitrary as the one in use, so that readers can claim that "mine is as good as yours." The article fails to explain why it is a system of measures, not just an arbitrary definition of length. And even in that, it turns out not to be so arbitrary, as any sailor can tell you: the relationship between a nautical mile and differences in latitude and longitude is quite handy.