The Unsplittable Bit

James Gleick asks: as scientists crunch and quantize the world, will they ever reach the end?


A countryman came into a telegraph office in Bangor, Maine, with a message, and asked that it be sent immediately. The operator took the message as usual, put his instrument in communication with its destination, ticked off the signals upon the key, and then, according to the rule of the office, hung the message paper on the hook with others that had been previously sent. … The man lounged around some time, evidently unsatisfied. “At last,” says the narrator of the incident, “his patience was exhausted, and he belched out, ‘Ain’t you going to send that dispatch?'” The operator politely informed him that he had sent it. “No, yer ain’t,” replied the indignant man; “there it is now on the hook.”—Harper’s New Monthly Magazine, 1873

A hard lesson to learn was the difference between a message and the paper on which it was written. The telegraph was a great teacher. The information had to be divorced from the physical object. It was abstracted—encoded, first as dots and dashes and then again as electrical impulses, to be sent along wires and, soon, beamed through the ether.

In our sophisticated age, we process information with computers, we store it in “the cloud,” and we carry it about in our portable devices and our very cells. Information is the vital principle of our world.

But what is information? Thanks to the mathematical information theory created by Claude Shannon in 1948, we can measure information in bits. As the fundamental unit of information, Shannon decided, a bit would represent the amount of uncertainty that exists in the flipping of a coin: 1 or 0. Using his tool kit of theorems and algorithms, a mathematician or engineer could quantify not just the number of symbols (words or phonemes or letters or interruptions in an electrical circuit) but also the relative probabilities of each symbol’s occurrence. Information, as Shannon defined it, became a measure of surprise—of uncertainty. These are abstractions; a message is no longer tangible or material, like a piece of paper.
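Shannon’s yardstick can be written down in one line: a source whose symbols occur with probabilities pᵢ carries H = −Σ pᵢ log₂ pᵢ bits per symbol. Here is a minimal Python sketch of that formula, offered as an illustration of the idea rather than anything from Shannon’s paper or this essay:

```python
import math
from collections import Counter

def entropy_bits(probabilities):
    """Shannon entropy in bits: H = -sum(p * log2(p)) over the nonzero p."""
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

# A fair coin is maximally uncertain: exactly one bit per flip.
print(entropy_bits([0.5, 0.5]))    # 1.0

# A loaded coin is less surprising, so it carries less information.
print(entropy_bits([0.9, 0.1]))    # about 0.47

# Estimating symbol probabilities from an actual message.
message = "WHAT HATH GOD WROUGHT"
counts = Counter(message)
probs = [n / len(message) for n in counts.values()]
print(entropy_bits(probs))         # bits per character for this short sample
```

The loaded coin is the whole point: the symbols are the same, but because one outcome is far more probable than the other, each flip resolves less uncertainty and so carries fewer bits.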

“The fundamental problem of communication,” Shannon declared, “is that of reproducing at one point either exactly or approximately a message selected at another point.” Simple enough—or so it seemed. “What is significant is the difficulty of transmitting the message from one point to another.” “Point” was a carefully chosen word. The origin and destination of a message could be separated in space or in time; information storage, as in a phonograph record, counts as a communication. Those distant points could be the telegraph offices of Baltimore and Washington, or they could be planets light-years apart, or they could be neurons in a human brain. Messages are formed from symbols, not from atoms. But even though information is weightless, transmission has a cost.

Warren Weaver, who wrote a companion essay for Shannon’s classic book, The Mathematical Theory of Communication, saw the sweep and grandeur of this abstract view of information, which embraced “not only written and oral speech, but also music, the pictorial arts, the theatre, the ballet, and in fact all human behavior.” No wonder information theory quickly began to influence researchers in fields as diverse as genetics and psychology.

One field that seemed to be left out was the most important, the most fundamental of all: physics. In the years after World War II, when physicists enjoyed more prestige than ever before, the great news of science appeared to be the splitting of the atom and the control of nuclear energy. Theorists searched for new subatomic particles and the laws governing their interaction. The business of communications research seemed far removed—a business for electrical engineers. Particle physicists had quarks; they did not seem to need bits.

They do now. One of the first to bring information theory firmly into physics was Rolf Landauer, who escaped Nazi Germany as a boy in 1938, came to New York, served in the Navy as an “electronics technician’s mate,” got his Ph.D. in physics from Harvard University, and went on to spend most of his career as a research leader at IBM. One of his landmark papers bore the title “Information is Physical.” Lest anyone miss the point, he titled a later essay (his last, as it turned out) “Information is Inevitably Physical.” He insisted that bits are not abstract after all, or not merely abstract. He reminded his colleagues again and again that information cannot exist without some physical embodiment, whether a mark on a stone tablet, a hole in a punched card, or a subatomic particle with spin up or down. Information is “therefore”—pause for timpani and trumpets—”tied to the laws of physics and the parts available to us in our real physical universe.”

At IBM, of course, the “parts” of greatest interest were digital electronic computers. Landauer and his colleagues, particularly Charles H. Bennett, were linking information and physics by studying what they called the thermodynamics of computation. This struck some as odd at first, because information processing was mostly treated as disembodied. “The thermodynamics of computation, if anyone had stopped to wonder about it, would probably have seemed no more urgent as a topic of scientific inquiry than, say, the thermodynamics of love,” Bennett said. Or the thermodynamics of thought. How many calories does it cost to have an idea?

Actually, the quantum theorist and mathematician John von Neumann had considered this question in 1949 while working on the giant EDVAC electronic computer (with its 6,000 vacuum tubes and roughly five kilobytes of memory). He made a back-of-the-envelope calculation of the amount of heat that must, he reckoned, be dissipated for every elementary act of information processing—every bitwise computation. In 1961, Landauer tried to justify von Neumann’s formula and discovered that he could not. In fact, it seemed that many logical operations have no energy cost at all. When a bit flips from 0 to 1, or vice versa, energy is conserved, which makes sense, in a way, because the information is preserved, too. Bennett found that the one element of computing that must, indisputably, require dissipation of heat is erasure. When an electronic computer clears a capacitor, it releases energy. For a bit to be lost, heat must be dissipated. It was a physicist’s way of discovering an important modern lesson: Forgetting takes work.
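The floor Landauer identified is usually quoted as kT ln 2 joules of heat for every bit erased, where k is Boltzmann’s constant and T is the temperature. Here is a quick sketch of the arithmetic, with room temperature (300 K) assumed purely for illustration:

```python
import math

BOLTZMANN = 1.380649e-23  # Boltzmann constant, joules per kelvin

def landauer_limit(temperature_kelvin):
    """Minimum heat dissipated in erasing one bit: k * T * ln(2)."""
    return BOLTZMANN * temperature_kelvin * math.log(2)

room_temperature = 300.0  # kelvin, assumed for illustration
print(f"{landauer_limit(room_temperature):.2e} J per erased bit")  # ~2.9e-21 J
```

A real chip dissipates vastly more than this per operation; the point of the bound is that erasure, unlike a reversible bit flip, can never be made free.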

These physical limits, Landauer argued, have a broader significance. Too many people assume that computation has almost divine ability—that at least in theory, powerful enough computers can solve every problem in physics. “We have all been indoctrinated by the mathematicians,” he said, “stating that with enough successive operations, any accuracy requirement can be met.” But unlimited memory cannot exist in a finite universe; in any “bounded volume of space and time,” information, too, must ultimately be limited. The next generation of cosmologists took that limitation seriously. They did the math—estimated the bit count of the cosmos. Seth Lloyd, director of the Center for Extreme Quantum Information Theory at the Massachusetts Institute of Technology, says that by considering the universe as a vast computer, taking into account the reduced Planck constant, the speed of light and the amount of time since the big bang, we can calculate that the universe has performed something on the order of 10¹²⁰ “ops” in its entire history. Considering “every degree of freedom of every particle,” the universe could now hold something like 10⁹⁰ bits.
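Lloyd’s tally rests on a quantum speed limit: a system with energy E can perform at most roughly 2E/(πħ) elementary operations per second, so multiplying that rate by the age of the universe gives a total. The sketch below plugs in rough, assumed inputs (a mass of order 10⁵³ kilograms for the observable universe, an age of about 13.8 billion years); it is an order-of-magnitude illustration, not Lloyd’s published calculation.

```python
import math

HBAR = 1.054571817e-34       # reduced Planck constant, J*s
C = 2.998e8                  # speed of light, m/s
AGE_OF_UNIVERSE_S = 4.35e17  # ~13.8 billion years, in seconds

# Assumed for illustration: mass of the observable universe ~ 10^53 kg.
MASS_KG = 1e53

def total_ops(mass_kg, age_s):
    """Rough upper bound on elementary operations performed so far,
    using a maximum rate of 2E / (pi * hbar) operations per second."""
    energy = mass_kg * C**2
    return 2 * energy * age_s / (math.pi * HBAR)

ops = total_ops(MASS_KG, AGE_OF_UNIVERSE_S)
print(f"~10^{math.log10(ops):.0f} ops")  # ~10^121 with these rough inputs,
                                         # the same ballpark as the 10^120 above
```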

Information still seems a sort of abstraction. Bits are binary choices, coin flips, yes/no, 1/0, on/off—insubstantial. How can they be as fundamental to physics as the traditional building blocks of matter and energy? Lloyd puts it this way: “Earth, air, fire, and water in the end are all made of energy, but the different forms they take are determined by information. To do anything requires energy. To specify what is done requires information.”

The late John Archibald Wheeler, visionary relativist, colleague of both Einstein and Bohr, the theorist who gave black holes their name, said the same thing in monosyllables: “it from bit.” That was the title of a famous 1989 paper that reads like a manifesto. “Otherwise put,” he wrote, “every ‘it’—every particle, every field of force, even the spacetime continuum itself, derives its function, its meaning, its very existence . . . [from] bits.” Nature—quantum theorists had learned—comes in irreducible discrete pieces, or quanta. Binary choices are quanta too. This is one way of fathoming the paradox of the observer: that the outcome of an experiment is affected, or even determined, when it is observed. The experimenter is not only observing but also asking questions and making statements that must ultimately be expressed in discrete bits. “What we call reality,” Wheeler wrote, “arises in the last analysis from the posing of yes/no questions.”

He left behind a challenge for quantum information science, for physicists and computer scientists both. He urged them to translate physics from the language of the continuum to the language of bits. “If and when we learn how to combine bits in fantastically large numbers to obtain what we call existence, we will know better what we mean both by bit and by existence.”

Why does nature appear quantized? Because information is quantized. The bit is the ultimate unsplittable particle.

James Gleick is the author of six books, including Chaos: Making a New Science and The Information: A History, a Theory, a Flood, from which this essay is adapted.

 
