A countryman came into a telegraph office in Bangor, Maine, with a message, and asked that it be sent immediately. The operator took the message as usual, put his instrument in communication with its destination, ticked off the signals upon the key, and then, according to the rule of the office, hung the message paper on the hook with others that had been previously sent. ... The man lounged around some time, evidently unsatisfied. “At last,” says the narrator of the incident, “his patience was exhausted, and he belched out, ‘Ain’t you going to send that dispatch?’” The operator politely informed him that he had sent it. “No, yer ain’t,” replied the indignant man; “there it is now on the hook.”—Harper’s New Monthly Magazine, 1873
A hard lesson to learn was the difference between a message and the paper on which it was written. The telegraph was a great teacher. The information had to be divorced from the physical object. It was abstracted—encoded, first as dots and dashes and then again as electrical impulses, to be sent along wires and, soon, beamed through the ether. In our sophisticated age, we process it with computers, we store it in “the cloud,” and we carry it about in our portable devices and our very cells. Information is the vital principle of our world.
But what is information? Thanks to the mathematical information theory created by Claude Shannon in 1948, we can measure information in bits. As the fundamental unit of information, Shannon decided, a bit would represent the amount of uncertainty that exists in the flipping of a coin: 1 or 0. Using his tool kit of theorems and algorithms, a mathematician or engineer could quantify not just the number of symbols (words or phonemes or letters or interruptions in an electrical circuit) but also the relative probabilities of each symbol’s occurrence. Information, as Shannon defined it, became a measure of surprise—of uncertainty. These are abstractions; a message is no longer tangible or material, like a piece of paper.
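Shannon’s measure can be stated in a few lines of code. The sketch below (in Python, with made-up probabilities chosen only for illustration) computes the entropy H = −Σ p·log₂(p) of a set of symbol probabilities, the average surprise in bits; the function name shannon_entropy is merely illustrative, not anything from Shannon’s paper.

```python
import math

def shannon_entropy(probabilities):
    """Average information in bits: H = -sum(p * log2(p)) over symbols with p > 0."""
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

# A fair coin carries exactly one bit of uncertainty per flip.
print(shannon_entropy([0.5, 0.5]))   # 1.0

# A heavily biased coin is less surprising, so each flip carries less information.
print(shannon_entropy([0.9, 0.1]))   # about 0.47 bits
```

The more lopsided the probabilities, the less surprise each symbol delivers, and the fewer bits it is worth.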
“The fundamental problem of communication,” he declared, “is that of reproducing at one point either exactly or approximately a message selected at another point.” Simple enough—or so it seemed. “What is significant is the difficulty of transmitting the message from one point to another.” “Point” was a carefully chosen word. The origin and destination of a message could be separated in space or in time; information storage, as in a phonograph record, counts as a communication. Messages are formed from symbols, not from atoms. Those distant points could be the telegraph offices of Baltimore and Washington, or they could be planets light-years apart, or they could be neurons in a human brain. But even though information is weightless, transmission has a cost.
Warren Weaver, who wrote a companion essay for Shannon’s classic book, The Mathematical Theory of Communication, saw the sweep and grandeur of this abstract view of information, which embraced “not only written and oral speech, but also music, the pictorial arts, the theatre, the ballet, and in fact all human behavior.” No wonder information theory quickly began to influence researchers in fields as diverse as genetics and psychology.
One field that seemed to be left out was the most important, the most fundamental of all: physics. In the years after World War II, when physicists enjoyed more prestige than ever before, the great news of science appeared to be the splitting of the atom and the control of nuclear energy. Theorists searched for new subatomic particles and the laws governing their interaction. The business of communications research seemed far removed—a business for electrical engineers. Particle physicists had quarks; they did not seem to need bits.
They do now. One of the first to bring information theory firmly into physics was Rolf Landauer, who escaped Nazi Germany as a boy in 1938, came to New York, served in the Navy as an “electronics technician’s mate,” got his Ph.D. in physics from Harvard University, and went on to spend most of his career as a research leader at IBM. One of his landmark papers bore the title “Information is Physical.” Lest anyone miss the point, he titled a later essay (his last, as it turned out) “Information is Inevitably Physical.” He insisted that bits are not abstract after all, or not merely abstract. He reminded his colleagues again and again that information cannot exist without some physical embodiment, whether a mark on a stone tablet, a hole in a punched card, or a subatomic particle with spin up or down. Information is “therefore”—pause for timpani and trumpets—“tied to the laws of physics and the parts available to us in our real physical universe.”