Zero is just 1,500 years old. Before it, there was nothing.

How humans invented zero—and why some tried desperately to do without it.
[Image: a close-up of a combination lock, with the numbers 90 and 0 visible. Starting from zero. DepositPhotos]


First, there was nothing. Then, there was zero. So ubiquitous is the little oval that imagining life without it seems impossible. It’s the symbolic swim donut keeping math, science, and the internet afloat. But it wasn’t always there.

Accounting is an ancient profession. Sumer, the earliest known Mesopotamian civilization, lacked a positional numbering system, so it had no need for placeholders. A subsequent empire, Babylon, counted positionally, so its number-crunching class used two slanted wedges to mark an empty place—to distinguish a sum like 507 from 57. Across the world, the Mayan civilization came up with its own solution to the same problem, placing a shell glyph where modern mathematicians might place a 0. Some experts argue these wedges are ground zero, to borrow a phrase, but most academics attribute the invention of zero as a number—not a placeholder but a symbol in its own right, one that can be used in equations—to India.
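The Babylonians’ placeholder problem is easy to demonstrate in modern notation: in a positional system, each digit’s value depends on where it sits, so an empty place has to be marked or the whole number collapses. A minimal sketch (the function name `positional_value` is ours, for illustration):

```python
def positional_value(digits, base=10):
    """Evaluate a list of digits under positional notation."""
    value = 0
    for d in digits:
        value = value * base + d  # shift everything left one place, then add the new digit
    return value

# With a placeholder in the tens place, the 5 stays in the hundreds place:
print(positional_value([5, 0, 7]))  # 507
# Drop the placeholder, and the same two digits mean something else entirely:
print(positional_value([5, 7]))     # 57
```

The Babylonians worked in base 60 rather than base 10, but the ambiguity is identical: without a symbol for “nothing here,” a scribe couldn’t tell 507 from 57.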

Sometime around the year 400, Silk Road traders were tabulating the sum total of spices sold or woven rugs purchased with the aid of the Bakhshali manuscript. Discovered by a farmer in 1881 in a field in present-day Pakistan, the weathered document looks like a bit of ancient math homework. But scholars at Oxford University believe it’s the earliest known evidence of the invention of zero, represented in the manuscript by a filled-in dot.

The Bakhshali paper, which is written in a swirling Sanskrit, uses the dot as something between a placeholder and a number. “It features a problem to which the answer is zero,” The Guardian noted in 2017, “but here the answer is left blank.” But just as “missing links” help us understand the evolution of new species, historians see the Bakhshali paper as essential evidence in understanding the evolution of a new idea. By 628, when Hindu astronomer Brahmagupta was hard at work, Indian scientists were using the dotted zero as a full-blown figure—and treating it like second nature.

From India, zero spread—like fire in some circles, and like cursed embers in others. It was most quickly and eagerly embraced in the Arab world. In the 9th century, Mohammed ibn-Musa al-Khowarizmi, a Persian mathematician, used the “sifr” (translation: empty) digit to invent algebra.

What took everyone else so long? Fear. Or at least that’s what many experts believe.

“Some claim the Greeks flirted with the idea [of zero] but, finding the concept of the void too frightening in their Aristotelian framework, passed it on,” mathematician Manil Suri wrote in a 2017 New York Times op-ed. By comparison, Oxford mathematician Marcus Du Sautoy told The Guardian, Indian culture “is quite happy to conceive of the void, to conceive of the infinite.”

Sweeping cultural generalizations aside, there’s something to this claim. When medieval Christians were confronted with zero, they decided it was satanic—in claiming absence, they reasoned, the number turned against God—and banished it from use. Merchants, who could no longer do without something that hadn’t existed a few hundred years earlier, continued to use zero in secret, writing their documents in code.

Of course, the whole of the Western world came around eventually. In the 17th century, zero did some heavy lifting for Isaac Newton when he developed calculus. And the 20th-century adaptation of binary math into binary code owes 0 exactly 50 percent of its success, the other progenitor being 1. It took a thousand years—that’s three zeroes—but this wide-eyed numeral won us all over.