The most profound technologies are those that disappear

The future is smart, invisible, and requires no effort on our part—from smart assistants to biometric security protocols.

JetBlue passengers flying from Boston to Aruba can now present a new kind of boarding pass, one impossible to misplace: their faces. In lieu of handing over a paper ticket or summoning up a smartphone version, beach-bound travelers simply walk up to the gate and pause in front of a camera. After snapping a head shot, the camera relays the image to U.S. Customs and Border Protection. There, biometric software compares it against databases of passport, visa, and immigration images. If the computer finds a match, a screen at the gate flashes a green check mark—the universal “go” sign meaning you’re cleared to drag your wheelie bag and stuffed-animal pillow down the gangway.

Biometric boarding probably wasn’t what Xerox PARC chief technologist Mark Weiser had in mind when he coined the term “ubiquitous computing” in 1988. Yet JetBlue’s experiment is a perfect example of what lies ahead—what the late computer pioneer called “the age of calm technology.” Weiser believed that if computers are ever to become truly useful, they need to get out of the way. “The most profound technologies are those that disappear,” he wrote in 1991. “They weave themselves into the fabric of everyday life until they are indistinguishable from it.”

We’re in the early days of an era when computing is discreetly all around us, whether it’s facial recognition that speeds us through airports, a cast of virtual assistants that reads us the daily rush-hour report, or software that keeps offensive comments offline—all of it made possible by advances in artificial intelligence and machine learning.
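
At its core, the matching step behind a gate camera like JetBlue’s is a nearest-neighbor comparison. Here is a minimal sketch of one-to-one face verification, assuming photos have already been converted to fixed-length embedding vectors by some face-recognition model; the function names, the 128-dimension size, and the 0.8 threshold are illustrative, not details of CBP’s actual system:

```python
import numpy as np

def cosine_similarity(a, b):
    """Cosine similarity between two embedding vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def verify(probe, gallery, threshold=0.8):
    """Return the ID of the best gallery match above threshold, else None."""
    best_id, best_score = None, -1.0
    for traveler_id, reference in gallery.items():
        score = cosine_similarity(probe, reference)
        if score > best_score:
            best_id, best_score = traveler_id, score
    return best_id if best_score >= threshold else None

# Illustrative data: a stored passport-photo embedding and a fresh gate shot.
rng = np.random.default_rng(0)
passport = rng.normal(size=128)                          # embedding on file
gate_shot = passport + rng.normal(scale=0.1, size=128)   # same face, new photo

gallery = {"passport-on-file": passport}
match = verify(gate_shot, gallery)
print("green check" if match else "see an agent")
```

In a production system, the embeddings come from a deep neural network and the threshold is tuned to balance false matches against false rejections, but the comparison itself stays roughly this simple.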

While still confined to smart speakers and smartphones, voice assistants like Alexa, Siri, and Cortana do their work off-stage. They are the advance guard of the invisible-computing shift already under way in our lives. In the span of three years, Alexa has evolved into a smart-home interface, a popular DJ, and the most over-engineered kitchen timer ever built. Google has pushed the idea further: with the help of AI, its Google Assistant can distinguish your voice from your spouse’s, letting it deliver personalized schedules with commuter traffic reports for each.
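
Telling household voices apart is typically framed as closed-set speaker identification: each person enrolls a few sample utterances, the assistant averages their voice embeddings into a “voiceprint,” and new audio is assigned to the nearest one. A minimal sketch under that assumption (the helper names and the 64-dimension embeddings are hypothetical; Google’s actual pipeline is not public):

```python
import numpy as np

def enroll(utterance_embeddings):
    """Average several embeddings of one speaker into a single voiceprint."""
    return np.mean(utterance_embeddings, axis=0)

def identify(embedding, voiceprints):
    """Assign new audio to the household member with the closest voiceprint."""
    return min(voiceprints, key=lambda name: np.linalg.norm(embedding - voiceprints[name]))

# Illustrative enrollment: two household members, three utterances each.
rng = np.random.default_rng(1)
alice_voice, bob_voice = rng.normal(size=64), rng.normal(size=64)
voiceprints = {
    "alice": enroll([alice_voice + rng.normal(scale=0.2, size=64) for _ in range(3)]),
    "bob": enroll([bob_voice + rng.normal(scale=0.2, size=64) for _ in range(3)]),
}

# A new utterance from Alice should route to her schedule, not Bob's.
new_utterance = alice_voice + rng.normal(scale=0.2, size=64)
print(identify(new_utterance, voiceprints))  # -> "alice"
```

Once the speaker is identified, personalization is just a lookup: the same “what’s my day look like” query pulls a different calendar and commute report for each voiceprint.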

Behind these curtains of efficiency are engineers pulling the levers of machine learning. They’re deploying new techniques that train our software to understand, react to, and anticipate our needs. Companies like Alphabet, Facebook, and Amazon aren’t far from creating software that can understand the intent and context of language itself, and can even shield us from our less-than-rational fellow beings.

Take Jigsaw. In February, the Alphabet-owned technology incubator released an experimental program called Perspective. It uses machine-learning algorithms to sniff out abusive or trollish online comments, the kind that pollute digital conversations. After analyzing some 16 million moderated comments posted on The New York Times site, Jigsaw built a model that gauges which are likely to be toxic. It works well enough that the Times uses it to automatically approve some benign statements.
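
The underlying recipe is supervised text classification: start from comments that human moderators have already labeled, turn each one into features, and fit a model that outputs a toxicity probability. Only comments the model is confident are benign get auto-approved; everything else goes to a human. Here is a toy scikit-learn version; the feature choice, model, threshold, and four-comment dataset are a generic stand-in for illustration, not Jigsaw’s actual system:

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Stand-in for moderator-labeled comments (1 = rejected as toxic, 0 = approved).
comments = [
    "Great reporting, thank you for this piece.",
    "This analysis changed my mind on the issue.",
    "You are an idiot and everyone here knows it.",
    "Get lost, nobody wants trash like you commenting.",
]
labels = [0, 0, 1, 1]

# TF-IDF word features feeding a logistic-regression classifier.
# Regularization is relaxed (C=100) so the toy model commits on four examples.
model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression(C=100))
model.fit(comments, labels)

def triage(comment, auto_approve_below=0.2):
    """Auto-approve only when the model is confident a comment is benign."""
    p_toxic = model.predict_proba([comment])[0][1]
    return "auto-approve" if p_toxic < auto_approve_below else "send to moderator"

print(triage("Thank you, this was a thoughtful read."))
print(triage("You are a pathetic idiot."))
```

A Times-scale deployment differs in model class and training-set size, but the triage logic is the same: the software removes only the easy calls from the moderators’ queue.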

The growth of voice assistants and other hands-off computing interfaces won’t mean that all of a sudden our keyboards and smartphones (devices that require tactile input) are going to become obsolete. As Chris Harrison, who teaches human-computer interaction at Carnegie Mellon, says, handy objects have a way of lingering. “Everyone likes to think of the future as if it’s going to be this radically different thing—like we’ll all be walking around with AR headsets, and laptops and smartphones will be useless,” he says. “I don’t think that’s the case.” Technologies that are useful tend to persist, whether it’s a QWERTY keyboard or a flat-screen TV.

But even if we do use our own digits from time to time, today’s computing will eventually give way to Weiser’s intuitive calm. “A good tool,” he once said, “is an invisible tool.”

This was originally published in the November/December 2017 issue of Popular Science.