In 1961, Claude Shannon and Edward Thorp built the world’s first wearable computer. The cigarette-pack-size device tracked the speed of a roulette wheel and sent tones via radio to a gambler’s earpiece to help predict where the ball would land. The goal of wearable computers hasn’t changed much since. Like Shannon and Thorp’s system, Google Glass and other head-up displays (HUDs)—from companies including Vuzix and Epson—are intended to heighten a person’s awareness. But the latest discreet HUDs can do much more than augment our reality—they could also help us better ourselves.
One of the newest methods for spurring self-improvement is to turn every task into a game. So-called gamification apps keep score in real-life situations to promote certain behaviors—whether it's taking out the trash, going for a run, or complimenting someone. Users compete against other players and themselves—and it works. According to the company's internal research, a person who uses the Fitbit health-and-fitness tracker takes 43 percent more steps on average than a nonuser and loses an average of 13 pounds.
Yet the systems have problems. Users must be actively involved in the scoring process from start to finish—opt in, sometimes wear a device, and even manually input data into a smartphone app. Those tasks interrupt—even intrude upon—everyday life, turning the entire experience into a painstaking chore. What’s more, an app doesn’t know when you’re lying.
Wearing a computer instead of carrying one could eliminate all those downsides. The Google Glass prototype projects information—photos, e-mails, navigation cues—onto a screen positioned in front of one eye. The system will most likely have its own cellular radio, so it could work independently of a smartphone. Motion sensors, a video camera, and a GPS radio will allow developers to code apps that monitor a person's behavior in real time. An HUD implementation of the Google Goggles image-recognition software, for example, could keep track of what a person eats, reads, and buys. New apps could establish an instant feedback loop—perhaps a reward for skipping the morning doughnut in favor of a banana.
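The feedback loop described above is simple to picture in code. The sketch below is purely illustrative—it assumes a hypothetical app that receives item names from an image recognizer and awards points on the spot; it does not use any real Glass or Goggles API, and the item lists and point values are made up.

```python
# Hypothetical gamification feedback loop for an HUD app.
# "recognized_item" stands in for whatever an image-recognition
# service might report; all names and scores here are illustrative.

HEALTHY = {"banana", "apple", "salad"}
UNHEALTHY = {"doughnut", "soda", "fries"}

def feedback(recognized_item, score):
    """Return an updated score and a message to flash on the display."""
    item = recognized_item.lower()
    if item in HEALTHY:
        return score + 10, f"+10! Nice choice: {item}"
    if item in UNHEALTHY:
        return score - 5, f"-5: maybe skip the {item} next time"
    return score, ""  # unrecognized items leave the score unchanged

score = 0
for item in ["banana", "doughnut", "coffee"]:
    score, message = feedback(item, score)
# After the banana (+10) and the doughnut (-5), score is 5.
```

The point is the immediacy: because the device sees the choice as it happens, the reward or penalty arrives in the moment, with no manual data entry for the user to skip or fake.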
More-advanced tracking may eventually allow HUDs to predict and prevent bad behavior instead of merely recording it. Google, Apple, and Microsoft have already filed eye-tracking patents, which could be used to monitor what a person is looking at and help the HUD respond with positive and negative cues accordingly. So if the HUD sees a person’s pupils dilate when he’s passing a dive bar on the way to the gym, it could flash a prompt to “keep on walking.”