
Walk down the street, look at the world. This is reality. Now repeat, but wearing an odd-looking, bulky pair of glasses that place into your line of vision selective, relevant bits of data about the world; the data hovers in sight like virtual Post-it Notes, annotating your view. This is augmented reality. Glasses on, you glance to the right, at a vaguely familiar restaurant, and click a small button in your hand. Up pops text reminding you that Tom’s Restaurant was the model for the diner on “Seinfeld”; not only that, but — according to the glasses, at least — the Morningside salad is worth ordering.

When the technology for augmented reality (AR) is fully developed, the gear won’t amount to much more than glasses and some sort of small unit like a PDA. Right now, though, it consists of about 26 pounds of equipment that gets strapped to the back and to the head, along with a shoulder-perching, flying-saucer-shaped antenna. The Mobile Augmented Reality System (MARS), developed at Columbia University (not far from Tom’s Restaurant), has been assembled from off-the-shelf technology, including a 1GHz Dell laptop with a graphics accelerator chip and soap-bar-sized batteries to power the display glasses and the critical positioning and orientation technologies. Strap on this rig and you look like a robo-thief on the lam from CompUSA.

But if you do strap on this rig, as I have, you begin to understand the profound possibilities of an AR system, which can superimpose computer-generated text, graphics, 3-D animation, sound, or any other digitized data on the real world. Think of what digital detail can accomplish when it pops up at your beck and call, to identify faces, or buildings, or the parts of an engine being repaired, or the flight number of a plane in the air, or the schedule of a train in a station.

Already, AR is providing real-time battlefield data for soldiers and giving physicians the equivalent of X-ray vision during delicate operations. Data is power, and AR promises to be a powerful way to insert data into the seen world.

Much of this will have to wait until later in this decade: The MARS system I wore, the first to take AR outdoors, cannot be comfortably used for much more than a few minutes at a time, even if you don’t mind the gawking of passersby. And the data display needs to be more tightly synchronized with the wearer’s movements. But the principles of AR are well demonstrated, and better-working technology is on the way.

To begin, an AR system needs to know two things precisely: where you are located, and where you are looking. To accurately superimpose data in the field of view, the MARS system relies on two separate inputs: data from the differential Global Positioning System (GPS), which pins down the spot where you’re standing to within centimeters, and data from equipment that calculates the direction of vision to within a few degrees. For positioning, the system combines signals from several GPS satellites overhead — hence the flying-saucer antenna — with a correction signal from a reference station on Columbia’s engineering building. For orientation, an inertial/magnetic tracker rides on a headband above the AR glasses. This device combines miniature gyroscopes and accelerometers, which detect head movements, with an electronic compass, which establishes the direction of the viewer’s gaze in relation to Earth’s magnetic field.
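To make the geometry concrete, here is a minimal sketch in Python of the underlying calculation. It is not MARS code; the field of view, screen size, coordinates, and function names are invented for illustration. Given the wearer’s GPS fix and compass heading, it computes the bearing to a known landmark and turns the angular offset into a horizontal screen position for the annotation:

    import math

    # Hypothetical display parameters; MARS's real values differ.
    FOV_DEG = 40.0    # horizontal field of view of the glasses, in degrees
    SCREEN_W = 800    # display width, in pixels

    def bearing_deg(lat1, lon1, lat2, lon2):
        """Compass bearing from the viewer (lat1, lon1) to a landmark (lat2, lon2)."""
        phi1, phi2 = math.radians(lat1), math.radians(lat2)
        dlon = math.radians(lon2 - lon1)
        y = math.sin(dlon) * math.cos(phi2)
        x = (math.cos(phi1) * math.sin(phi2)
             - math.sin(phi1) * math.cos(phi2) * math.cos(dlon))
        return math.degrees(math.atan2(y, x)) % 360.0

    def label_x(viewer_lat, viewer_lon, heading_deg, lm_lat, lm_lon):
        """Horizontal pixel at which to draw a landmark's label, or None if off-screen."""
        offset = (bearing_deg(viewer_lat, viewer_lon, lm_lat, lm_lon)
                  - heading_deg + 180.0) % 360.0 - 180.0   # wrap to [-180, 180)
        if abs(offset) > FOV_DEG / 2:
            return None                  # landmark lies outside the field of view
        return SCREEN_W / 2 + offset / (FOV_DEG / 2) * (SCREEN_W / 2)

    # Viewer near Broadway and 112th Street, looking due east (heading 90),
    # with a landmark a block or so farther east (coordinates are illustrative).
    print(label_x(40.8049, -73.9665, 90.0, 40.8047, -73.9650))  # roughly 600: right of center

Anything outside the glasses’ field of view simply isn’t drawn; as the wearer turns, the heading changes and the label slides across the display.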

What’s the critical factor here? “Registration, registration, registration,” say AR researchers, echoing the old real estate mantra. The challenge is to accurately and continuously determine the line of sight and then align the graphics to it. Getting around the registration roadblock is less of a problem indoors, where tiny video cameras in a head-worn tracker can relatively easily read orientation and positioning bar codes or flashing infrared markers placed on a ceiling. Outdoors, however, the situation gets much dicier. Because the tracking system currently used is sensitive to sudden variations in magnetic fields, the alignment of graphics and a street scene can be thrown off by even a stray remnant of 19th-century technology like old iron trolley car tracks beneath the asphalt. Ultimately, resolving registration difficulties may require the addition of computer vision analysis systems, with sophisticated software that can recognize the video outlines of rooms or buildings and match them to stored 3-D computer models of the real world.
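A little arithmetic shows why registration is so unforgiving. The sketch below, in the same hedged spirit (the display numbers are assumptions, not MARS specifications), converts a compass error into misalignment both on the screen and out in the world:

    import math

    # Assumed display parameters, for illustration only.
    FOV_DEG = 40.0    # horizontal field of view, in degrees
    SCREEN_W = 800    # pixels across the display

    def misregistration(heading_error_deg, target_distance_m):
        """On-screen and real-world misalignment caused by a compass error."""
        # Pinhole-projection focal length of the display, in pixels.
        focal_px = (SCREEN_W / 2) / math.tan(math.radians(FOV_DEG / 2))
        err = math.tan(math.radians(heading_error_deg))
        return focal_px * err, target_distance_m * err

    # A 3-degree compass swing while annotating a storefront 50 meters away:
    px, m = misregistration(3.0, 50.0)
    print(f"{px:.0f} pixels, {m:.1f} meters")   # about 58 pixels, 2.6 meters

At 50 meters, a 3-degree swing, the kind a buried trolley track could induce, drags an annotation roughly 58 pixels sideways, about two and a half meters off the storefront it was meant to tag.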

Still, even if registration is solved, wearable computer technology will have to leap forward as well. During the past few years, more convenient brick-size wearable PCs have been marketed by a number of small firms. The most prominent is Xybernaut Corp., which is selling the U.S. version of Hitachi’s Wearable Internet Appliance, known as the Poma — the first wearable computer to be sold to businesses and consumers through office supply stores and electronics retailers. Essentially a Pocket PC with a color head-worn (single-eye) display, it bears the slim profile that researchers envision for the AR appliance of the future. But despite its tricked-out design, the device is only as powerful as a typical PDA and far too limited for stereoscopic, position-sensitive 3-D AR. The top AR researchers — Steven Feiner, the developer of MARS at Columbia, and his counterparts at the University of North Carolina, Georgia Tech, and the University of Washington, along with researchers at companies such as Sony and Siemens — estimate that an AR-capable wearable computer is at least two years away.

In the Workplace

The term “augmented reality” was coined at Boeing in 1990 by researcher Tom Caudell. He and a colleague, David Mizell, were asked to come up with an alternative to the expensive diagrams and marking devices then used to guide workers on the factory floor. They proposed replacing the large plywood boards, which contained individually designed wiring instructions for each plane, with a head-mounted apparatus that would display a plane’s specific schematics through high-tech eyewear and project them onto multipurpose, reusable boards. Instead of reconfiguring each plywood board manually at each step of the manufacturing process, the worker would essentially wear the customized wiring instructions, which could be altered quickly and efficiently through a computer system.

Soon after he suggested this plan, Caudell realized that he and Mizell were amplifying the breadth of information in the factory worker’s line of sight. “We were coming out of a meeting about the wire-bundling boards, and on the way to the bathroom I realized we were augmenting the user’s reality,” Caudell says. “We first went public with augmented reality in a paper published in 1992.”

Although Boeing higher-ups agreed to experiment with the new system, Caudell left the company soon after to pursue his interest in computer visualizations of complex scientific problems. Mizell continued developing several iterations of the wire-bundling AR system, including one that Boeing employees particularly liked in which the display lenses were attached to a headband with a hinge.

But when Boeing didn’t adopt the test system on the factory floor, Mizell left the company as well. Despite Boeing’s reluctance to install early AR systems, the company is still experimenting with the technology; along with IBM, it is among only a few major U.S. firms to do so.

On the Battlefield

For decades the military has been providing pilots, tank operators, and other fighters with advanced vision systems that overlay real-time combat information and computer-generated analysis on their view of the action. Extending these capabilities to the fighter on the ground, however, is proving to be a much harder problem for equipment designers. Since 1992, the Defense Advanced Research Projects Agency has supported research on head-mounted displays and other AR-enabling technologies. And seven years ago, the U.S. Army launched the Land Warrior Program, which aims to make wearable computers standard equipment. After significant delays in the program, Land Warrior brass now expect to field-test G.I.-wearable computer systems by 2003 and to equip all soldiers by 2008.

Almost from its beginnings, Feiner’s AR work at Columbia has been funded by grants from the Office of Naval Research (ONR). An outdoor position- and orientation-sensing AR system like MARS, shrunk down, could be a boon for future Marines in combat. With its own funding from ONR, a group of engineers at the Naval Research Laboratory in Washington, D.C., is leading an effort to replicate and advance Feiner’s work in a program known as the Battlefield Augmented Reality System (BARS).

“The war fighter of the future will have to work in an environment where there may be no signage, and enemy forces are all around,” says Lawrence Rosenblum, director of the Virtual Reality Lab at the Naval Research Laboratory (think of recent shots from bombed-out Afghan cities, and you get the picture). “Using augmented reality to empower dismounted war fighters and to coordinate information between them and their command centers could be crucial for survival.”

In the AR future, a small team of soldiers airlifted into a remote combat area will encounter terrain that has been mapped in advance. Soldiers won’t see just rocks, trees, and buildings; they’ll see annotated warnings: “buried mines” or “enemy stores arms in this building.” As surveillance reports flow into the command center, new graphics will be broadcast to the AR gear. A maneuver sketched with a stylus 1,000 miles away on a commander’s input tablet will appear in each soldier’s view of the war zone, adjusted for position.

At the Hospital

The first clinical medical experiment with augmented reality is being conducted at the University of North Carolina at Chapel Hill. Patients admitted for routine breast biopsies and possible lumpectomies are randomly assigned to the AR test. Instead of following the usual practice of glancing up at a sonogram screen and then back down at the patient, the radiologist sees the ultrasound images through AR headgear, projected directly onto the patient’s body, providing a sort of virtual X-ray vision throughout the procedure. Breast lumps and other possibly cancerous anomalies show up as ghostly white outlines against an uneven gray background. And the position- and orientation-sensing technology in the head-mounted display lets the radiologist “see” where to guide a biopsy needle with unprecedented precision. The hoped-for outcome of this AR application includes fewer complications and shorter recovery times for existing procedures, as well as the development of new surgical techniques.

For brief procedures such as biopsies and laparoscopic (minimally invasive) surgery, a head-mounted AR display offers an ideal solution for combining actual and computer worlds. But for longer operations or spot checks of lengthy procedures, a head-mounted device may prove less desirable than handheld display equipment or images that are projected directly onto the real world. “Head-mounted displays in their present state have serious limitations,” says George Stetten, an assistant professor of bioengineering at the University of Pittsburgh. “Your field of view is limited, the resolution is not as good as natural vision, and it can be cumbersome in the operating room.”

Stetten has developed a handheld ultrasound transducer that casts an image directly through the part of the patient’s body being examined, like a black-and-white beam from an X-ray flashlight. In this system, the overlaying of computer-generated sonograms on flesh and bones is all done with mirrors.
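The mirror geometry is simple enough to sketch. A flat mirror forms a virtual image of the display on the far side of the glass, mirror-symmetric to the real screen; if the display and the scanned slice sit at matching distances from the mirror, the sonogram appears to float exactly where the tissue is. Here is a minimal Python illustration of that reflection; the coordinates and names are invented, not taken from Stetten’s design:

    import numpy as np

    def reflect(point, plane_point, plane_normal):
        """Virtual-image position of `point` as seen in a flat mirror."""
        n = plane_normal / np.linalg.norm(plane_normal)
        d = np.dot(point - plane_point, n)   # signed distance to the mirror plane
        return point - 2 * d * n

    # Mirror tilted 45 degrees through the origin; a lit pixel 5 cm above it.
    mirror_pt = np.array([0.0, 0.0, 0.0])
    mirror_n = np.array([0.0, 1.0, -1.0])    # normal of the 45-degree mirror
    display_px = np.array([0.0, 5.0, 0.0])   # pixel on the display, in cm

    print(reflect(display_px, mirror_pt, mirror_n))
    # -> [0. 0. 5.]: the pixel's virtual image lies 5 cm beyond the mirror,
    # at the depth where the corresponding ultrasound echo originated.

Because the virtual image sits at the anatomy itself, the overlay stays aligned from any viewpoint, with no head tracking required.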

In Everyday Life

There is no shortage of wish-list applications for personal AR, whether handheld or head-mounted. Consider the home garage of the future, for instance. While fixing your car, you’ll no longer need to pull your head in and out from under the open hood to consult a bulky, greasy manual. With AR, you’ll simply slip on a tiny visor, and guided repair instructions will appear next to each under-the-hood part you gaze at: “Now that you’ve disconnected the radiator hose, move it to one side and unscrew the carburetor cap.” Or you can retrieve the same data and navigate through parts information and replacement sales sites on the Web merely by holding a PDA-size position-sensing screen in front of any section of the engine.

And when AR headgear does shrink down to the size of common glasses, it could be a must for up-and-coming managers, to avoid career or social gaffes at business meetings and cocktail parties. Everyone will be packing extra data in their spectacles. Each time you look at someone across a conference table or a crowded room, information about who they are and what their background is could appear before your eyes. Learning how not to make it obvious that you are “scanning” a person’s data will be a new business skill, like trying to look natural in front of a teleprompter. But that’s just the beginning. What if you could tap into the other person’s database to learn what they’re seeing about you? Or what if computer hackers could download misinformation into AR systems during crucial meetings? AR will no doubt meet industrial espionage as it finds a corporate home.

The social implications of living with AR are already being explored firsthand by so-called cyborgs — a few dozen people in North America who spend all their waking hours equipped with wearable computers. Some have been doing so since the mid-1990s, when the earliest borgs discovered one another as students at MIT.

Thad Starner, now an associate professor at Georgia Tech, was one of the first of these man/machine hybrids. He uses his self-designed wearable computer for assisted memory. In a meeting or conversation, Starner can tap the compact wired keyboard he always has in one hand to input requests for information from a hard drive or a wireless Internet connection hidden away in his network-wired vest, and see it displayed in a tiny square mounted in his prescription-lens glasses. Starner adopts pensive looks and purposeful shrugs to cue others that he’s accessing information and not just being inattentive.

If Starner uses AR to load his world with information, Steve Mann, another of the original MIT borgs and now a computer scientist at the University of Toronto, has a program that does the opposite. Although Mann’s wearable computer system provides reams of data when he asks for it, it can also block the world out with what Mann calls “diminished reality.” This AR software can replace billboards, street signs, and ads on buses with stored digital images of waterfalls and other natural scenes.

Researchers in augmented reality are realistic about its timeline. After I tried on the MARS gear, Columbia’s Feiner admitted it was too cumbersome for anything beyond a short test run. For now, the awkward display glasses, the constant need to focus crosshairs in a field of vision on a target or landmark, and the almost certain drift in registration make most of the augmentation an annoyance, not an asset. Still, without hesitation, Feiner asserts that it’s only a matter of time before augmented reality becomes part of our daily lives. Judging from the cellphones and palm-size organizers already filling our pockets, he may well be right when he predicts: “You’ll feel left out if you don’t have a wearable computer to enhance your experience of the world.”
