Pushing the limits of assistive technology during the Boston Marathon
Erich Manser
Erich Manser has run 17 marathons. Legally blind, he describes his vision as “almost like if you look through a drinking straw that you cover with wax paper” due to a condition called retinitis pigmentosa. AT&T

Erich Manser finished his eighth Boston Marathon on Monday, but this race was different from any before it: It was his first time completing the course with an assistive technology called Aira. Because Manser, 44, is legally blind due to a disease called retinitis pigmentosa, a sighted guide has run alongside him during past marathons. Yesterday, in addition to that guide, he also wore Google Glass, which sent a live video feed to Jessica Jakeway, who from over 600 miles away in Columbus, Ohio, coached him through the race via a Bluetooth headset.

There were some glitches, and the way the race unfolded exposes both the pitfalls and the potential of a technology designed to help people who are blind or visually impaired. It was an intense trial for a system not built for the grueling, mobbed environment of a marathon, but the rigor of the race could help push the company to improve.

“Coming into today, we were all fully aware that this was going to be an extreme test of the technology,” Manser, an accessibility consultant and tester at IBM in Cambridge, Mass., who has now run 17 marathons, said in a phone interview after the race. “This is not a technology that was developed to have a blind person run the Boston Marathon by themselves.” The race was intended as a way to push forward a platform meant to help visually impaired people do things like navigate a city street, go grocery shopping, or grab an Uber.

Aira was cofounded by Suman Kanuganti, the company’s CEO, who says the service currently has over 200 users, more than 100 of whom use it daily. Aira allows visually impaired subscribers to live stream video from smart glasses like Google Glass or from a smartphone camera; the service patches them into a live agent who shares the person’s viewpoint in real time and helps them with whatever they need.

Manser’s day did not start off smoothly; he couldn’t hear Jakeway’s voice at first. “There were some technical glitches, which we kind of expected coming in,” he said. The problem seemed to stem from his Bluetooth headset. Around four or five miles into the race, he and his physical guide pulled over, turned the headset off and on, and then it started working. It turned out that Jakeway had been able to hear Manser the whole time, but not the other way around.

There was some other technical “quirkiness,” Manser said. A few times, the video feed from Manser to Jakeway stopped working. The two had worked out a verbal shorthand to help in such situations. Jakeway used the term “blackout” to let Manser know she couldn’t see what his Google Glass was seeing—an indication to just get his cues from the sighted guide. And when he couldn’t hear her, Manser used the code “audible” to let her know. Part of the issue was simply the noise of all those people. “These Boston crowds were really tough to compete with for any headset,” he said. (The marathon took him just over five hours, but speed was not the goal of this particular race.)

Erich Manser running with David Wei. AT&T

Kevin Phelan, the vice president of communications for Aira, said in an email via an AT&T spokesperson that they have a “hardcore customer first mentality so our tech team is continually being pushed. For example, the audio issues around the crowd were likely due to a Bluetooth connection of the earphones Erich preferred to use. We’ve built an open platform so if he wants to use them, we’ll encourage him to use them.”

Nevertheless, the day was far from a failure. His guide, David Wei, also with IBM, ran on Manser’s left, while Jakeway used the video feed from the Google Glass (which has its camera on the right) to keep a closer eye on his right side. She said things to him like, “runner on the right,” or “water stop coming up,” Manser recalled. She also called out mile markers. “It was valuable information, certainly supplemental,” he said, lightheartedly referring to it as “color commentary.”

Back in Ohio, Jakeway monitored the race from her home. She could see the live video feed and Manser’s ever-changing position on a map, as well as stats like the temperature and the battery status of the Google Glass and the AT&T MiFi device they used to stream data.

Manser said that he’s excited to help evolve the technology, even if the race didn’t go perfectly on Monday. “The thing I find most exciting about it, is the test that we did today was not geared towards proving that Aira, as it stands right now, is exactly ready for prime time … and ready for a totally blind person to go out and run a marathon with 30,000 people all by themselves,” he said. “But it definitely is a step in that direction, which I find extremely exciting.”