Nereus is a unique submersible robot, with the ability to be tethered to a surface vessel and piloted directly, or set loose on autonomous missions. It’s powered by the same kind of lithium-ion battery cells found in laptops, and can drill, sample and explore deep sea environments for up to 12 hours at a stretch. Woods Hole Oceanographic Institution

A little more than two years ago, James Cameron plunged 6.8 miles downward, becoming the first explorer to plumb the depths of the Marianas Trench alone.

Correction: It was the first time that a human explored that underwater valley solo. At least three different remotely operated vehicles (ROVs) beat the legendary filmmaker to the punch, diving into that trench years before his one-man submarine. The most advanced of those machines, Nereus, is on another deep-sea mission right now, part of a 40-day expedition to observe life in the 6.24-mile-deep Kermadec Trench near New Zealand. It’s part of a larger project, called HADES (HADal Ecosystem Studies), which hopes to complete the first systematic survey of the creatures that live in the hadal zone, or depths between 3.7 and 6.8 miles. The mission got underway on April 12th, and should soon begin providing live video streams of what Nereus sees, including potential discoveries of unrecorded organisms.

That last detail is important. If you’re the curious, science-inclined sort, Nereus will act as your underwater surrogate. You won’t have to wait to hear about its triumphs—you can watch them live. And while you’re waiting for the HADES feeds to start, you can tag along remotely with another, separate deep-sea expedition underway in the Gulf of Mexico, where the NOAA research vessel Okeanos Explorer is currently sending its own live-streaming ROV to investigate the seabed. At some point, you might be able to toggle between multiple feeds from two of the most inhospitable environments on Earth, and marvel at the weird and undiscovered beasts that live where humans should absolutely fear to tread.

Welcome to the age of remote exploration. This is not to be confused with the age of robotic exploration—machines have been probing unsafe underwater reaches, as well as even more hazardous environments in space, for decades. Traditionally, those systems have been as stingy with their findings as manned explorers, requiring intermediaries to process or disseminate the results. Whether it was an astronaut coming up with florid, scene-setting descriptions of standing on the moon, or astrophysicists putting Voyager’s signals into a cosmic context, the fruits of all that adventure were inherently disconnected from our own experience. Exploration was a vicarious thrill.

Now, it can be a voyeuristic one. In a happy confluence of technical progress and public outreach, casual viewers can see what robots are seeing.

Sometimes, unfiltered, real-time exploration is a drag. The Okeanos Explorer’s feeds can be the oceanic equivalent of watching C-SPAN, as nothing but debris and barren seafloor drift past the machine’s high-resolution camera. But for a 10-minute stretch this past Friday, the ROV stream was riveting.

There was a brief encounter with a snailfish (during which one narrating researcher chided biologist and science team co-lead Stephanie Farrington about her poor Latin pronunciation), another with a cutthroat eel, and then a slow-speed pursuit of a bright red shrimp. Why would anything be that red, at 2176 meters down, where its garish hue would appear jet black to anything not equipped with a floodlight? The researchers debated, disagreed, and were eventually schooled by an off-mic colleague—bright colors aren’t uncommon among deep-ocean creatures, but patterns, which require highly capable eyes and an abundance of light to perceive, are unheard of.

The Deep Discoverer ROV (piloted from the Okeanos Explorer research vessel) spotted this unidentified jellyfish-like ctenophore on April 16 in the Gulf of Mexico. Image courtesy of NOAA Okeanos Explorer Program

I didn’t witness history in the making, but that’s a lot to expect from a single lunch break. And even with the feed running in the background throughout the afternoon, with the audio occasionally dragging me back to the corresponding browser tab, I learned that cetaceans (or certain species of whales, really) will sometimes rest on the seabed, nestling in and creating distinctive patterns with their flukes. Later, as the camera zoomed in tight on a fluttering, undulating holothurian, or sea cucumber, the science team narrated the creature’s efficient feeding process. The dark maroon invertebrate was apparently expelling waste even while it ate, sucking at unseen nutrients as it clambered across the seafloor. When it suddenly lifted off, the biologist thanked the robot’s pilots for the rare, close-up view. I’d like to take this opportunity to thank them, too, and everyone involved with the Okeanos Explorer’s current mission. This sort of unpackaged, unfiltered remote ride-along, complete with smart people comfortably chattering away and confirming guesses and observations on the fly, is what brings field research to life.

It’s possible that the footage of the Marianas Trench collected by James Cameron will be equally thrilling, if not more so. We should find out once his 3D documentary, Deepsea Challenge, comes out (it’s slated for this year, though a release date has yet to be announced). And there’s nothing preventing manned missions, both to the sea and to destinations in space, from having similar telepresence elements. The lion’s share of currently planned exploration, however, does not involve humans sitting in cockpits. While NASA is considering a single manned mission to an asteroid, the agency plans to launch a wide range of robotic probes in the coming decades, including a potential visit to the Jupiter moon Europa. In the ocean, the balance is tipped even more strongly towards robots. According to Brian Midson, NSF’s program director for the Nereus ROV as well as the manned Alvin mini-sub, funding for new deep sea research vehicles is focused almost entirely on unmanned systems.

Probing the Ice, Talking with Light

That’s bad news for those who love hearing about someone else’s heroic flirtation with danger, and great news for the selfish armchair explorers among us. More total robots in the water, along with increasingly advanced models, translates to more opportunities to remotely join the action. Nereus, for example, has been operational for more than six years, but is one of the most sophisticated of the current generation of underwater bots. It’s referred to as a hybrid ROV, because of its ability to be directly controlled by human pilots via its fiber-optic tether, or sent on fully autonomous missions. And unlike the Okeanos Explorer’s ROV, which can reach 6000 meters down, the battery-operated Nereus has traveled as deep as 35,000 feet, and has 25 miles’ worth of lightweight cable to work with, giving it a massive operational radius.

And like its mythological namesake, Nereus has offspring. The bot’s makers at the Woods Hole Oceanographic Institution in Massachusetts are building a series of variants, called Nereids (the daughters, in Greek myth, of the sea god Nereus). The first offshoot, Nereid UI, or Under Ice, is designed to search below polar ice, traveling up to 12 miles away from its control vessel. It’s more durable and less breakdown-prone than its predecessor, because of the difficulty of recovering a machine that’s under the ice. “If it drops ballast weight and tries to come back to the surface to wait for a pickup, guess what? It’s trapped forever in a world where there is no rescue,” says Andy Bowen, principal engineer at Woods Hole. Nereid UI’s backup plan is more self-reliant—its design and build tolerances are closer to something traveling to space. If it does fail, the robot will go autonomous, and try to find its way home. Though fully automated submersibles have been used in under-ice missions, they’re limited to what Bowen calls “lawn-mowing,” or broad, low-detail surveys of large areas. Nereid UI would be the first system able to closely inspect and sample specific sections of ice and organisms, a capability crucial to understanding the effects of climate change, and the speed of degradation in ice sheets.

Nereid UI is scheduled to deploy in the Arctic this summer. Whether it provides telepresence to viewers at home remains to be seen, but the raw capability is there, based on the fiber optics embedded in its tether. And by 2016, another Nereid model should be able to send video back to its control vessel without any physical cables.

The working, unofficial name for that system is Nereid 11K (a reference to its target depth of 11,000 meters), and Bowen’s team at Woods Hole is currently developing it for the Schmidt Ocean Institute, a non-profit research organization founded by Google chairman Eric Schmidt and his wife, Wendy Schmidt. The institute has used Nereus in the past, but now wants its own unmanned explorer, which will also function as a testbed for a potential breakthrough in underwater communications—an optical modem, which transmits data through the water using flashing lights. While traditional radio signals are useless at nearly any depth, and acoustic signals are limited in range and bandwidth, the researchers at Woods Hole have transferred data at rates between 1 and 20 megabits per second from as far as 150 meters away, by setting a photomultiplier (an extremely sensitive light sensor) to watch for patterns playing across an array of LEDs. By comparison, the average internet transfer speed in the U.S. is 18.2 megabits per second. Nereid 11K could conceivably match that rate, relaying high-quality video through what amounts to photonic flag semaphore.
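To make the flag-semaphore analogy concrete, here is a minimal toy sketch of that kind of optical link, written in Python: bytes are flattened into on/off light pulses, and the receiver recovers them by thresholding brightness samples. The pulse format, noise model, and threshold here are invented purely for illustration; the encoding, hardware, and signal processing in the Woods Hole modem are far more sophisticated.

import random

def encode_ook(payload: bytes) -> list[int]:
    """Turn bytes into a flat list of on/off (1/0) light pulses, one per bit."""
    bits = []
    for byte in payload:
        for i in range(7, -1, -1):  # most significant bit first
            bits.append((byte >> i) & 1)
    return bits

def transmit(bits: list[int], noise: float = 0.05) -> list[float]:
    """Simulate the water column: each pulse arrives as a slightly noisy brightness sample."""
    return [b + random.gauss(0, noise) for b in bits]

def decode_ook(samples: list[float], threshold: float = 0.5) -> bytes:
    """Threshold brightness samples back into bits, then pack the bits into bytes."""
    bits = [1 if s > threshold else 0 for s in samples]
    out = bytearray()
    for i in range(0, len(bits) - 7, 8):
        byte = 0
        for bit in bits[i:i + 8]:
            byte = (byte << 1) | bit
        out.append(byte)
    return bytes(out)

if __name__ == "__main__":
    message = b"NEREID"
    print(decode_ook(transmit(encode_ook(message))))  # b'NEREID', barring an unlucky noise spike

A real link adds framing, error correction, and pulse rates millions of times higher, but the principle is the same: light on, light off, read again at the far end.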

Though 150 meters might be a paltry distance, at least in the context of deep-sea operations, optical modems could be chained together using multiple vehicles or underwater buoys, or transitioned away from robotic systems entirely, allowing divers to communicate with one another. Untethered autonomous subs could also share their footage and other data when they’re in optical range, as well as receive updated orders, before heading back to work. And while manned submersibles could theoretically field-test the optical modem—just as they could theoretically be providing live video streams—it’s the robotic Nereid 11K that will actually pioneer the technology. Unmanned explorers are not only racking up more mission hours, and giving us more opportunities to tag along remotely, but also driving more of the innovation.

If it isn’t already obvious, I’m not speaking for the deep sea research community, or for Popular Science, in applauding the rise of robotic exploration, and the relative decline of manned missions. Andy Bowen and Brian Midson, who have shepherded the development and deployment of Nereus at Woods Hole and NSF, respectively, are also overseers of deep sea research using human-occupied vehicles, and believers in the value of sending eyeballs underwater. “It’s clear that a human is still by far the best sensor for understanding unknown, unstructured environments,” says Bowen. Midson is more emphatic. “Why bother going to the Grand Canyon, when you can watch it on TV?” he asks (rhetorically).

For the sake of science, it’s important to send intrepid humans on the occasional dive. People can often see what even the highest-resolution cameras can’t, or make sense of complex geological puzzles. But unmanned systems, and the telepresence that they can provide, are redefining exploration, closing the virtual distance between the explorer and observer.

I’m not arguing for a robotic monopoly on adventure, or a species-wide slink into a less vibrant version of the Matrix, as we watch machines do all the fun stuff. But if we’re talking about the pure thrill of discovery, this is a time for celebration, when we can stop relying on descriptions and retellings of someone’s first journey to a lethal environment, and watch it happen for ourselves. The Grand Canyon is obviously more breathtaking in person than on TV. But what if the Grand Canyon sat below miles of water, at lightless, frigid, bone-crushing depths where only a handful of researchers—or an immensely wealthy filmmaker—could ever expect to go? Would you be content to hear about how hard that submersible worked to keep those rarefied humans alive during their life-changing experiences? Or would you want a robot to go, and deliver some version of those wonders to your screen?
