Fish sounds tell us about underwater reefs—but we need better tech to really listen

Soundscape ecology is a non-invasive method for monitoring ecosystem diversity and health, but the process behind it is still very time-consuming.
Reefs are alive with music, and scientists are trying to decipher the score. Valkyrie Pierce / Unsplash


When a reef fish is born, it’s immediately swept out into the open ocean where life is full of uncertainties. But if it makes it out alive and matures from a larva to a juvenile, the reef always calls it back to continue the breeding cycle. 

There are many ways fish scope out a reef to return to. These underwater communities are very noisy places full of vocal marine animals and invertebrates, and given that sound travels far underwater, small fishes are able to tune in to the traffic reports to judge a reef’s quality. Environments that sound good also tend to attract more new animals. 

Healthy reefs have a rich soundscape. “Lots of the invertebrates and the fishes make weird and wonderful noises for all sorts of weird and wonderful reasons. When you take recordings, you hear all these pops and buzzes and trills and whoops,” says Tim Lamont, a marine biology research fellow at the University of Exeter. A degraded reef, on the other hand, has less life milling about and is therefore much quieter. “If you’re in the business of ecosystem restoration, being able to create coral reefs that sound good is a really good thing to be,” Lamont explains. 

Lamont and his colleagues have been interested in studying the relationship between sounds and underwater life. But it’s by no means an easy process. Oftentimes, it requires a human touch to clean up the background noise in the recordings, annotate the audio, and mark all the distinct sounds. And while there have been some attempts to automate this task, many technical limits remain. 

This became evident in a recent collaboration with Mars Inc., which reached out and asked the researchers to use soundscape ecology to monitor the progress of its reef restorations. (The chocolate and pet food corporation has been collaborating with scientists and local communities to rehabilitate damaged coral habitats around the world as part of its larger effort to offset some of its negative environmental impacts. Mars also partly funded this recent soundscape project.) The resulting study, published this week in the Journal of Applied Ecology, found that, by the sound of it, Indonesian reefs damaged by blast fishing recovered nicely after restoration efforts. 

A new window into ecosystem health

Coastal communities rely on coral reefs for food and more. When these structures fall apart due to human activities like fishing with dynamite, the loss can be devastating for the people who depend on them for a livelihood. Since these reefs have slow natural recovery rates, restoring the coral, which forms the foundation of these ecosystems, could bring back fish and other marine life. But it’s not always so easy to judge whether habitat restorations will take or not. “It’s a different thing to garden a couple of corals than it is to bring back an entire ecosystem,” says Lamont. Outside of checking whether coral regrows, ecologists have to test if the new reefs can support marine life, dampen wave energy, control carbonate budgets, and provide sustenance for coastal communities. 

That’s where soundscape analyses come in. They’re a promising indicator of overall ecosystem diversity because they can detect more critters than images and visual observation; for instance, biologists can hear fishes that are otherwise hidden or are well-camouflaged. Plus, with sound, experts can monitor the habitats around the clock. “There are different things that you can measure about the soundscape,” Lamont says. “You can measure how complex it is, how loud it is, how variable it is through time, or how variable it is at different pitches through frequency bands.”
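For a rough sense of what those measurements look like, here is a minimal sketch (in Python, and not the team’s actual pipeline) that computes a clip’s overall loudness, how much its energy fluctuates through time, and how that energy splits across frequency bands. The file name is a placeholder, and the 2 kHz band split is purely illustrative.

```python
# A minimal sketch of basic soundscape measurements: loudness, variability
# over time, and energy split across frequency bands. Assumes a mono
# recording; "reef_clip.wav" is a hypothetical file name.
import numpy as np
from scipy.io import wavfile
from scipy.signal import spectrogram

rate, audio = wavfile.read("reef_clip.wav")
audio = audio.astype(np.float64)

# Overall loudness: root-mean-square amplitude of the whole clip.
rms = np.sqrt(np.mean(audio ** 2))

# Spectrogram: sound power in time-frequency cells (1-second windows).
freqs, times, power = spectrogram(audio, fs=rate, nperseg=rate)

# Variability through time: how much total power swings window to window.
power_per_window = power.sum(axis=0)
temporal_variability = np.std(power_per_window) / np.mean(power_per_window)

# Variability across pitch: energy below vs. above 2 kHz (an illustrative
# split; many fish calls sit toward the lower end).
low_band = power[freqs < 2000].sum()
high_band = power[freqs >= 2000].sum()

print(f"RMS loudness: {rms:.1f}")
print(f"Temporal variability (CV): {temporal_variability:.2f}")
print(f"Low/high band energy ratio: {low_band / high_band:.2f}")
```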

[Related: What underwater sounds can tell us about the state of coral reefs]

To collect all the necessary data, the team planted underwater microphones, or hydrophones, all around degraded, healthy, and recovered reefs. They recorded what the reefs sounded like at dawn, dusk, midnight, mid-day, during the full moon, and during the new moon for two years. “We wanted to build up a really good picture,” Lamont says. They found that although the restored reefs “didn’t sound identical to healthy reefs, they sounded really similar” and “very different to the degraded reefs.” The planted corals were filled with marine chatter, telling biologists that many critters approved of the construction. 

Sorting and measuring the sounds

Then came the unglamorous part for Lamont’s team: sitting in the sound room and picking apart the layers of knocks, purrs, croaks, growls, and whoops. It’s akin to untangling the individual instruments from a complex orchestral arrangement. 

“This is a very time-consuming science. With this study, I’ve spent months and months listening meticulously through all of these recordings with my headphones,” says Lamont. “It’s pretty mind-numbing at times.”

Lamont et al., Journal of Applied Ecology

The team is now trying to automate the process by “getting the computer to do the same job,” Lamont says. But it’s a tough assignment to delegate to machines. Because reef ecosystems are so busy, there’s a lot of background sound that can skew the analysis toward rowdier wildlife. “When you listen to these recordings, you hear a lot of the noise of invertebrates [like] snapping shrimp,” which sounds like crackling static or frying bacon, Lamont says. In fact, these sounds are so loud and prevalent that during World War II, militaries used to hide submarines near coral reefs because the shrimp effectively masked the sound of the submarines. 
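One common workaround, sketched below in Python, is to filter recordings down to the low frequencies where most fish calls sit, since much of the shrimp crackle’s energy lies higher up. This is an illustrative example rather than the team’s method; the 2 kHz cutoff and file names are assumptions.

```python
# Illustrative low-pass filter to tame the "frying bacon" shrimp crackle
# before analysis. The 2 kHz cutoff and file names are placeholders.
import numpy as np
from scipy.io import wavfile
from scipy.signal import butter, sosfiltfilt

rate, audio = wavfile.read("reef_clip.wav")
audio = audio.astype(np.float64)

# 4th-order Butterworth low-pass, applied forward and backward so the
# filtering adds no phase shift.
sos = butter(4, 2000, btype="lowpass", fs=rate, output="sos")
fish_band = sosfiltfilt(sos, audio)

# Save the filtered clip for listening or further analysis.
wavfile.write("reef_clip_fishband.wav", rate, fish_band.astype(np.int16))
```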

Existing computational techniques used to measure soundscapes, also referred to as acoustic indices, are mostly designed for use in terrestrial habitats—like in forests for listening to birds or bats. Still, Lamont sees a lot of similarities between forests and reef communities. For one, different animals seem to be active during certain pockets of time, which can help scientists sort the library of sounds by space, time, and frequency. “We’ve made some attempt to take these indices and apply them underwater,” Lamont says. “But of course, sometimes there are fundamental differences between the types of soundscape you get in different habitats, so they may not work as well.”
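One widely used terrestrial index is the Acoustic Complexity Index, which scores how much the intensity in each frequency band jumps from one moment to the next; busy, varied soundscapes tend to score higher than steady background hum. The Python sketch below is a bare-bones version under that definition, not the study’s implementation, and the file name and window length are placeholders.

```python
# Bare-bones Acoustic Complexity Index (ACI): per frequency bin, sum the
# frame-to-frame intensity changes and divide by that bin's total
# intensity, then sum across bins. Not the study's implementation.
import numpy as np
from scipy.io import wavfile
from scipy.signal import spectrogram

rate, audio = wavfile.read("reef_clip.wav")           # hypothetical file
freqs, times, power = spectrogram(
    audio.astype(np.float64), fs=rate, nperseg=1024   # placeholder window
)

frame_diffs = np.abs(np.diff(power, axis=1)).sum(axis=1)  # change per bin
bin_totals = np.maximum(power.sum(axis=1), 1e-12)         # avoid divide-by-zero
aci = (frame_diffs / bin_totals).sum()

print(f"Acoustic Complexity Index: {aci:.2f}")
```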

[Related: Birders behold: Cornell’s Merlin app is now a one-stop shop for bird identification]

Scientists who study forest soundscapes have come up with creative workarounds for building machine learning algorithms that don’t require setting up recording stations among the trees. Lamont points to a PNAS paper published last July in which researchers from Imperial College London, the University of Sydney, and Cornell University used Google’s AudioSet data to train an algorithm to recognize the sound distinctions between individual forests. 

The Google sound compilation, which consisted of a mix of human speech, music, and machine noise, first taught the algorithm to tell different types of sound apart. When the system was then applied to forest recordings, it was able to classify the categories of sounds it picked up. The authors wrote that it could one day be used to detect irregular activities like illegal logging and hunting.

Lamont imagines that he could possibly repurpose a similar algorithm to sort through the mountains of ocean sounds he has to analyze. 
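Conceptually, repurposing that approach for reef audio might look like the sketch below: an AudioSet-trained model (YAMNet here, a publicly available example and not necessarily what the PNAS authors or Lamont would use) turns each clip into a general-purpose embedding, and a small classifier then learns to separate healthy-sounding reefs from degraded ones. The file names and labels are hypothetical.

```python
# Hedged sketch of transfer learning from AudioSet: extract embeddings with
# YAMNet (a publicly available AudioSet-trained model), then fit a simple
# classifier on a handful of labeled reef clips. File names and labels are
# hypothetical placeholders.
import numpy as np
import tensorflow_hub as hub
from scipy.io import wavfile
from sklearn.linear_model import LogisticRegression

yamnet = hub.load("https://tfhub.dev/google/yamnet/1")

def embed(path):
    """Average YAMNet's frame-level embeddings into one vector per clip."""
    rate, audio = wavfile.read(path)                 # expects 16 kHz, 16-bit mono
    waveform = audio.astype(np.float32) / 32768.0    # scale int16 to [-1, 1]
    _scores, embeddings, _spectrogram = yamnet(waveform)
    return embeddings.numpy().mean(axis=0)

# Hypothetical labeled clips: 1 = healthy or restored reef, 0 = degraded reef.
clips = ["healthy_01.wav", "healthy_02.wav", "degraded_01.wav", "degraded_02.wav"]
labels = [1, 1, 0, 0]

features = np.stack([embed(clip) for clip in clips])
classifier = LogisticRegression(max_iter=1000).fit(features, labels)

# Classify a new, unlabeled recording.
print(classifier.predict([embed("unknown_reef.wav")]))
```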

The need for cheaper hardware

The other, more classic way to power a useful machine learning algorithm for undersea soundscapes is to collect heaps of data. There are ongoing efforts around the world to build marine soundsets, but the cost of equipment can be a limitation. A high-quality hydrophone with a large memory card, for example, can cost around $3,000.

“They’re often really expensive, which is a bit of an issue if we want more than just well-funded scientists to be able to do this,” says Lamont. In October, he and his colleagues published a study in Ecological Indicators that found audio from GoPro cameras (which cost about $500) was in many cases comparable in quality to the data they got with a hydrophone. 

[Related: These free-floating robots can monitor the health of our oceans]

“Trialing these lower-cost recordings opens the door for many more people to be involved,” he says. “That would then allow us to collect much more data that would feed into these automated analysis techniques in such a way that we can get useful information [with] low effort.” 

Already, ocean-monitoring technology has come a long way. About a decade ago, hydrophones ran on tape reels and had to be attached to a cable that hung off the side of a boat, where a non-waterproof recording station was housed. Now, they’re completely wireless and can be left on the seabed for weeks or months at a time, until the data needs to be collected. 

“Our ability to take long-term, high-quality recordings underwater is quite a new thing,” Lamont notes. “And that’s partly why we keep discovering all these new things no one’s ever recorded before.” He recalls coming across completely uncanny reef noises as he listened back to the recordings from Indonesia, including the grunt of a soldierfish, the scrape of a parrotfish, and the whoop of a damselfish. But there were some sounds, like a “laugh” that usually occurred at sunrise, that he couldn’t quite pin to a specific species. 

“There’s this fun element of mystery around that,” Lamont says. “There are exciting prospects on the horizon for this field, given how young it is.”