AI can track the health of coral reefs through their ‘song’, but what does it sound like?

A hydrophone recording the sounds of a coral reef in Sulawesi, Indonesia. - Copyright Tim Lamont, University of Exeter
By Nicole Chang

Scientists in the UK have trained an artificial intelligence (AI) system to track the health of coral reefs - all through the power of “song”.

Coral reef soundscapes are complex and diverse, with fish and other creatures contributing to a wide variety of noises that can serve as a way to monitor how healthy a particular reef is.

However, the process of analysing these soundscapes can be laborious and time-consuming, and this is where AI can make a difference.

As part of a new study, researchers from the University of Exeter exposed a computer algorithm to recordings of both healthy and degraded reefs, training the machine to differentiate between them.

The system then analysed new recordings, and managed to correctly identify reef health 92 per cent of the time, the team said.
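The study's own pipeline isn't reproduced here, but the basic idea of training a classifier to separate healthy from degraded reef recordings can be sketched in a few lines. Everything below is an illustrative stand-in: the synthetic "recordings", the two hand-picked acoustic features (loudness and low-frequency energy share), and the logistic-regression model are assumptions for the sketch, not the features or model the Exeter team used.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
SR = 8000  # sample rate in Hz (assumption for this sketch)

def synth_reef(healthy, n=SR):
    """Crude stand-in for a 1-second reef recording. A 'healthy' clip
    gets shrimp-like broadband snaps plus a low-frequency fish-call
    tone; a 'degraded' clip is faint background noise only."""
    x = 0.05 * rng.standard_normal(n)
    if healthy:
        idx = rng.integers(0, n, 60)          # snapping-shrimp impulses
        x[idx] += rng.uniform(0.5, 1.0, 60)
        t = np.arange(n) / SR                 # 300 Hz fish-call-like tone
        x += 0.2 * np.sin(2 * np.pi * 300 * t)
    return x

def features(x):
    """Two simple acoustic summaries: RMS loudness and the share of
    spectral energy below 1 kHz (where many fish calls sit)."""
    spec = np.abs(np.fft.rfft(x)) ** 2
    freqs = np.fft.rfftfreq(len(x), 1 / SR)
    low_share = spec[freqs < 1000].sum() / spec.sum()
    return [np.sqrt(np.mean(x ** 2)), low_share]

# Build a labelled dataset of 100 'healthy' and 100 'degraded' clips.
X = np.array([features(synth_reef(h)) for h in [True, False] * 100])
y = np.array([1, 0] * 100)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
clf = LogisticRegression().fit(X_tr, y_tr)
print(f"held-out accuracy: {clf.score(X_te, y_te):.2f}")
```

On real data the features would come from actual hydrophone audio and the classes would overlap far more, which is why the reported 92 per cent figure, rather than the near-perfect separation this toy setup produces, is the meaningful benchmark.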

You might not think it just by looking at them, but coral reefs are actually “really noisy places”, Ben Williams, the study’s lead author, told Euronews Next.

On a thriving reef, you can hear snapping shrimp that sound like “the crackling of a campfire in the background”, he said.

“And then intermittently there's all these kinds of noises from the different fish, which could be like whoops, grunts and knocks, all kinds of things you wouldn't expect to come from a fish”.

However, on a degraded reef, the soundscape can be “much more desolate”, Williams said.

The “added complexity” of fish sounds - fish communicating, feeding, defending themselves and so on - is very often missing.

Using AI to save coral reefs

Tracking the health of a coral reef through its soundscapes is an easy way to learn about the state of its habitat, without having to use visual methods such as sending down expert divers.

“We can just drop a hydrophone in the water, leave it for weeks or months, and we get this really easy to collect long-term dataset,” said Williams.

Analysing all this data is another matter.

“We have to listen to these and just count recordings of fish that we hear, which takes ages and it's really tricky,” he said.

But there too, AI can help automate the process - allowing recordings to be analysed much faster, and much more accurately, he explained.
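The workflow Williams describes, leaving a hydrophone down for weeks and then scoring the recording automatically instead of listening through it by hand, can be illustrated with a minimal windowing loop. The 5-second window, the loudness threshold, and the "active"/"quiet" labels are all placeholder assumptions; in practice each window would be passed to the trained classifier rather than a simple loudness rule.

```python
import numpy as np

SR = 8000          # sample rate in Hz (assumption for this sketch)
WINDOW = 5 * SR    # score the recording in 5-second chunks

def rms(x):
    """Root-mean-square loudness of a chunk of samples."""
    return float(np.sqrt(np.mean(x ** 2)))

def score_windows(recording, threshold=0.1):
    """Split a long hydrophone recording into fixed windows and flag
    each one with a placeholder loudness rule, standing in for the
    real per-window classifier."""
    n_windows = len(recording) // WINDOW
    labels = []
    for i in range(n_windows):
        chunk = recording[i * WINDOW:(i + 1) * WINDOW]
        labels.append("active" if rms(chunk) > threshold else "quiet")
    return labels

# A fake 20-second deployment: 10 quiet seconds, then 10 louder ones.
rec = np.concatenate([
    0.02 * np.random.default_rng(1).standard_normal(10 * SR),
    0.30 * np.random.default_rng(2).standard_normal(10 * SR),
])
print(score_windows(rec))  # → ['quiet', 'quiet', 'active', 'active']
```

The payoff is the one Williams points to: a months-long recording reduces to a timeline of per-window labels in seconds, with humans only needed to spot-check the results.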

“So it's a double win in that regard”.

Such technology could hopefully contribute to the fight to preserve the world’s remaining coral reefs, which are vital indicators of environmental change, and are also particularly vulnerable to such change.

About 25 to 50 per cent of the world's coral reefs have been destroyed, and another 60 per cent are under threat, according to the United Nations Environment Programme (UNEP).

These reefs are vital sources of food and income, and also protect the shorelines of low-lying island nations.

Around 850 million people live within 100 km of a coral reef and derive some economic benefit from its ecosystem services, according to UNEP.

The recordings used in the University of Exeter study were taken at the Mars Coral Reef Restoration Project, which restores heavily damaged reefs in Indonesia.

In the future, Williams says the team’s work could be extended to sites all around the globe to aid in other restoration projects.

“We now want to send recorders out around the world: to the Maldives, to the Great Barrier Reef, to Mexico, to loads of different sites where we've got partners who can collect similar data”.