Artificial intelligence tool learns “song of the reef” to determine ecosystem health

Coral reefs are among Earth’s most stunning and biodiverse ecosystems. Yet human-induced climate change is warming the oceans, and growing numbers of these living habitats are dying.

The urgency of the crisis facing coral reefs around the world was highlighted in a recent study that showed that 91% of Australia’s Great Barrier Reef had experienced coral bleaching in the summer of 2021–22 due to heat stress from rising water temperatures.

Determining reef health is key to gauging the extent of the problem and developing ways of intervening to save these ecosystems, and a new artificial intelligence (AI) tool has been developed to measure reef health using… sound.

Researchers in the UK are using AI to study the soundscape of Indonesian reefs and determine the health of these ecosystems. The results, published in Ecological Indicators, show that the AI tool could learn the “song of the reef” and determine reef health with 92% accuracy.

The findings are being used to track the progress of reef restoration.


“Coral reefs are facing multiple threats, including climate change, so monitoring their health and the success of conservation projects is vital,” says lead author Ben Williams of the UK’s University of Exeter.

“One major difficulty is that visual and acoustic surveys of reefs usually rely on labour-intensive methods. Visual surveys are also limited by the fact that many reef creatures conceal themselves, or are active at night, while the complexity of reef sounds has made it difficult to identify reef health using individual recordings.

“Our approach to that problem was to use machine learning – to see whether a computer could learn the song of the reef. Our findings show that a computer can pick up patterns that are undetectable to the human ear. It can tell us faster, and more accurately, how the reef is doing.”

Fish and other creatures make a variety of sounds in coral reefs. While the meaning of many of these calls remains a mystery, the new machine-learning algorithm can distinguish overall between healthy and unhealthy reefs.
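The paper behind the study does not spell out its full pipeline here, but the general idea of classifying reef recordings by their acoustic signature can be sketched in a few lines. The following toy example is purely illustrative: the spectral features, the frequency bands, the nearest-centroid classifier, and the synthetic “recordings” are all assumptions made for demonstration, not the study’s actual method.

```python
import numpy as np

def spectral_features(audio, sr=16000):
    """Summarise a mono clip with two simple spectral statistics.

    These stand in for the richer acoustic indices a real study
    would use; the 2 kHz band split is illustrative only.
    """
    spectrum = np.abs(np.fft.rfft(audio))
    freqs = np.fft.rfftfreq(len(audio), d=1.0 / sr)
    centroid = np.sum(freqs * spectrum) / np.sum(spectrum)  # "centre of mass" of the spectrum
    low = spectrum[freqs < 2000].sum()
    high = spectrum[freqs >= 2000].sum()
    return np.array([centroid, low / (low + high)])

# Toy data: "healthy" reefs as broadband crackle, "degraded" as a quiet low hum
rng = np.random.default_rng(0)
sr = 16000
t = np.linspace(0, 1, sr, endpoint=False)
healthy = [rng.normal(0, 1, sr) for _ in range(5)]
degraded = [np.sin(2 * np.pi * 400 * t) + rng.normal(0, 0.1, sr) for _ in range(5)]

X = np.array([spectral_features(a, sr) for a in healthy + degraded])
y = np.array([1] * 5 + [0] * 5)  # 1 = healthy, 0 = degraded

# Nearest-centroid classifier: label a new clip by the closer class mean
centroids = {c: X[y == c].mean(axis=0) for c in (0, 1)}

def classify(audio, sr=16000):
    f = spectral_features(audio, sr)
    return min(centroids, key=lambda c: np.linalg.norm(f - centroids[c]))

print(classify(rng.normal(0, 1, sr)))  # broadband clip classified as healthy
```

In practice the published tool was trained on many labelled recordings from real reefs; this sketch only shows why spectral structure makes healthy and degraded soundscapes machine-separable at all.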

Recordings used in the study were taken at the Mars Coral Reef Restoration Project, which is restoring heavily damaged reefs in Indonesia.

The study’s co-author Dr Tim Lamont, a marine biologist at Lancaster University, says the AI method has clear advantages for monitoring coral reefs.

“This is a really exciting development,” says Lamont. “Sound recorders and AI could be used around the world to monitor the health of reefs, and discover whether attempts to protect and restore them are working.

“In many cases it’s easier and cheaper to deploy an underwater hydrophone on a reef and leave it there than to have expert divers visiting the reef repeatedly to survey it, especially in remote locations.”
