AI Learns Coral Reef ‘Song’

Summary: A new artificial intelligence algorithm, trained on the soundscapes of both healthy and degraded reefs, can correctly determine reef health 92% of the time.

Source: UCL

Coral reefs have a complex soundscape – and even experts have to conduct painstaking analysis to measure reef health based on sound recordings.

In the new study, published in Ecological Indicators, scientists trained a computer algorithm using multiple recordings of healthy and degraded reefs, allowing the machine to learn the difference.

The computer then analyzed a host of new recordings, and successfully identified reef health 92% of the time. 

The team then used the trained algorithm to track the progress of reef restoration projects.

Lead author, PhD Candidate Ben Williams (UCL Centre for Biodiversity and Environment Research), who started the study while at the University of Exeter, said: “Coral reefs are facing multiple threats including climate change, so monitoring their health and the success of conservation projects is vital.

“One major difficulty is that visual and acoustic surveys of reefs usually rely on labor-intensive methods. 

“Visual surveys are also limited by the fact that many reef creatures conceal themselves, or are active at night, while the complexity of reef sounds has made it difficult to identify reef health using individual recordings. 

“Our approach to that problem was to use machine learning – to see whether a computer could learn the song of the reef. 

“Our findings show that a computer can pick up patterns that are undetectable to the human ear. It can tell us faster, and more accurately, how the reef is doing.” 

The fish and other creatures living on coral reefs make a vast range of sounds. 


The meaning of many of these calls remains unknown, but the new AI method can distinguish between the overall sounds of healthy and unhealthy reefs. 

The recordings used in the study were taken at the Mars Coral Reef Restoration Project, which is restoring heavily damaged reefs in Indonesia.

Co-author Dr Tim Lamont, from Lancaster University, said the AI method creates major opportunities to improve coral reef monitoring. 

“This is a really exciting development. Sound recorders and AI could be used around the world to monitor the health of reefs, and discover whether attempts to protect and restore them are working,” Dr Lamont said.

“In many cases it’s easier and cheaper to deploy an underwater hydrophone on a reef and leave it there than to have expert divers visiting the reef repeatedly to survey it – especially in remote locations.”

Funding: The study was funded by the Natural Environment Research Council and the Swiss National Science Foundation.

About this artificial intelligence research news

Author: Henry Killworth
Source: UCL
Contact: Henry Killworth – UCL
Image: The image is credited to Tim Lamont

Original Research: Open access.
“Enhancing automated analysis of marine soundscapes using ecoacoustic indices and machine learning” by Ben Williams et al. Ecological Indicators


Abstract

Enhancing automated analysis of marine soundscapes using ecoacoustic indices and machine learning

Historically, ecological monitoring of marine habitats has primarily relied on labour-intensive, non-automated survey methods. The field of passive acoustic monitoring (PAM) has demonstrated the potential of this practice to automate surveying in marine habitats. This has primarily been through the use of ‘ecoacoustic indices’ to quantify attributes from natural soundscapes.

However, investigations using individual indices have had mixed success.

Using PAM recordings collected at one of the world’s largest coral reef restoration programmes, we instead apply a machine-learning approach across a suite of ecoacoustic indices to improve predictive power for ecosystem health. Healthy and degraded reef sites were identified through live coral cover surveys, with 90–95% and 0–20% cover respectively.

A library of one-minute recordings was extracted from each. Twelve ecoacoustic indices were calculated for each recording, in up to three different frequency bandwidths (low: 0.05–0.8 kHz, medium: 2–7 kHz and broad: 0.05–20 kHz). Twelve of these 33 index-frequency combinations differed significantly between healthy and degraded habitats.
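As a rough illustration (not the authors' code), the sketch below computes one widely used ecoacoustic index, the Acoustic Complexity Index (ACI), within the three frequency bands listed above. The waveform variable `audio`, the sample rate `fs`, the window size and the choice of ACI itself are all assumptions for the example.

```python
# A minimal sketch, assuming a mono waveform `audio` sampled at `fs` Hz.
# It band-limits a spectrogram to each of the study's three bands and
# computes the Acoustic Complexity Index (ACI) per band. Parameter
# choices are illustrative, not taken from the paper.
import numpy as np
from scipy import signal

BANDS_HZ = {"low": (50, 800), "medium": (2_000, 7_000), "broad": (50, 20_000)}

def aci(audio, fs, band):
    """ACI: per-frequency-bin sum of |amplitude change| between adjacent
    spectrogram frames, normalised by each bin's total amplitude."""
    freqs, _, spec = signal.spectrogram(audio, fs=fs, nperseg=1024)
    rows = (freqs >= band[0]) & (freqs <= band[1])
    s = spec[rows, :]
    change = np.abs(np.diff(s, axis=1)).sum(axis=1)
    total = s.sum(axis=1) + 1e-12          # guard against silent bins
    return float((change / total).sum())

# One feature vector per one-minute recording: one ACI value per band
# features = [aci(audio, fs, b) for b in BANDS_HZ.values()]
```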

However, the best performing single index could only correctly classify 47% of recordings, requiring extensive sampling from each site to be useful.

We therefore trained a regularised discriminant analysis machine-learning algorithm to discriminate between healthy and degraded sites using an optimised combination of ecoacoustic indices.

This multi-index approach discriminated between these two habitat classes with improved accuracy compared to any single index in isolation. The pooled classification rate of 1000 cross-validated iterations of the model had a 91.7% (±0.8, mean ± SE) success rate at correctly classifying individual recordings.
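scikit-learn has no direct implementation of Friedman-style regularised discriminant analysis, but a sketch of the same idea can be built from its quadratic discriminant analysis with a covariance regularisation term, evaluated with repeated cross-validation to mimic the pooled 1000-iteration estimate. The feature matrix `X`, labels `y`, and all parameter values below are assumptions, not values from the study.

```python
# A minimal sketch, not the authors' pipeline. `X` holds one row of
# ecoacoustic index values per recording; `y` is 1 for healthy sites
# and 0 for degraded ones. reg_param shrinks each class covariance,
# a stand-in for the paper's regularised discriminant analysis.
import numpy as np
from sklearn.discriminant_analysis import QuadraticDiscriminantAnalysis
from sklearn.model_selection import RepeatedStratifiedKFold, cross_val_score

def pooled_accuracy(X, y, reg_param=0.5):
    clf = QuadraticDiscriminantAnalysis(reg_param=reg_param)
    cv = RepeatedStratifiedKFold(n_splits=5, n_repeats=200, random_state=0)
    scores = cross_val_score(clf, X, y, cv=cv)   # 5 x 200 = 1000 test folds
    return scores.mean(), scores.std(ddof=1) / np.sqrt(scores.size)  # mean, SE
```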

The model was subsequently used to classify recordings from two actively restored sites, established >24 months prior to recordings, with coral cover values of 79.1% (±3.9) and 66.5% (±3.8). Of these recordings, 37/38 and 33/39 were classified as healthy, respectively.

The model was also used to classify recordings from a newly restored site established <12 months prior with a coral cover of 25.6% (±2.6), from which 27/33 recordings were classified as degraded.
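Applying the fitted model to new sites then reduces to a plain predict step. In this hypothetical continuation of the sketch above, `X_restored` holds the index values for one restored site's recordings.

```python
# Hypothetical continuation: fit on the healthy/degraded training data,
# then label each recording from a restored site.
clf = QuadraticDiscriminantAnalysis(reg_param=0.5).fit(X, y)
labels = clf.predict(X_restored)    # 1 = healthy, 0 = degraded
print(f"{labels.sum()}/{labels.size} recordings classified as healthy")
```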

This investigation highlights the value of combining PAM recordings with machine-learning analysis for ecological monitoring and demonstrates the potential of PAM to monitor reef recovery over time, reducing the reliance on labour-intensive in-water surveys by experts.

As access to PAM recorders continues to rapidly advance, effective automated analysis will be needed to keep pace with these expanding acoustic datasets.
