New research shows that artificial intelligence (AI) can track the health of coral reefs by analysing the sounds emitted by the creatures that live on them.
The University of Exeter used the technology to monitor the progress of coral restoration.
Given a new series of recordings, the AI correctly identified the health status of the reef 92 per cent of the time.
What sound does a coral reef make? Researchers at the University of Exeter, in the UK, decided to find out. The team created an artificial intelligence program that can determine the health of coral reefs by listening to the sounds they produce. Reefs, built by colonies of polyps on the beds of seas and oceans, host complex soundscapes generated by fish and other animals moving across them. Analysing these sounds with AI gives researchers useful data for measuring the health of corals and for launching restoration projects where necessary.
Corals’ favourite songs
In the study, led by Professor Ben Williams, an algorithm was trained on a large database of recordings from both healthy and degraded coral reefs, allowing the machine to learn the difference between the two. The AI was then used to analyse new recordings, and it correctly determined the health of the reef 92 per cent of the time. “Coral reefs are facing multiple threats including climate change, so monitoring their health and the success of conservation projects is vital,” said Williams.
Until now, one of the greatest difficulties has been that visual and acoustic surveys of coral reefs relied on labour-intensive methods that only specially trained scientists could carry out. “Visual surveys are also limited by the fact that many reef creatures conceal themselves, or are active at night, while the complexity of reef sounds has made it difficult to identify reef health using individual recordings,” Williams explained. The Exeter team instead took a machine-learning approach, creating a program that could recognise a healthy coral reef’s song. “Our findings show that a computer can pick up patterns that are undetectable to the human ear. It can tell us faster, and more accurately, how the reef is doing.”
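The idea Williams describes can be illustrated with a deliberately simplified sketch: learn what healthy and degraded reefs sound like from labelled recordings, then classify a new clip by which acoustic profile it resembles. Everything below is invented for illustration only (synthetic “soundscapes”, two hand-picked features, a nearest-centroid rule) and is not the study’s actual pipeline, which used real reef recordings and a far richer machine-learning model.

```python
import math
import random

def features(wave):
    """Two cheap audio descriptors: RMS loudness, and zero-crossing
    rate as a rough proxy for high-frequency crackle."""
    rms = math.sqrt(sum(x * x for x in wave) / len(wave))
    zcr = sum(1 for a, b in zip(wave, wave[1:]) if a * b < 0) / len(wave)
    return (rms, zcr)

def synth_reef(healthy, rng, sr=8000, seconds=1):
    """Toy soundscape: a 'healthy' reef gets loud broadband crackle
    (snapping shrimp, feeding fish); a 'degraded' reef is mostly a
    quiet low-frequency hum."""
    out = []
    for i in range(sr * seconds):
        hum = 0.05 * math.sin(2 * math.pi * 120 * i / sr)
        crackle = rng.gauss(0, 0.3 if healthy else 0.02)
        out.append(hum + crackle)
    return out

rng = random.Random(0)
train = [(synth_reef(h, rng), h) for h in (True, False) for _ in range(10)]

# Standardise each feature across the training set, then "train" a
# nearest-centroid classifier: one average feature vector per class.
feats = [features(w) for w, _ in train]
mu = [sum(col) / len(col) for col in zip(*feats)]
sd = [math.sqrt(sum((x - m) ** 2 for x in col) / len(col))
      for col, m in zip(zip(*feats), mu)]
z = lambda f: [(x - m) / s for x, m, s in zip(f, mu, sd)]

centroid = {
    h: [sum(col) / len(col)
        for col in zip(*[z(f) for f, (_, lbl) in zip(feats, train) if lbl == h])]
    for h in (True, False)
}

def predict(wave):
    """Label a new clip by its nearest class centroid in feature space."""
    zf = z(features(wave))
    return min(centroid,
               key=lambda h: sum((a - b) ** 2 for a, b in zip(zf, centroid[h])))

print(predict(synth_reef(True, rng)))   # healthy clip → True
print(predict(synth_reef(False, rng)))  # degraded clip → False
```

On this toy data the two classes separate cleanly; the point of the real study is that genuine reef soundscapes are far messier, which is why a learned model outperforms a human listener.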
The role of coral reefs and how to avoid degradation
The recordings used in the study were made in Indonesia, where some of the local reefs are severely damaged. The study’s authors state that the AI method provides great opportunities to improve the monitoring of these important organisms. “This is a really exciting development. Sound recorders and AI could be used around the world to monitor the health of reefs, and discover whether attempts to protect and restore them are working,” said Dr Tim Lamont, co-author of the study. “In many cases, it’s easier and cheaper to deploy an underwater hydrophone on a reef and leave it there than to have expert divers visiting the reef repeatedly to survey it – especially in remote locations.”
Approximately 25 to 50 per cent of the world’s coral reefs have been destroyed, and a further 60 per cent are under threat, according to the United Nations Environment Programme. These ecosystems are vital sources of food and protect the coastlines of island nations. Some 850 million people live within 100km of a coral reef and derive economic benefits from it. Williams is confident that in future the use of artificial intelligence could be extended to other sites across the world to support restoration projects. “We now want to send recorders out around the world: to the Maldives, to the Great Barrier Reef, to Mexico, to loads of different sites where we’ve got partners who can collect similar data.”