DMR News

Advancing Digital Conversations

Google Invites Public to Help Save Coral Reefs with AI

By Huey Yee Ong

Jun 7, 2024

Google has introduced an innovative AI tool, SurfPerch, designed to enhance our understanding of coral reef ecosystems and their health. Developed in collaboration with Google Research and DeepMind, SurfPerch analyzes thousands of hours of reef audio recordings.

This groundbreaking technology allows marine biologists to “listen” to reef health, monitor nocturnal reef activity, and study reefs even in challenging conditions like deep or murky waters. This capability is expected to significantly bolster conservation efforts aimed at preserving these vital ecosystems.

How Did the Public Help?

The project began with an innovative public engagement initiative through the “Calling in Our Corals” website.

Over the past year, this platform allowed individuals from around the globe to listen to more than 400 hours of audio recordings captured from reefs at various sites worldwide. Participants were prompted to click whenever they detected the sound of fish. This collective effort resulted in the creation of a comprehensive “bioacoustic” dataset that highlights aspects of reef health.
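As a hypothetical illustration of how such crowd clicks could become a labelled dataset (the article does not describe the exact aggregation Google used), one simple approach is to bucket each listener's click timestamps into fixed time windows and keep only the windows where multiple listeners agree:

```python
from collections import Counter

# Illustrative only: listener names, click times, and the voting threshold
# are all assumptions, not details from the article.
clicks = {
    "listener_a": [2.1, 7.4],
    "listener_b": [2.3],
    "listener_c": [2.0, 7.6, 12.0],
}
window = 1.0  # seconds per labelling window

# Count how many listeners clicked inside each window.
votes = Counter()
for times in clicks.values():
    for t in times:
        votes[int(t // window)] += 1

# Keep windows where at least 2 of the 3 listeners agreed a fish was audible.
labels = sorted(w for w, c in votes.items() if c >= 2)
print(labels)  # → [2, 7]
```

Majority voting like this filters out stray individual clicks while preserving moments of genuine agreement, which is one reason crowdsourced listening can yield usable training labels.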

Leveraging this data, Google refined SurfPerch’s capabilities, enabling it to swiftly identify new and previously unrecognized reef sounds.

As detailed in a blog post co-authored by Steve Simpson, a Professor of Marine Biology at the University of Bristol, and Ben Williams, a marine biologist at University College London, SurfPerch’s development marks a significant leap forward.

The AI tool can now process new datasets with a level of efficiency that was previously unattainable. Unlike earlier models that required expensive GPU processors for training, SurfPerch operates without such costly resources, making it more accessible for broader research applications. This efficiency not only saves time but also opens new avenues for studying reef communities and advancing conservation strategies.
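The article does not specify how SurfPerch sidesteps GPU training, but a common pattern that fits the description is to keep a large pretrained audio encoder frozen and train only a small linear classifier on its fixed embeddings, which runs comfortably on a CPU. The sketch below illustrates that idea on synthetic stand-in embeddings; the dimensions, class means, and setup are assumptions for illustration, not Google's published recipe:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Stand-in for precomputed embeddings of reef audio clips (e.g. fixed-length
# vectors from a frozen encoder). Two synthetic classes: "fish sound present"
# vs "absent", made linearly separable for the demo.
n, dim = 200, 128
X_fish = rng.normal(loc=0.5, scale=1.0, size=(n, dim))
X_none = rng.normal(loc=-0.5, scale=1.0, size=(n, dim))
X = np.vstack([X_fish, X_none])
y = np.array([1] * n + [0] * n)

# A linear probe on frozen embeddings: cheap to train, no GPU required.
clf = LogisticRegression(max_iter=1000).fit(X, y)
print(f"training accuracy: {clf.score(X, y):.2f}")
```

Because only the small linear layer is trained, fitting a new classifier for a new dataset takes seconds rather than hours, which matches the efficiency gains the post describes.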

What Did We Find?

Intriguingly, researchers discovered that SurfPerch’s performance could be enhanced by incorporating bird recordings. Although bird songs and fish sounds differ considerably, the two share common acoustic patterns the AI could learn from, and this cross-domain learning further refined SurfPerch’s ability to detect and analyze reef sounds.
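To see why out-of-domain audio can help, here is a toy illustration (an assumption about the mechanism, not Google's method): when in-domain labels are scarce, extra labelled examples that share underlying structure with the target task can improve a classifier. The synthetic "reef" and "bird" data below are generated from the same hidden decision direction:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
dim = 64
w_true = rng.normal(size=dim)  # shared "vocalisation vs noise" structure

def make_data(n, noise=1.0):
    """Synthetic labelled audio embeddings governed by w_true."""
    X = rng.normal(size=(n, dim))
    y = (X @ w_true + rng.normal(scale=noise, size=n) > 0).astype(int)
    return X, y

# Scarce reef labels; plentiful bird labels sharing the same structure.
X_reef, y_reef = make_data(20)
X_bird, y_bird = make_data(500)
X_test, y_test = make_data(1000)

reef_only = LogisticRegression(max_iter=1000).fit(X_reef, y_reef)
combined = LogisticRegression(max_iter=1000).fit(
    np.vstack([X_reef, X_bird]), np.concatenate([y_reef, y_bird]))

print(f"reef-only test accuracy: {reef_only.score(X_test, y_test):.2f}")
print(f"with bird data:          {combined.score(X_test, y_test):.2f}")
```

In this contrived setup the classifier trained on the pooled data generalizes better than the one trained on the few reef examples alone, mirroring the cross-domain benefit the researchers report.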

By merging the data from “Calling in Our Corals” with SurfPerch, initial trials revealed notable insights. Researchers distinguished between protected and unprotected reefs in the Philippines, tracked restoration progress in Indonesia, and deepened their understanding of fish community interactions on the Great Barrier Reef.

The project is far from over. Google continues to collect new audio data, which is regularly added to the “Calling in Our Corals” website. According to Google, this continuous influx of data will further train and refine the AI model, paving the way for more advanced and effective reef conservation strategies.


Featured Image courtesy of cinoby/Getty Images

