
By identifying bird species by sound, an app opens up new avenues for citizen science

June 28, 2022

The researchers developed the BirdNET app, where people can easily participate in bird research and conservation. Credit: Ashakur Rahaman, Yang Center/Cornell Lab of Ornithology (CC-BY 4.0, creativecommons.org/licenses/by/4.0/)

The BirdNET app, a free machine-learning tool that can identify more than 3,000 bird species by sound alone, generates reliable scientific data and makes it easier for people to contribute to citizen science simply by recording bird sounds.

An article published on June 28 in the open-access journal PLOS Biology by Connor Wood and colleagues at the K. Lisa Yang Center for Conservation Bioacoustics at the Cornell Lab of Ornithology, USA, suggests that the BirdNET app lowers the barrier to citizen science because it does not require bird identification skills to participate. Users simply listen for birds and tap the app to record. BirdNET uses artificial intelligence to automatically identify species by sound and captures the recording for use in research.
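To give a flavor of how sound-based identification works in general: audio is typically converted into a time–frequency representation (a spectrogram), and a classifier then looks for species-specific patterns in it. The minimal sketch below is not BirdNET's actual algorithm (which uses a deep neural network trained on large labeled datasets); it only illustrates the first step, turning raw samples into frequency features, using a naive direct DFT and a dominant-frequency track.

```python
import math

def spectrogram(samples, frame_size=256, hop=128):
    """Naive magnitude spectrogram via a direct DFT (illustration only;
    real systems use an FFT and a window function)."""
    frames = []
    for start in range(0, len(samples) - frame_size + 1, hop):
        frame = samples[start:start + frame_size]
        mags = []
        for k in range(frame_size // 2):
            # Correlate the frame with a complex sinusoid at bin k.
            re = sum(s * math.cos(2 * math.pi * k * n / frame_size)
                     for n, s in enumerate(frame))
            im = -sum(s * math.sin(2 * math.pi * k * n / frame_size)
                      for n, s in enumerate(frame))
            mags.append(math.hypot(re, im))
        frames.append(mags)
    return frames

def dominant_frequency_track(spec, sample_rate, frame_size=256):
    """Frequency (Hz) of the strongest bin in each frame — a crude
    feature a classifier could compare against species templates."""
    return [max(range(len(f)), key=lambda k: f[k]) * sample_rate / frame_size
            for f in spec]

# Hypothetical usage: a pure 1 kHz tone standing in for a bird call.
rate = 8000
tone = [math.sin(2 * math.pi * 1000 * n / rate) for n in range(1024)]
track = dominant_frequency_track(spectrogram(tone), rate)
```

A real system would feed the full spectrogram, not just the peak frequency, into a trained model; the point here is only that "identifying a bird by sound" reduces to pattern recognition on frequency features like these.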

“Our guiding design principles were that we needed an accurate algorithm and a simple user interface,” said study co-author Stefan Kahl of the Cornell Lab’s Yang Center, who led the technical development. “Otherwise, users wouldn’t come back to the app.” The results have exceeded expectations: since its launch in 2018, more than 2.2 million people have contributed data.

To test whether the app could generate reliable scientific data, the authors selected four test cases in which conventional research had already provided solid answers. Their results show that data from the BirdNET app successfully reproduced known patterns of song dialects in North American and European songbirds and accurately mapped a bird migration across both continents.


People can easily participate in bird research and conservation through the newly developed BirdNET app. Credit: Stefan Kahl, Yang Center/Cornell Lab of Ornithology (CC-BY 4.0, creativecommons.org/licenses/by/4.0/)

Validating the reliability of the app’s data for research purposes was the first step in what they hope will be a long-term global research effort, not just for birds, but ultimately for all wildlife and even entire soundscapes. The data used in the four test cases is publicly available, and the authors are working to make the dataset open.

“The most exciting part of this work is how easily people can participate in bird research and conservation,” adds Wood. “You don’t need to know anything about birds; you just need a smartphone, and the BirdNET app can then provide you and the research team with a prediction of the bird you’ve heard. This has led to huge turnout around the world, resulting in an incredible wealth of data. It’s truly a testament to an enthusiasm for birds that unites people from all walks of life.”

The BirdNET app is part of the Cornell Lab of Ornithology’s suite of tools, including the educational app Merlin Bird ID and the citizen science apps eBird, NestWatch and Project FeederWatch, which together have generated more than a billion sightings, sounds and photos of birds from participants around the world for use in science and conservation.


More information:
BirdNET app, powered by machine learning, reduces barriers to global bird research by enabling citizen participation in science, PLoS Biology (2022). DOI: 10.1371/journal.pbio.3001670

Provided by the Public Library of Science

Citation: Identifying bird species by sound, app opens up new avenues for citizen science (June 28, 2022). Retrieved June 28, 2022 from https://phys.org/news/2022-06-bird-species-app-avenues-citizen.html

This document is subject to copyright. Except for fair use for purposes of private study or research, no part may be reproduced without written permission. The content is provided for information only.