Publication:

ANIMAL-SPOT enables animal-independent signal detection and classification using deep learning


Files

Bergler_2-8evz4j1afnu70.pdf (Size: 3.15 MB, Downloads: 31)

Date

2022

Authors

Bergler, Christian
Smeele, Simeon Q.
Tyndel, Stephen A.
Barnhill, Alexander
Ortiz, Sara T.
Kalan, Ammie K.
Cheng, Rachael Xi
Brinkløv, Signe
Osiecka, Anna N.
Klump, Barbara C.
et al.

Open access publication
Open Access Gold
Publication type
Journal article
Publication status
Published

Published in

Scientific Reports. Springer. 2022, 12, 21966. eISSN 2045-2322. Available under: doi: 10.1038/s41598-022-26429-y

Abstract

Bioacoustic research spans a wide range of biological questions and applications, relying on identification of target species or smaller acoustic units, such as distinct call types. However, manually identifying the signal of interest is time-intensive, error-prone, and becomes unfeasible with large data volumes. Therefore, machine-driven algorithms are increasingly applied to various bioacoustic signal identification challenges. Nevertheless, biologists still have major difficulties trying to transfer existing animal- and/or scenario-related machine learning approaches to their specific animal datasets and scientific questions. This study presents an animal-independent, open-source deep learning framework, along with a detailed user guide. Three signal identification tasks, commonly encountered in bioacoustics research, were investigated: (1) target signal vs. background noise detection, (2) species classification, and (3) call type categorization. ANIMAL-SPOT successfully segmented human-annotated target signals in data volumes representing 10 distinct animal species and 1 additional genus, resulting in a mean test accuracy of 97.9%, together with an average area under the ROC curve (AUC) of 95.9%, when predicting on unseen recordings. Moreover, an average segmentation accuracy and F1-score of 95.4% was achieved on the publicly available BirdVox-Full-Night data corpus. In addition, multi-class species and call type classification resulted in 96.6% and 92.7% accuracy on unseen test data, as well as 95.2% and 88.4% regarding previous animal-specific machine-based detection excerpts. Furthermore, an Unweighted Average Recall (UAR) of 89.3% outperformed the multi-species classification baseline system of the ComParE 2021 Primate Sub-Challenge. Besides animal independence, ANIMAL-SPOT does not rely on expert knowledge or special computing resources, thereby making deep-learning-based bioacoustic signal identification accessible to a broad audience.
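The abstract reports an Unweighted Average Recall (UAR) of 89.3% on the ComParE 2021 Primate Sub-Challenge. UAR is the mean of the per-class recalls, so every class counts equally regardless of how many samples it contributes, which makes it the standard metric for imbalanced multi-class bioacoustic data. A minimal sketch of the computation (the function name and toy labels are illustrative, not taken from ANIMAL-SPOT):

```python
from collections import defaultdict

def unweighted_average_recall(y_true, y_pred):
    """Mean of per-class recalls (UAR): each class weighs equally,
    no matter how many samples it has."""
    correct = defaultdict(int)  # correct predictions per true class
    total = defaultdict(int)    # number of samples per true class
    for t, p in zip(y_true, y_pred):
        total[t] += 1
        if t == p:
            correct[t] += 1
    recalls = [correct[c] / total[c] for c in total]
    return sum(recalls) / len(recalls)

# Toy example: class 0 recall = 2/3, class 1 recall = 1/2, so UAR = 7/12
uar = unweighted_average_recall([0, 0, 0, 1, 1], [0, 0, 1, 1, 0])
print(uar)
```

Note how plain accuracy on the same toy example would be 3/5 = 0.6, slightly above the UAR of 7/12 ≈ 0.583, because the majority class dominates the accuracy figure.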

Subject (DDC)
570 Life sciences, biology

Cite

ISO 690
BERGLER, Christian, Simeon Q. SMEELE, Stephen A. TYNDEL, Alexander BARNHILL, Sara T. ORTIZ, Ammie K. KALAN, Rachael Xi CHENG, Signe BRINKLØV, Anna N. OSIECKA, Barbara C. KLUMP, 2022. ANIMAL-SPOT enables animal-independent signal detection and classification using deep learning. In: Scientific Reports. Springer. 2022, 12, 21966. eISSN 2045-2322. Available under: doi: 10.1038/s41598-022-26429-y
BibTeX
@article{Bergler2022ANIMA-68971,
  year={2022},
  doi={10.1038/s41598-022-26429-y},
  title={ANIMAL-SPOT enables animal-independent signal detection and classification using deep learning},
  volume={12},
  journal={Scientific Reports},
  author={Bergler, Christian and Smeele, Simeon Q. and Tyndel, Stephen A. and Barnhill, Alexander and Ortiz, Sara T. and Kalan, Ammie K. and Cheng, Rachael Xi and Brinkløv, Signe and Osiecka, Anna N. and Klump, Barbara C.},
  note={Article Number: 21966}
}
RDF
<rdf:RDF
    xmlns:dcterms="http://purl.org/dc/terms/"
    xmlns:dc="http://purl.org/dc/elements/1.1/"
    xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#"
    xmlns:bibo="http://purl.org/ontology/bibo/"
    xmlns:dspace="http://digital-repositories.org/ontologies/dspace/0.1.0#"
    xmlns:foaf="http://xmlns.com/foaf/0.1/"
    xmlns:void="http://rdfs.org/ns/void#"
    xmlns:xsd="http://www.w3.org/2001/XMLSchema#" > 
  <rdf:Description rdf:about="https://kops.uni-konstanz.de/server/rdf/resource/123456789/68971">
    <dc:contributor>Tyndel, Stephen A.</dc:contributor>
    <dcterms:isPartOf rdf:resource="https://kops.uni-konstanz.de/server/rdf/resource/123456789/28"/>
    <dc:contributor>Brinkløv, Signe</dc:contributor>
    <dc:language>eng</dc:language>
    <dc:creator>Brinkløv, Signe</dc:creator>
    <dc:creator>Cheng, Rachael Xi</dc:creator>
    <dcterms:title>ANIMAL-SPOT enables animal-independent signal detection and classification using deep learning</dcterms:title>
    <dc:contributor>Bergler, Christian</dc:contributor>
    <dc:creator>Klump, Barbara C.</dc:creator>
    <dc:contributor>Cheng, Rachael Xi</dc:contributor>
    <dc:creator>Barnhill, Alexander</dc:creator>
    <dc:contributor>Ortiz, Sara T.</dc:contributor>
    <dc:creator>Tyndel, Stephen A.</dc:creator>
    <dc:contributor>Kalan, Ammie K.</dc:contributor>
    <dc:contributor>Osiecka, Anna N.</dc:contributor>
    <bibo:uri rdf:resource="https://kops.uni-konstanz.de/handle/123456789/68971"/>
    <dc:rights>Attribution 4.0 International</dc:rights>
    <dc:date rdf:datatype="http://www.w3.org/2001/XMLSchema#dateTime">2024-01-09T08:24:02Z</dc:date>
    <dspace:hasBitstream rdf:resource="https://kops.uni-konstanz.de/bitstream/123456789/68971/1/Bergler_2-8evz4j1afnu70.pdf"/>
    <dc:creator>Bergler, Christian</dc:creator>
    <dcterms:issued>2022</dcterms:issued>
    <dcterms:available rdf:datatype="http://www.w3.org/2001/XMLSchema#dateTime">2024-01-09T08:24:02Z</dcterms:available>
    <dc:contributor>Klump, Barbara C.</dc:contributor>
    <dc:creator>Osiecka, Anna N.</dc:creator>
    <dspace:isPartOfCollection rdf:resource="https://kops.uni-konstanz.de/server/rdf/resource/123456789/28"/>
    <dc:contributor>Barnhill, Alexander</dc:contributor>
    <dc:creator>Kalan, Ammie K.</dc:creator>
    <dc:creator>Ortiz, Sara T.</dc:creator>
    <dc:creator>Smeele, Simeon Q.</dc:creator>
    <dcterms:abstract>Bioacoustic research spans a wide range of biological questions and applications, relying on identification of target species or smaller acoustic units, such as distinct call types. However, manually identifying the signal of interest is time-intensive, error-prone, and becomes unfeasible with large data volumes. Therefore, machine-driven algorithms are increasingly applied to various bioacoustic signal identification challenges. Nevertheless, biologists still have major difficulties trying to transfer existing animal- and/or scenario-related machine learning approaches to their specific animal datasets and scientific questions. This study presents an animal-independent, open-source deep learning framework, along with a detailed user guide. Three signal identification tasks, commonly encountered in bioacoustics research, were investigated: (1) target signal vs. background noise detection, (2) species classification, and (3) call type categorization. ANIMAL-SPOT successfully segmented human-annotated target signals in data volumes representing 10 distinct animal species and 1 additional genus, resulting in a mean test accuracy of 97.9%, together with an average area under the ROC curve (AUC) of 95.9%, when predicting on unseen recordings. Moreover, an average segmentation accuracy and F1-score of 95.4% was achieved on the publicly available BirdVox-Full-Night data corpus. In addition, multi-class species and call type classification resulted in 96.6% and 92.7% accuracy on unseen test data, as well as 95.2% and 88.4% regarding previous animal-specific machine-based detection excerpts. Furthermore, an Unweighted Average Recall (UAR) of 89.3% outperformed the multi-species classification baseline system of the ComParE 2021 Primate Sub-Challenge. 
Besides animal independence, ANIMAL-SPOT does not rely on expert knowledge or special computing resources, thereby making deep-learning-based bioacoustic signal identification accessible to a broad audience.</dcterms:abstract>
    <dc:contributor>Smeele, Simeon Q.</dc:contributor>
    <dcterms:rights rdf:resource="http://creativecommons.org/licenses/by/4.0/"/>
    <dcterms:hasPart rdf:resource="https://kops.uni-konstanz.de/bitstream/123456789/68971/1/Bergler_2-8evz4j1afnu70.pdf"/>
  </rdf:Description>
</rdf:RDF>

Peer reviewed
Yes