Publication:

YOLO‐Behaviour : A simple, flexible framework to automatically quantify animal behaviours from videos

Files

Chan_2-1rnsbhfwg1d4n7.PDF (Size: 2.28 MB, Downloads: 7)

Date

2025

License link
http://creativecommons.org/licenses/by/4.0/ (Attribution 4.0 International)

Research funding information

Deutsche Forschungsgemeinschaft (DFG): EXC 2117 – 422037984
Deutsche Forschungsgemeinschaft (DFG): GR 4650/2-1

Open Access publication
Open Access Gold

Publication type
Journal article
Publication status
Published

Published in

Methods in Ecology and Evolution. Wiley. 2025, 16(4), pp. 760-774. ISSN 2041-2096. eISSN 2041-210X. Available at: doi: 10.1111/2041-210x.14502

Abstract

  1. Manually coding behaviours from videos is essential to study animal behaviour, but it is labour‐intensive and susceptible to inter‐rater bias and reliability issues. Recent developments in computer vision tools enable the automatic quantification of behaviours, supplementing or even replacing manual annotation. However, widespread adoption of these methods is still limited due to the lack of annotated training datasets and the domain‐specific knowledge required to optimize these models for animal research.
  2. Here, we present YOLO‐Behaviour, a flexible framework for identifying visually distinct behaviours from video recordings. The framework is robust, easy to implement, and requires minimal manual annotations as training data. We demonstrate its flexibility with case studies of event‐wise detection (house sparrow nestling provisioning, Siberian jay feeding, and human eating behaviours) and frame‐wise detection of various behaviours in pigeons, zebras and giraffes; a minimal sketch of the detection step follows this abstract.
  3. Our results show that the framework detects behaviours reliably and retrieves accuracy metrics comparable to manual annotation. However, metrics extracted for event‐wise detection were less correlated with manual annotation, and potential reasons for the discrepancy between manual annotation and automatic detection are discussed. To mitigate this problem, the framework can be used in a hybrid approach, first detecting events with the pipeline and then manually confirming the detections, which saves annotation time (see the second sketch below).
  4. We provide detailed documentation and guidelines on how to implement the YOLO‐Behaviour framework, so that researchers can readily train and deploy new models on their own study systems. We anticipate that the framework will be another step towards lowering the barrier to entry for applying computer vision methods in animal behaviour.
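
The paper documents its own pipeline in full; purely as an illustration, the following is a minimal sketch of what the frame-wise detection step could look like using the Ultralytics YOLO API. The weights file behaviour_model.pt, the video filename and the behaviour class names are hypothetical placeholders, not artefacts of this publication.

# Minimal frame-wise detection sketch (illustrative only, not the paper's code).
# Assumes the Ultralytics YOLO API; "behaviour_model.pt", the video file and
# the behaviour class names are hypothetical placeholders.
from ultralytics import YOLO

model = YOLO("behaviour_model.pt")  # weights fine-tuned on behaviour bounding boxes

detections = []  # (frame_index, behaviour_class, confidence)
# stream=True yields one result per video frame without loading the whole video
for frame_idx, result in enumerate(model("colony_video.mp4", stream=True)):
    for box in result.boxes:
        detections.append((frame_idx,
                           model.names[int(box.cls)],
                           float(box.conf)))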
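
Similarly, one way the hybrid detect-then-confirm workflow from point 3 could be scripted is to collapse frame-wise hits into candidate events and export them for manual confirmation. The frame rate, merge gap and CSV layout below are illustrative assumptions, not values from the paper.

# Hybrid detect-then-confirm sketch (illustrative assumptions throughout).
import csv

FPS = 25       # assumed recording frame rate
MAX_GAP = 12   # merge hits less than ~0.5 s apart (arbitrary illustrative choice)

# Example frame-wise hits, shaped like the output of the detection sketch above
detections = [(10, "feeding", 0.91), (11, "feeding", 0.88),
              (40, "feeding", 0.95), (41, "feeding", 0.90)]

def to_events(frames, max_gap=MAX_GAP):
    """Collapse sorted frame indices into (start, end) events,
    merging hits separated by at most max_gap frames."""
    if not frames:
        return []
    events, start, prev = [], frames[0], frames[0]
    for f in frames[1:]:
        if f - prev > max_gap:
            events.append((start, prev))
            start = f
        prev = f
    events.append((start, prev))
    return events

feeding_frames = sorted(f for f, cls, conf in detections if cls == "feeding")
with open("events_to_review.csv", "w", newline="") as fh:
    writer = csv.writer(fh)
    writer.writerow(["start_s", "end_s", "confirmed"])  # reviewer fills "confirmed"
    for start, end in to_events(feeding_frames):
        writer.writerow([round(start / FPS, 2), round(end / FPS, 2), ""])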

Subject (DDC)
150 Psychology

Keywords

animal behaviour, behavioural recognition, computer vision, machine learning

Related datasets in KOPS

Dataset
Sample Dataset for YOLO-Behaviour: A simple, flexible framework to automatically quantify animal behaviours from videos
(VV1, 2024) Chan, Hoi Hang; Putra, Prasetia; Schupp, Harald T.; Köchling, Johanna; Straßheim, Jana; Renner, Britta; Schroeder, Julia; Pearse, William D.; Nakagawa, Shinichi; Burke, Terry; Griesser, Michael; Meltzer, Andrea; Lubrano, Saverio; Kano, Fumihiro

Cite

ISO 690
CHAN, Hoi Hang, Prasetia PUTRA, Harald T. SCHUPP, Johanna KÖCHLING, Jana STRASSHEIM, Britta RENNER, Michael GRIESSER, Andrea MELTZER, Saverio LUBRANO, Fumihiro KANO, 2025. YOLO‐Behaviour : A simple, flexible framework to automatically quantify animal behaviours from videos. In: Methods in Ecology and Evolution. Wiley. 2025, 16(4), pp. 760-774. ISSN 2041-2096. eISSN 2041-210X. Available at: doi: 10.1111/2041-210x.14502
BibTeX
@article{Chan2025-04YOLOB-73625,
  title={YOLO‐Behaviour : A simple, flexible framework to automatically quantify animal behaviours from videos},
  year={2025},
  doi={10.1111/2041-210x.14502},
  number={4},
  volume={16},
  issn={2041-2096},
  journal={Methods in Ecology and Evolution},
  pages={760--774},
  author={Chan, Hoi Hang and Putra, Prasetia and Schupp, Harald T. and Köchling, Johanna and Straßheim, Jana and Renner, Britta and Griesser, Michael and Meltzer, Andrea and Lubrano, Saverio and Kano, Fumihiro}
}

University bibliography
Yes
Peer reviewed
Yes