YOLO‐Behaviour : A simple, flexible framework to automatically quantify animal behaviours from videos

dc.contributor.authorChan, Alex Hoi Hang
dc.contributor.authorPutra, Prasetia
dc.contributor.authorSchupp, Harald T.
dc.contributor.authorKöchling, Johanna
dc.contributor.authorStraßheim, Jana
dc.contributor.authorRenner, Britta
dc.contributor.authorGriesser, Michael
dc.contributor.authorMeltzer, Andrea
dc.contributor.authorLubrano, Saverio
dc.contributor.authorKano, Fumihiro
dc.date.accessioned2025-06-18T09:01:32Z
dc.date.available2025-06-18T09:01:32Z
dc.date.issued2025-04
dc.description.abstract1. Manually coding behaviours from videos is essential to the study of animal behaviour, but it is labour‐intensive and susceptible to inter‐rater bias and reliability issues. Recent developments in computer vision enable the automatic quantification of behaviours, supplementing or even replacing manual annotation. However, widespread adoption of these methods is still limited, due to the lack of annotated training datasets and the domain‐specific knowledge required to optimize these models for animal research. 2. Here, we present YOLO‐Behaviour, a flexible framework for identifying visually distinct behaviours from video recordings. The framework is robust, easy to implement, and requires minimal manual annotations as training data. We demonstrate the flexibility of the framework with case studies for event‐wise detection of house sparrow nestling provisioning, Siberian jay feeding, and human eating behaviours, and frame‐wise detection of various behaviours in pigeons, zebras and giraffes. 3. Our results show that the framework detects behaviours reliably and accurately, and retrieves accuracy metrics comparable to manual annotation. However, metrics extracted for event‐wise detection were less correlated with manual annotation, and we discuss potential reasons for the discrepancy between manual annotation and automatic detection. To mitigate this problem, the framework can be used as a hybrid approach: first detecting events with the pipeline, then manually confirming the detections, saving annotation time. 4. We provide detailed documentation and guidelines on how to implement the YOLO‐Behaviour framework, so that researchers can readily train and deploy new models on their own study systems. We anticipate that the framework will be another step towards lowering the barrier to entry for applying computer vision methods in animal behaviour.
dc.description.versionpublisheddeu
dc.identifier.doi10.1111/2041-210x.14502
dc.identifier.ppn1928550487
dc.identifier.urihttps://kops.uni-konstanz.de/handle/123456789/73625
dc.language.isoeng
dc.rightsAttribution 4.0 International
dc.rights.urihttp://creativecommons.org/licenses/by/4.0/
dc.subjectanimal behaviour
dc.subjectbehavioural recognition
dc.subjectcomputer vision
dc.subjectmachine learning
dc.subject.ddc150
dc.titleYOLO‐Behaviour : A simple, flexible framework to automatically quantify animal behaviours from videoseng
dc.typeJOURNAL_ARTICLE
dspace.entity.typePublication
kops.citation.bibtex
@article{Chan2025-04YOLOB-73625,
  title={YOLO‐Behaviour : A simple, flexible framework to automatically quantify animal behaviours from videos},
  year={2025},
  doi={10.1111/2041-210x.14502},
  number={4},
  volume={16},
  issn={2041-2096},
  journal={Methods in Ecology and Evolution},
  pages={760--774},
  author={Chan, Hoi Hang and Putra, Prasetia and Schupp, Harald T. and Köchling, Johanna and Straßheim, Jana and Renner, Britta and Griesser, Michael and Meltzer, Andrea and Lubrano, Saverio and Kano, Fumihiro}
}
kops.citation.iso690CHAN, Hoi Hang, Prasetia PUTRA, Harald T. SCHUPP, Johanna KÖCHLING, Jana STRASSHEIM, Britta RENNER, Michael GRIESSER, Andrea MELTZER, Saverio LUBRANO, Fumihiro KANO, 2025. YOLO‐Behaviour : A simple, flexible framework to automatically quantify animal behaviours from videos. In: Methods in Ecology and Evolution. Wiley. 2025, 16(4), pp. 760-774. ISSN 2041-2096. eISSN 2041-210X. Available under: doi: 10.1111/2041-210x.14502eng
kops.citation.rdf
<rdf:RDF
    xmlns:dcterms="http://purl.org/dc/terms/"
    xmlns:dc="http://purl.org/dc/elements/1.1/"
    xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#"
    xmlns:bibo="http://purl.org/ontology/bibo/"
    xmlns:dspace="http://digital-repositories.org/ontologies/dspace/0.1.0#"
    xmlns:foaf="http://xmlns.com/foaf/0.1/"
    xmlns:void="http://rdfs.org/ns/void#"
    xmlns:xsd="http://www.w3.org/2001/XMLSchema#" > 
  <rdf:Description rdf:about="https://kops.uni-konstanz.de/server/rdf/resource/123456789/73625">
    <dc:contributor>Putra, Prasetia</dc:contributor>
    <foaf:homepage rdf:resource="http://localhost:8080/"/>
    <void:sparqlEndpoint rdf:resource="http://localhost/fuseki/dspace/sparql"/>
    <dc:contributor>Köchling, Johanna</dc:contributor>
    <dcterms:rights rdf:resource="http://creativecommons.org/licenses/by/4.0/"/>
    <dc:contributor>Straßheim, Jana</dc:contributor>
    <dc:contributor>Lubrano, Saverio</dc:contributor>
    <dc:contributor>Griesser, Michael</dc:contributor>
    <dc:contributor>Chan, Hoi Hang</dc:contributor>
    <dc:contributor>Schupp, Harald T.</dc:contributor>
    <dc:creator>Lubrano, Saverio</dc:creator>
    <dc:rights>Attribution 4.0 International</dc:rights>
    <dspace:hasBitstream rdf:resource="https://kops.uni-konstanz.de/bitstream/123456789/73625/1/Chan_2-1rnsbhfwg1d4n7.PDF"/>
    <dcterms:isPartOf rdf:resource="https://kops.uni-konstanz.de/server/rdf/resource/123456789/43"/>
    <dcterms:hasPart rdf:resource="https://kops.uni-konstanz.de/bitstream/123456789/73625/1/Chan_2-1rnsbhfwg1d4n7.PDF"/>
    <dc:creator>Chan, Hoi Hang</dc:creator>
    <dc:creator>Meltzer, Andrea</dc:creator>
    <dc:language>eng</dc:language>
    <dcterms:issued>2025-04</dcterms:issued>
    <dc:contributor>Renner, Britta</dc:contributor>
    <bibo:uri rdf:resource="https://kops.uni-konstanz.de/handle/123456789/73625"/>
    <dcterms:title>YOLO‐Behaviour : A simple, flexible framework to automatically quantify animal behaviours from videos</dcterms:title>
    <dc:creator>Griesser, Michael</dc:creator>
    <dc:creator>Kano, Fumihiro</dc:creator>
    <dc:creator>Schupp, Harald T.</dc:creator>
    <dc:creator>Straßheim, Jana</dc:creator>
    <dcterms:isPartOf rdf:resource="https://kops.uni-konstanz.de/server/rdf/resource/123456789/43615"/>
    <dspace:isPartOfCollection rdf:resource="https://kops.uni-konstanz.de/server/rdf/resource/123456789/43"/>
    <dc:date rdf:datatype="http://www.w3.org/2001/XMLSchema#dateTime">2025-06-18T09:01:32Z</dc:date>
    <dc:creator>Köchling, Johanna</dc:creator>
    <dcterms:available rdf:datatype="http://www.w3.org/2001/XMLSchema#dateTime">2025-06-18T09:01:32Z</dcterms:available>
    <dc:contributor>Kano, Fumihiro</dc:contributor>
    <dc:creator>Renner, Britta</dc:creator>
    <dcterms:abstract>1. Manually coding behaviours from videos is essential to the study of animal behaviour, but it is labour‐intensive and susceptible to inter‐rater bias and reliability issues. Recent developments in computer vision enable the automatic quantification of behaviours, supplementing or even replacing manual annotation. However, widespread adoption of these methods is still limited, due to the lack of annotated training datasets and the domain‐specific knowledge required to optimize these models for animal research. 
2. Here, we present YOLO‐Behaviour, a flexible framework for identifying visually distinct behaviours from video recordings. The framework is robust, easy to implement, and requires minimal manual annotations as training data. We demonstrate the flexibility of the framework with case studies for event‐wise detection of house sparrow nestling provisioning, Siberian jay feeding, and human eating behaviours, and frame‐wise detection of various behaviours in pigeons, zebras and giraffes. 
3. Our results show that the framework detects behaviours reliably and accurately, and retrieves accuracy metrics comparable to manual annotation. However, metrics extracted for event‐wise detection were less correlated with manual annotation, and we discuss potential reasons for the discrepancy between manual annotation and automatic detection. To mitigate this problem, the framework can be used as a hybrid approach: first detecting events with the pipeline, then manually confirming the detections, saving annotation time. 
4. We provide detailed documentation and guidelines on how to implement the YOLO‐Behaviour framework, so that researchers can readily train and deploy new models on their own study systems. We anticipate that the framework will be another step towards lowering the barrier to entry for applying computer vision methods in animal behaviour.</dcterms:abstract>
    <dspace:isPartOfCollection rdf:resource="https://kops.uni-konstanz.de/server/rdf/resource/123456789/43615"/>
    <dc:contributor>Meltzer, Andrea</dc:contributor>
    <dc:creator>Putra, Prasetia</dc:creator>
  </rdf:Description>
</rdf:RDF>
kops.description.funding{"first":"dfg","second":"EXC 2117―422037984"}
kops.description.funding{"first":"dfg","second":"GR 4650/2‐1"}
kops.description.openAccessopenaccessgold
kops.flag.etalAuthortrue
kops.flag.isPeerReviewedtrue
kops.flag.knbibliographytrue
kops.identifier.nbnurn:nbn:de:bsz:352-2-1rnsbhfwg1d4n7
kops.sourcefieldMethods in Ecology and Evolution. Wiley. 2025, <b>16</b>(4), pp. 760-774. ISSN 2041-2096. eISSN 2041-210X. Available under: doi: 10.1111/2041-210x.14502eng
kops.sourcefield.plainMethods in Ecology and Evolution. Wiley. 2025, 16(4), pp. 760-774. ISSN 2041-2096. eISSN 2041-210X. Available under: doi: 10.1111/2041-210x.14502eng
relation.isAuthorOfPublication231939e9-a080-4f80-8e50-7f085d3588c5
relation.isAuthorOfPublication7b852f13-d877-4a09-b5a1-895a91bab745
relation.isAuthorOfPublication8d9d2e37-bf2e-4018-804b-277d8476fa90
relation.isAuthorOfPublication4d85fb9b-46ff-4f16-afa3-2aa46ef83e77
relation.isAuthorOfPublication8132dc50-b9ba-417d-a66d-90e0cf793e41
relation.isAuthorOfPublicationf8fce4a8-7c91-4907-9878-5ccc5f900998
relation.isAuthorOfPublicationc2ebcb2b-ec9a-4a8f-aa82-f359d154d908
relation.isAuthorOfPublicationfdfb0618-223f-4683-ba77-4f35550b9877
relation.isAuthorOfPublication4efab9b6-0bbd-41df-b02e-d9bf7256880a
relation.isAuthorOfPublication98a77749-fce3-446e-842e-255a2620a8ff
relation.isAuthorOfPublication.latestForDiscovery231939e9-a080-4f80-8e50-7f085d3588c5
relation.isDatasetOfPublication9e526b59-e3a8-472d-8926-02de24d7eee9
relation.isDatasetOfPublication9398a3e0-0c40-432c-a36c-2f457f2c57c5
relation.isDatasetOfPublication.latestForDiscovery9e526b59-e3a8-472d-8926-02de24d7eee9
source.bibliographicInfo.fromPage760
source.bibliographicInfo.issue4
source.bibliographicInfo.toPage774
source.bibliographicInfo.volume16
source.identifier.eissn2041-210X
source.identifier.issn2041-2096
source.periodicalTitleMethods in Ecology and Evolution
source.publisherWiley

Files

Original bundle

Showing 1 - 1 of 1
Name: Chan_2-1rnsbhfwg1d4n7.PDF
Size: 2.28 MB
Format: Adobe Portable Document Format
Downloads: 62