Publication: Peering into the world of wild passerines with 3D-SOCS : Synchronized video capture and posture estimation
Files
Date
Editors
Journal ISSN
Electronic ISSN
ISBN
Bibliographic data
Publisher
Series
Edition
DOI (citable link)
International patent number
Research funding information
Project
Open Access publication
Core Facility of the University of Konstanz
Title in another language
Publication type
Publication status
Published in
Abstract
1. Collection of large behavioural data-sets on wild animals in natural habitats is vital in ecology and evolution studies. Recent progress in machine learning and computer vision, combined with inexpensive microcomputers, has unlocked a new frontier of fine-scale markerless measurements.
2. Here, we leverage these advancements to develop a 3D Synchronized Outdoor Camera System (3D-SOCS): an inexpensive, mobile and automated method for collecting behavioural data on wild animals using synchronized video frames from Raspberry Pi controlled cameras. Accuracy tests demonstrate that 3D-SOCS' markerless tracking can estimate postures within a 3 mm tolerance.
3. To illustrate its research potential, we place 3D-SOCS in the field and conduct a stimulus presentation experiment. We estimate 3D postures and trajectories for multiple individuals of different bird species, and use these data to characterize the visual field configuration of wild great tits (Parus major), a model species in behavioural ecology. We find their optic axes at ~±60° azimuth and −5° elevation. Furthermore, birds exhibit functional lateralization in their use of the right eye with a conspecific stimulus, and show individual differences in lateralization. We also show that birds' convex hulls predict body weight, highlighting 3D-SOCS' potential for non-invasive population monitoring.
4. 3D-SOCS is a first-of-its-kind camera system for research on wild animals, presenting exciting potential to measure fine-scale behaviour and morphology in wild birds.
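Point 2 of the abstract describes lifting keypoints detected in synchronized, calibrated camera views to 3D. The core geometric step behind such posture estimation is standard multi-view triangulation. The following is a minimal sketch of that step using OpenCV's direct linear transform for two views; it is not the authors' pipeline, and the intrinsics, projection matrices and pixel coordinates are placeholder values.

    import numpy as np
    import cv2

    # Placeholder intrinsics shared by both cameras (focal length 1000 px,
    # principal point at the image centre); real values come from calibration.
    K = np.array([[1000.0,    0.0, 320.0],
                  [   0.0, 1000.0, 240.0],
                  [   0.0,    0.0,   1.0]])

    # 3x4 projection matrices P = K [R | t] for two synchronized cameras:
    # camera 1 at the origin, camera 2 shifted 10 cm along x.
    P1 = K @ np.hstack([np.eye(3), np.zeros((3, 1))])
    P2 = K @ np.hstack([np.eye(3), np.array([[-0.1], [0.0], [0.0]])])

    # Hypothetical pixel coordinates of the same keypoint (e.g. the beak tip)
    # detected in the two synchronized frames, shape (2, N).
    pts_cam1 = np.array([[320.0], [240.0]])
    pts_cam2 = np.array([[220.0], [240.0]])

    # Direct linear transform triangulation returns homogeneous coordinates (4, N).
    point_h = cv2.triangulatePoints(P1, P2, pts_cam1, pts_cam2)
    point_3d = (point_h[:3] / point_h[3]).ravel()

    print("Triangulated keypoint (m):", point_3d)  # ~[0, 0, 1] for these inputs

In a real deployment the same least-squares step is applied per keypoint and per frame across all cameras that see the bird, after which postures and trajectories can be assembled.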
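Point 3 of the abstract reports head-centred measurements: the azimuth and elevation at which great tits hold a stimulus relative to their head, from which optic-axis estimates and eye-use lateralization follow. Below is a rough illustration, under assumed keypoint names (beak tip and both eyes) and made-up coordinates, of how such angles can be computed from triangulated 3D keypoints; the actual head model and angle conventions in the paper may differ.

    import numpy as np

    def head_frame(beak, left_eye, right_eye):
        """Build an orthonormal head-centred frame from three 3D keypoints.
        x: forward (eye midpoint -> beak), y: bird's left, z: up.
        Origin at the eye midpoint."""
        origin = (left_eye + right_eye) / 2.0
        x = beak - origin
        x /= np.linalg.norm(x)
        y = left_eye - right_eye
        y -= x * np.dot(y, x)            # make y orthogonal to x
        y /= np.linalg.norm(y)
        z = np.cross(x, y)               # completes a right-handed frame
        return origin, np.vstack([x, y, z])

    def stimulus_angles(beak, left_eye, right_eye, stimulus):
        """Azimuth/elevation (degrees) of a stimulus in head-centred coordinates."""
        origin, R = head_frame(beak, left_eye, right_eye)
        v = R @ (stimulus - origin)      # express world vector in the head frame
        azimuth = np.degrees(np.arctan2(v[1], v[0]))                   # + = bird's left
        elevation = np.degrees(np.arctan2(v[2], np.hypot(v[0], v[1])))
        return azimuth, elevation

    # Hypothetical triangulated keypoints (metres) and a stimulus position.
    beak      = np.array([0.100,  0.000, 0.020])
    left_eye  = np.array([0.085,  0.012, 0.025])
    right_eye = np.array([0.085, -0.012, 0.025])
    stimulus  = np.array([0.300,  0.350, 0.000])

    print("azimuth %.1f deg, elevation %.1f deg"
          % stimulus_angles(beak, left_eye, right_eye, stimulus))

Aggregating such angles over many stimulus fixations is one way the preferred viewing directions (and hence the optic axes) of an eye can be characterized.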
Abstract in another language
Subject (DDC)
Keywords
Conference
Review
Cite
ISO 690
CHIMENTO, Michael, Hoi Hang CHAN, Lucy M. APLIN, Fumihiro KANO, 2025. Peering into the world of wild passerines with 3D-SOCS : Synchronized video capture and posture estimation. In: Methods in Ecology and Evolution. Wiley. ISSN 2041-2096. eISSN 2041-210X. Available at: doi: 10.1111/2041-210x.70051
BibTeX
@article{Chimento2025-06-11Peeri-73694,
  title   = {Peering into the world of wild passerines with 3D-SOCS : Synchronized video capture and posture estimation},
  year    = {2025},
  doi     = {10.1111/2041-210x.70051},
  issn    = {2041-2096},
  journal = {Methods in Ecology and Evolution},
  author  = {Chimento, Michael and Chan, Hoi Hang and Aplin, Lucy M. and Kano, Fumihiro}
}
RDF
<rdf:RDF xmlns:dcterms="http://purl.org/dc/terms/"
         xmlns:dc="http://purl.org/dc/elements/1.1/"
         xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#"
         xmlns:bibo="http://purl.org/ontology/bibo/"
         xmlns:dspace="http://digital-repositories.org/ontologies/dspace/0.1.0#"
         xmlns:foaf="http://xmlns.com/foaf/0.1/"
         xmlns:void="http://rdfs.org/ns/void#"
         xmlns:xsd="http://www.w3.org/2001/XMLSchema#">
  <rdf:Description rdf:about="https://kops.uni-konstanz.de/server/rdf/resource/123456789/73694">
    <dcterms:available rdf:datatype="http://www.w3.org/2001/XMLSchema#dateTime">2025-06-25T07:13:44Z</dcterms:available>
    <dc:creator>Chan, Hoi Hang</dc:creator>
    <dcterms:isPartOf rdf:resource="https://kops.uni-konstanz.de/server/rdf/resource/123456789/43615"/>
    <dc:creator>Chimento, Michael</dc:creator>
    <dc:contributor>Chan, Hoi Hang</dc:contributor>
    <dc:creator>Aplin, Lucy M.</dc:creator>
    <dcterms:abstract>1. Collection of large behavioural data-sets on wild animals in natural habitats is vital in ecology and evolution studies. Recent progress in machine learning and computer vision, combined with inexpensive microcomputers, has unlocked a new frontier of fine-scale markerless measurements. 2. Here, we leverage these advancements to develop a 3D Synchronized Outdoor Camera System (3D-SOCS): an inexpensive, mobile and automated method for collecting behavioural data on wild animals using synchronized video frames from Raspberry Pi controlled cameras. Accuracy tests demonstrate 3D-SOCS' markerless tracking can estimate postures with a 3 mm tolerance. 3. To illustrate its research potential, we place 3D-SOCS in the field and conduct a stimulus presentation experiment. We estimate 3D postures and trajectories for multiple individuals of different bird species, and use this data to characterize the visual field configuration of wild great tits (Parus major), a model species in behavioural ecology. We find their optic axes at ~±60° azimuth and −5° elevation. Furthermore, birds exhibit functional lateralization in their use of the right eye with conspecific stimulus, and show individual differences in lateralization. We also show that birds' convex hulls predicts body weight, highlighting 3D-SOCS' potential for non-invasive population monitoring. 4. 3D-SOCS is a first-of-its-kind camera system for wild research, presenting exciting potential to measure fine-scaled behaviour and morphology in wild birds.</dcterms:abstract>
    <void:sparqlEndpoint rdf:resource="http://localhost/fuseki/dspace/sparql"/>
    <dc:language>eng</dc:language>
    <foaf:homepage rdf:resource="http://localhost:8080/"/>
    <dc:rights>Attribution-NonCommercial 4.0 International</dc:rights>
    <dc:contributor>Kano, Fumihiro</dc:contributor>
    <dc:creator>Kano, Fumihiro</dc:creator>
    <dspace:isPartOfCollection rdf:resource="https://kops.uni-konstanz.de/server/rdf/resource/123456789/43615"/>
    <dcterms:rights rdf:resource="http://creativecommons.org/licenses/by-nc/4.0/"/>
    <dc:date rdf:datatype="http://www.w3.org/2001/XMLSchema#dateTime">2025-06-25T07:13:44Z</dc:date>
    <bibo:uri rdf:resource="https://kops.uni-konstanz.de/handle/123456789/73694"/>
    <dcterms:title>Peering into the world of wild passerines with 3D-SOCS : Synchronized video capture and posture estimation</dcterms:title>
    <dcterms:issued>2025-06-11</dcterms:issued>
    <dc:contributor>Aplin, Lucy M.</dc:contributor>
    <dc:contributor>Chimento, Michael</dc:contributor>
  </rdf:Description>
</rdf:RDF>
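The RDF/XML record above can also be consumed programmatically. As a hedged illustration, the snippet below parses a local copy of the record with rdflib and pulls out a few Dublin Core fields; the filename record.rdf is a placeholder, and rdflib is not part of the repository itself.

    import rdflib

    # Parse a local copy of the RDF/XML record shown above (the filename is a
    # placeholder; the record could equally be fetched from the repository URL).
    g = rdflib.Graph()
    g.parse("record.rdf", format="xml")

    DCTERMS = rdflib.Namespace("http://purl.org/dc/terms/")
    DC = rdflib.Namespace("http://purl.org/dc/elements/1.1/")

    # The record has one rdf:Description subject carrying the Dublin Core fields.
    for subj in g.subjects(DCTERMS.title, None):
        print("Title:   ", g.value(subj, DCTERMS.title))
        print("Issued:  ", g.value(subj, DCTERMS.issued))
        print("Creators:", ", ".join(sorted(str(o) for o in g.objects(subj, DC.creator))))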