AI-smartphone markerless motion capturing of hip, knee, and ankle joint kinematics during countermovement jumps
Files
Date
Editors
Journal ISSN
Electronic ISSN
ISBN
Bibliographic data
Publisher
Series
Edition
DOI (citable link)
International patent number
Research funding information
Project
Open Access publication
Core Facility of the University of Konstanz
Title in another language
Publication type
Publication status
Published in
Abstract
Recently, AI-driven skeleton reconstruction tools that use multistage computer vision pipelines have been designed to estimate 3D kinematics from 2D video sequences. In the present study, we validated a novel markerless, smartphone video-based artificial intelligence (AI) motion capture system for hip, knee, and ankle angles during countermovement jumps (CMJs). Eleven participants performed six CMJs. We used 2D videos recorded with a smartphone (Apple iPhone X, 4K, 60 fps) to extract 24 different keypoints, which together formed a full skeleton including joints and their connections. Body parts and skeletal keypoints were localized by calculating confidence maps using a multilevel convolutional neural network that integrated both spatial and temporal features. We calculated hip, knee, and ankle angles in the sagittal plane and compared them with the angles measured by a VICON system. We calculated the correlation between both methods' angular progressions, the mean squared error (MSE), the mean absolute error (MAE), and the maximum and minimum angular error, and ran a statistical parametric mapping (SPM) analysis. Pearson correlation coefficients (r) for hip, knee, and ankle angular progressions in the sagittal plane during the entire movement were 0.96, 0.99, and 0.87, respectively. The SPM group analysis revealed significant differences only for the ankle angular progression. MSE was below 5.7°, MAE was below 4.5°, and the error for maximum amplitudes was below 3.2°. The smartphone AI motion capture system with the trained multistage computer vision pipeline was able to detect hip and knee angles in particular with high precision in the sagittal plane during CMJs from a frontal view only.
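To illustrate the kind of computation the abstract describes, the sketch below shows a hypothetical Python/NumPy implementation (not the authors' code): a sagittal-plane joint angle derived from three 2D keypoints, and the agreement metrics named in the abstract (Pearson r, MSE, MAE, peak-amplitude error) between an AI-derived angle trace and a reference trace. Function names and coordinate conventions are assumptions for illustration only.

```python
import numpy as np

def joint_angle_deg(a, b, c):
    """Angle (degrees) at joint b formed by 2D keypoints a-b-c,
    e.g. hip-knee-ankle for the knee angle."""
    v1 = np.asarray(a, float) - np.asarray(b, float)
    v2 = np.asarray(c, float) - np.asarray(b, float)
    cos = np.dot(v1, v2) / (np.linalg.norm(v1) * np.linalg.norm(v2))
    # Clip guards against floating-point values slightly outside [-1, 1].
    return np.degrees(np.arccos(np.clip(cos, -1.0, 1.0)))

def agreement_metrics(ai_trace, ref_trace):
    """Pearson r, MSE, MAE, and peak-amplitude error between two
    time-normalized angle traces (AI system vs. reference system)."""
    ai = np.asarray(ai_trace, float)
    ref = np.asarray(ref_trace, float)
    r = np.corrcoef(ai, ref)[0, 1]
    mse = np.mean((ai - ref) ** 2)
    mae = np.mean(np.abs(ai - ref))
    peak_err = abs(ai.max() - ref.max())
    return r, mse, mae, peak_err

# Example: a fully extended leg (collinear hip, knee, ankle) gives 180 deg.
print(joint_angle_deg((0, 0), (0, 1), (0, 2)))  # → 180.0
```

In a validation study such as this one, both traces would first be time-normalized to the same number of samples per jump before the metrics are computed.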
Abstract in another language
Subject area (DDC)
Keywords
Conference
Review
Cite
ISO 690
BARZYK, Philipp, Philip ZIMMERMANN, Manuel STEIN, Daniel A. KEIM, Markus GRUBER, 2024. AI-smartphone markerless motion capturing of hip, knee, and ankle joint kinematics during countermovement jumps. In: European Journal of Sport Science. Wiley. ISSN 1746-1391. eISSN 1536-7290. Available at: doi: 10.1002/ejsc.12186
BibTeX
@article{Barzyk2024-08-28AIsma-70762, year={2024}, doi={10.1002/ejsc.12186}, title={AI-smartphone markerless motion capturing of hip, knee, and ankle joint kinematics during countermovement jumps}, issn={1746-1391}, journal={European Journal of Sport Science}, author={Barzyk, Philipp and Zimmermann, Philip and Stein, Manuel and Keim, Daniel A. and Gruber, Markus} }
RDF
<rdf:RDF xmlns:dcterms="http://purl.org/dc/terms/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:bibo="http://purl.org/ontology/bibo/" xmlns:dspace="http://digital-repositories.org/ontologies/dspace/0.1.0#" xmlns:foaf="http://xmlns.com/foaf/0.1/" xmlns:void="http://rdfs.org/ns/void#" xmlns:xsd="http://www.w3.org/2001/XMLSchema#" > <rdf:Description rdf:about="https://kops.uni-konstanz.de/server/rdf/resource/123456789/70762"> <dc:contributor>Gruber, Markus</dc:contributor> <dcterms:isPartOf rdf:resource="https://kops.uni-konstanz.de/server/rdf/resource/123456789/36"/> <dc:contributor>Keim, Daniel A.</dc:contributor> <dcterms:issued>2024-08-28</dcterms:issued> <dc:contributor>Barzyk, Philipp</dc:contributor> <void:sparqlEndpoint rdf:resource="http://localhost/fuseki/dspace/sparql"/> <dc:creator>Barzyk, Philipp</dc:creator> <dc:creator>Keim, Daniel A.</dc:creator> <dc:contributor>Stein, Manuel</dc:contributor> <dc:creator>Zimmermann, Philip</dc:creator> <dc:creator>Stein, Manuel</dc:creator> <dc:creator>Gruber, Markus</dc:creator> <dspace:isPartOfCollection rdf:resource="https://kops.uni-konstanz.de/server/rdf/resource/123456789/35"/> <dspace:isPartOfCollection rdf:resource="https://kops.uni-konstanz.de/server/rdf/resource/123456789/36"/> <dcterms:title>AI-smartphone markerless motion capturing of hip, knee, and ankle joint kinematics during countermovement jumps</dcterms:title> <foaf:homepage rdf:resource="http://localhost:8080/"/> <dc:language>eng</dc:language> <dc:contributor>Zimmermann, Philip</dc:contributor> <dcterms:abstract>Recently, AI-driven skeleton reconstruction tools that use multistage computer vision pipelines were designed to estimate 3D kinematics from 2D video sequences. In the present study, we validated a novel markerless, smartphone video-based artificial intelligence (AI) motion capture system for hip, knee, and ankle angles during countermovement jumps (CMJs). 
Eleven participants performed six CMJs. We used 2D videos recorded with a smartphone (Apple iPhone X, 4K, 60 fps) to extract 24 different keypoints, which together formed a full skeleton including joints and their connections. Body parts and skeletal keypoints were localized by calculating confidence maps using a multilevel convolutional neural network that integrated both spatial and temporal features. We calculated hip, knee, and ankle angles in the sagittal plane and compared them with the angles measured by a VICON system. We calculated the correlation between both methods' angular progressions, the mean squared error (MSE), the mean absolute error (MAE), and the maximum and minimum angular error, and ran a statistical parametric mapping (SPM) analysis. Pearson correlation coefficients (r) for hip, knee, and ankle angular progressions in the sagittal plane during the entire movement were 0.96, 0.99, and 0.87, respectively. The SPM group analysis revealed significant differences only for the ankle angular progression. MSE was below 5.7°, MAE was below 4.5°, and the error for maximum amplitudes was below 3.2°. The smartphone AI motion capture system with the trained multistage computer vision pipeline was able to detect hip and knee angles in particular with high precision in the sagittal plane during CMJs from a frontal view only.</dcterms:abstract> <bibo:uri rdf:resource="https://kops.uni-konstanz.de/handle/123456789/70762"/> <dcterms:rights rdf:resource="http://creativecommons.org/licenses/by/4.0/"/> <dc:date rdf:datatype="http://www.w3.org/2001/XMLSchema#dateTime">2024-09-11T06:22:07Z</dc:date> <dcterms:isPartOf rdf:resource="https://kops.uni-konstanz.de/server/rdf/resource/123456789/35"/> <dcterms:available rdf:datatype="http://www.w3.org/2001/XMLSchema#dateTime">2024-09-11T06:22:07Z</dcterms:available> <dc:rights>Attribution 4.0 International</dc:rights> </rdf:Description> </rdf:RDF>