Publication: Estimation of the visual contribution to standing balance using virtual reality
Files: Asslaender_2-1no104hovoesb7.pdf (https://kops.uni-konstanz.de/bitstream/123456789/66491/1/Asslaender_2-1no104hovoesb7.pdf)
Date: 2023-02-14
Authors: Assländer, Lorenz; Albrecht, Matthias; Diehl, Moritz; Missen, Kyle J.; Carpenter, Mark G.; Streuber, Stephan
Editors
Journal ISSN
Electronic ISSN: 2045-2322
ISBN
Bibliographic data
Publisher: Springer
Series
Edition
URI (citable link): https://kops.uni-konstanz.de/handle/123456789/66491
DOI (citable link): 10.1038/s41598-023-29713-7
International patent number
Link to license: http://creativecommons.org/licenses/by/4.0/ (Attribution 4.0 International)
Research funding information
Project
Open Access publication
Core Facility of the University of Konstanz
Title in another language
Publication type: Journal article
Publication status: Published
Published in: Scientific Reports. Springer. 2023, 13(1), 2594. eISSN 2045-2322
Abstract
Sensory perturbations are a valuable tool to assess sensory integration mechanisms underlying balance. Implemented as systems-identification approaches, they can be used to quantitatively assess balance deficits and separate underlying causes. However, the experiments require controlled perturbations and sophisticated modeling and optimization techniques. Here we propose and validate a virtual reality implementation of moving visual scene experiments together with model-based interpretations of the results. The approach simplifies the experimental implementation and offers a platform to implement standardized analysis routines. Sway of 14 healthy young subjects wearing a virtual reality head-mounted display was measured. Subjects viewed a virtual room or a screen inside the room, which were both moved during a series of sinusoidal or pseudo-random room or screen tilt sequences recorded on two days. In a between-subject comparison of 10 × 6 min long pseudo-random sequences, each applied at 5 amplitudes, our results showed no difference to a real-world moving screen experiment from the literature. We used the independent-channel model to interpret our data, which provides a direct estimate of the visual contribution to balance, together with parameters characterizing the dynamics of the feedback system. Reliability estimates of single subject parameters from six repetitions of a 6 × 20-s pseudo-random sequence showed poor test–retest agreement. Estimated parameters show excellent reliability when averaging across three repetitions within each day and comparing across days (Intra-class correlation; ICC 0.7–0.9 for visual weight, time delay and feedback gain). Sway responses strongly depended on the visual scene, where the high-contrast, abstract screen evoked larger sway as compared to the photo-realistic room. In conclusion, our proposed virtual reality approach allows researchers to reliably assess balance control dynamics including the visual contribution to balance with minimal implementation effort.
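
The abstract describes a system-identification workflow: body sway is recorded while a pseudo-random visual tilt stimulus is applied, a frequency response function (FRF) is estimated from the repeated stimulus cycles, and a balance model (here the independent-channel model) is fitted to the FRF to obtain the visual weight, time delay and feedback gain. The sketch below illustrates the FRF estimation step only; it is not the authors' code, and the sampling rate, cycle length, signal names and the synthetic "sway" response are assumptions made for the example.

import numpy as np

def frf_from_cycles(stimulus, sway, fs, cycle_len_s):
    """Estimate the frequency response function (FRF) between a pseudo-random
    stimulus and sway by averaging cross- and power spectra over repeated cycles.

    stimulus, sway : 1-D arrays of equal length (e.g. scene tilt and body sway, deg)
    fs             : sampling rate in Hz (assumed)
    cycle_len_s    : duration of one pseudo-random cycle in seconds (assumed)
    """
    n_cycle = int(round(cycle_len_s * fs))
    n_cycles = len(stimulus) // n_cycle

    Sxy, Sxx = 0.0, 0.0
    for c in range(n_cycles):
        seg = slice(c * n_cycle, (c + 1) * n_cycle)
        X = np.fft.rfft(stimulus[seg])
        Y = np.fft.rfft(sway[seg])
        Sxy = Sxy + np.conj(X) * Y   # cross-spectrum, summed over cycles
        Sxx = Sxx + np.conj(X) * X   # stimulus power spectrum, summed over cycles

    # Keep only frequency bins where the pseudo-random stimulus has power,
    # then form the complex FRF (gain and phase) at those frequencies.
    power = np.abs(Sxx)
    excited = power > 0.01 * power.max()
    freqs = np.fft.rfftfreq(n_cycle, d=1.0 / fs)[excited]
    frf = Sxy[excited] / Sxx[excited]
    return freqs, frf


# Synthetic usage example: six 20-s cycles; the "sway" simply scales and
# delays the stimulus plus noise (gain 0.3, delay 150 ms are arbitrary).
if __name__ == "__main__":
    fs, cycle_s, n_rep = 100.0, 20.0, 6
    rng = np.random.default_rng(0)
    one_cycle = rng.choice([-1.0, 0.0, 1.0], size=int(cycle_s * fs))
    stimulus = np.tile(one_cycle, n_rep)
    sway = 0.3 * np.roll(stimulus, int(0.15 * fs)) + 0.05 * rng.standard_normal(stimulus.size)

    freqs, frf = frf_from_cycles(stimulus, sway, fs, cycle_s)
    print("gain at lowest excited frequencies:", np.round(np.abs(frf[:5]), 2))
    print("phase (deg):", np.round(np.degrees(np.angle(frf[:5])), 1))

Averaging the spectra over cycles suppresses sway components that are not driven by the stimulus; fitting the independent-channel model to the resulting FRF is the subsequent step (not shown here) that yields the parameters whose test-retest reliability (ICC) is reported in the abstract.
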
Abstract in another language
Subject (DDC)
Keywords
Conference
Review
Cite
ISO 690
ASSLÄNDER, Lorenz, Matthias ALBRECHT, Moritz DIEHL, Kyle J. MISSEN, Mark G. CARPENTER, Stephan STREUBER, 2023. Estimation of the visual contribution to standing balance using virtual reality. In: Scientific Reports. Springer. 2023, 13(1), 2594. eISSN 2045-2322. Available under: doi: 10.1038/s41598-023-29713-7
BibTeX
@article{Asslander2023-02-14Estim-66491,
  title   = {Estimation of the visual contribution to standing balance using virtual reality},
  author  = {Assländer, Lorenz and Albrecht, Matthias and Diehl, Moritz and Missen, Kyle J. and Carpenter, Mark G. and Streuber, Stephan},
  journal = {Scientific Reports},
  year    = {2023},
  volume  = {13},
  number  = {1},
  doi     = {10.1038/s41598-023-29713-7},
  note    = {Article Number: 2594}
}