Visual capture of gait during redirected walking
Abstract
Redirected walking allows users of virtual reality applications to explore virtual environments larger than the available physical space. This is achieved by manipulating users' walking trajectories through visual rotation of the virtual surroundings, without users noticing this manipulation. Apart from its applied relevance, redirected walking is an attractive paradigm to investigate human perception and locomotion. An important yet unsolved question concerns individual differences in the ability to detect redirection. Addressing this question, we administered several perceptual-cognitive tasks to healthy participants, whose thresholds of detecting redirection in a virtual environment were also determined. We report relations between individual thresholds and measures of multisensory weighting (visually assisted postural stability (Romberg quotient), subjective visual vertical (rod-and-frame test) and illusory self-motion (vection)). Performance in the rod-and-frame test, a classical measure of visual dependency in postural control, showed the strongest relation to redirection detection thresholds: the higher the visual dependency, the higher the detection threshold. This supports the interpretation of users' neglect of redirection manipulations as a "visual capture of gait". We discuss how future interdisciplinary studies, merging the fields of virtual reality and psychology, may help improve virtual reality applications and simultaneously deepen our understanding of how humans process multisensory conflicts during locomotion.
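To make the mechanism described in the abstract concrete, the following minimal Python sketch illustrates two quantities it refers to: a per-step rotation (curvature) gain of the kind used to bend walking trajectories during redirected walking, and the Romberg quotient used as a measure of visually assisted postural stability. The function names, gain value, and step length are illustrative assumptions, not the authors' implementation or data.

def redirection_yaw_deg(step_length_m, curvature_gain_deg_per_m):
    """Extra rotation (degrees) applied to the virtual scene for one step.

    Rotating the virtual surroundings by a small amount per step makes the
    user unknowingly curve their real-world path while walking "straight"
    in the virtual environment.
    """
    return step_length_m * curvature_gain_deg_per_m

def romberg_quotient(sway_eyes_closed, sway_eyes_open):
    """Ratio of postural sway with eyes closed to sway with eyes open.

    Values well above 1 indicate strong reliance on vision for postural
    stability, i.e. high visual dependency.
    """
    return sway_eyes_closed / sway_eyes_open

# Illustrative values (assumptions, not results from the study):
print(redirection_yaw_deg(step_length_m=0.7, curvature_gain_deg_per_m=2.0))  # 1.4 deg injected this step
print(romberg_quotient(sway_eyes_closed=12.0, sway_eyes_open=8.0))           # 1.5

Whether such a per-step rotation goes unnoticed is exactly what the individually determined detection thresholds in the study quantify.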
Cite
ISO 690
ROTHACHER, Yannick, Anh NGUYEN, Bigna LENGGENHAGER, Andreas KUNZ, Peter BRUGGER, 2018. Visual capture of gait during redirected walking. In: Scientific Reports. Springer Nature. 2018, 8, 17974. eISSN 2045-2322. Available under: doi: 10.1038/s41598-018-36035-6
BibTeX
@article{Rothacher2018Visua-57173,
  author  = {Rothacher, Yannick and Nguyen, Anh and Lenggenhager, Bigna and Kunz, Andreas and Brugger, Peter},
  title   = {Visual capture of gait during redirected walking},
  journal = {Scientific Reports},
  year    = {2018},
  volume  = {8},
  doi     = {10.1038/s41598-018-36035-6},
  note    = {Article Number: 17974}
}