Publication:

MoPeDT : A Modular Head-Mounted Display Toolkit to Conduct Peripheral Vision Research

Files

Albrecht_2-1gjnldc1cak748.pdf (size: 23.86 MB, downloads: 73)

Date

2023

Open Access publication
Open Access Green

Publication type
Contribution to conference proceedings
Publication status
Published

Published in

2023 IEEE Conference Virtual Reality and 3D User Interfaces (VR). Piscataway, NJ: IEEE, 2023, pp. 691-701. ISBN 979-8-3503-4815-6. Available under: doi: 10.1109/VR55154.2023.00084

Abstract

Peripheral vision plays a significant role in human perception and orientation. However, its relevance for human-computer interaction, especially head-mounted displays, has not been fully explored yet. In the past, a few specialized appliances were developed to display visual cues in the periphery, each designed for a single specific use case only. A multi-purpose headset to exclusively augment peripheral vision did not exist yet. We introduce MoPeDT: Modular Peripheral Display Toolkit, a freely available, flexible, reconfigurable, and extendable headset to conduct peripheral vision research. MoPeDT can be built with a 3D printer and off-the-shelf components. It features multiple spatially configurable near-eye display modules and full 3D tracking inside and outside the lab. With our system, researchers and designers may easily develop and prototype novel peripheral vision interaction and visualization techniques. We demonstrate the versatility of our headset with several possible applications for spatial awareness, balance, interaction, feedback, and notifications. We conducted a small study to evaluate the usability of the system. We found that participants were largely not irritated by the peripheral cues, but the headset's comfort could be further improved. We also evaluated our system based on established heuristics for human-computer interaction toolkits to show how MoPeDT adapts to changing requirements, lowers the entry barrier for peripheral vision research, and facilitates expressive power in the combination of modular building blocks.
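To illustrate the kind of modular configuration the abstract describes, the following is a minimal sketch, not MoPeDT's actual API: it assumes a hypothetical Python data model in which near-eye display modules are placed at angular eccentricities around the line of sight and a tracked event is routed to the nearest module as a peripheral cue. All names (PeripheralModule, HeadsetConfig, route_cue) are invented for illustration only.

# Hypothetical sketch of a modular peripheral-display configuration.
# None of these names come from MoPeDT; they only illustrate the idea of
# spatially configurable near-eye display modules described in the abstract.
from dataclasses import dataclass
from typing import List


@dataclass
class PeripheralModule:
    """One near-eye display module, placed by its angular position."""
    name: str
    azimuth_deg: float      # horizontal eccentricity from the line of sight
    elevation_deg: float    # vertical eccentricity from the line of sight


@dataclass
class HeadsetConfig:
    """A set of display modules mounted on the headset frame."""
    modules: List[PeripheralModule]

    def route_cue(self, azimuth_deg: float, elevation_deg: float) -> PeripheralModule:
        """Pick the module closest to the direction of an event of interest."""
        return min(
            self.modules,
            key=lambda m: (m.azimuth_deg - azimuth_deg) ** 2
            + (m.elevation_deg - elevation_deg) ** 2,
        )


if __name__ == "__main__":
    # Example layout: two far-peripheral (temporal) modules and one below the line of sight.
    config = HeadsetConfig(
        modules=[
            PeripheralModule("left_temporal", azimuth_deg=-70.0, elevation_deg=0.0),
            PeripheralModule("right_temporal", azimuth_deg=70.0, elevation_deg=0.0),
            PeripheralModule("lower_field", azimuth_deg=0.0, elevation_deg=-45.0),
        ]
    )
    # A tracked object approaching from the right would light up the
    # right temporal module as a spatial-awareness cue.
    target = config.route_cue(azimuth_deg=85.0, elevation_deg=5.0)
    print(f"Show cue on module: {target.name}")

Such a layout-plus-routing split mirrors the building-block idea in the abstract: module placement can change without touching the application logic that generates cues.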

Subject (DDC)
004 Computer science

Keywords

augmented reality, head-mounted display, peripheral vision, toolkit, prototyping, open-source

Conference

2023 IEEE Conference Virtual Reality and 3D User Interfaces (VR), 25 March 2023 - 29 March 2023, Shanghai, China

Cite

ISO 690
ALBRECHT, Matthias, Lorenz ASSLÄNDER, Harald REITERER, Stephan STREUBER, 2023. MoPeDT : A Modular Head-Mounted Display Toolkit to Conduct Peripheral Vision Research. 2023 IEEE Conference Virtual Reality and 3D User Interfaces (VR). Shanghai, China, 25 March 2023 - 29 March 2023. In: 2023 IEEE Conference Virtual Reality and 3D User Interfaces (VR). Piscataway, NJ: IEEE, 2023, pp. 691-701. ISBN 979-8-3503-4815-6. Available under: doi: 10.1109/VR55154.2023.00084
BibTex
@inproceedings{Albrecht2023MoPeD-66117,
  year={2023},
  doi={10.1109/VR55154.2023.00084},
  title={MoPeDT : A Modular Head-Mounted Display Toolkit to Conduct Peripheral Vision Research},
  isbn={979-8-3503-4815-6},
  publisher={IEEE},
  address={Piscataway, NJ},
  booktitle={2023 IEEE Conference Virtual Reality and 3D User Interfaces (VR)},
  pages={691--701},
  author={Albrecht, Matthias and Assländer, Lorenz and Reiterer, Harald and Streuber, Stephan}
}
RDF
<rdf:RDF
    xmlns:dcterms="http://purl.org/dc/terms/"
    xmlns:dc="http://purl.org/dc/elements/1.1/"
    xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#"
    xmlns:bibo="http://purl.org/ontology/bibo/"
    xmlns:dspace="http://digital-repositories.org/ontologies/dspace/0.1.0#"
    xmlns:foaf="http://xmlns.com/foaf/0.1/"
    xmlns:void="http://rdfs.org/ns/void#"
    xmlns:xsd="http://www.w3.org/2001/XMLSchema#" > 
  <rdf:Description rdf:about="https://kops.uni-konstanz.de/server/rdf/resource/123456789/66117">
    <dc:contributor>Streuber, Stephan</dc:contributor>
    <void:sparqlEndpoint rdf:resource="http://localhost/fuseki/dspace/sparql"/>
    <dc:language>eng</dc:language>
    <bibo:uri rdf:resource="https://kops.uni-konstanz.de/handle/123456789/66117"/>
    <dc:creator>Assländer, Lorenz</dc:creator>
    <dcterms:rights rdf:resource="https://rightsstatements.org/page/InC/1.0/"/>
    <dcterms:isPartOf rdf:resource="https://kops.uni-konstanz.de/server/rdf/resource/123456789/36"/>
    <dcterms:available rdf:datatype="http://www.w3.org/2001/XMLSchema#dateTime">2023-02-17T12:21:52Z</dcterms:available>
    <dc:contributor>Assländer, Lorenz</dc:contributor>
    <foaf:homepage rdf:resource="http://localhost:8080/"/>
    <dcterms:issued>2023</dcterms:issued>
    <dcterms:abstract xml:lang="en">Peripheral vision plays a significant role in human perception and orientation. However, its relevance for human-computer interaction, especially head-mounted displays, has not been fully explored yet. In the past, a few specialized appliances were developed to display visual cues in the periphery, each designed for a single specific use case only. A multi-purpose headset to exclusively augment peripheral vision did not exist yet. We introduce MoPeDT: Modular Peripheral Display Toolkit, a freely available, flexible, reconfigurable, and extendable headset to conduct peripheral vision research. MoPeDT can be built with a 3D printer and off-the-shelf components. It features multiple spatially configurable near-eye display modules and full 3D tracking inside and outside the lab. With our system, researchers and designers may easily develop and prototype novel peripheral vision interaction and visualization techniques. We demonstrate the versatility of our headset with several possible applications for spatial awareness, balance, interaction, feedback, and notifications. We conducted a small study to evaluate the usability of the system. We found that participants were largely not irritated by the peripheral cues, but the headset's comfort could be further improved. We also evaluated our system based on established heuristics for human-computer interaction toolkits to show how MoPeDT adapts to changing requirements, lowers the entry barrier for peripheral vision research, and facilitates expressive power in the combination of modular building blocks.</dcterms:abstract>
    <dc:contributor>Reiterer, Harald</dc:contributor>
    <dspace:isPartOfCollection rdf:resource="https://kops.uni-konstanz.de/server/rdf/resource/123456789/36"/>
    <dc:creator>Albrecht, Matthias</dc:creator>
    <dspace:hasBitstream rdf:resource="https://kops.uni-konstanz.de/bitstream/123456789/66117/1/Albrecht_2-1gjnldc1cak748.pdf"/>
    <dc:rights>terms-of-use</dc:rights>
    <dc:date rdf:datatype="http://www.w3.org/2001/XMLSchema#dateTime">2023-02-17T12:21:52Z</dc:date>
    <dc:contributor>Albrecht, Matthias</dc:contributor>
    <dc:creator>Reiterer, Harald</dc:creator>
    <dcterms:hasPart rdf:resource="https://kops.uni-konstanz.de/bitstream/123456789/66117/1/Albrecht_2-1gjnldc1cak748.pdf"/>
    <dcterms:title>MoPeDT : A Modular Head-Mounted Display Toolkit to Conduct Peripheral Vision Research</dcterms:title>
    <dc:creator>Streuber, Stephan</dc:creator>
  </rdf:Description>
</rdf:RDF>


University bibliography
Yes
Description of the research data
CAD files, BOM, and source code