Publication: Exploring Trajectory Data in Augmented Reality : A Comparative Study of Interaction Modalities
Abstract
The visual exploration of trajectory data is crucial in domains such as animal behavior, molecular dynamics, and transportation. With the emergence of immersive technology, trajectory data, which is often inherently three-dimensional, can be analyzed in stereoscopic 3D, providing new opportunities for perception, engagement, and understanding. However, the interaction with the presented data remains a key challenge. While most applications depend on hand tracking, we see eye tracking as a promising yet under-explored interaction modality, although challenges such as imprecision and inadvertently triggered actions need to be addressed. In this work, we explore the potential of eye gaze interaction for the visual exploration of trajectory data within an AR environment. We integrate hand- and eye-based interaction techniques specifically designed for three common use cases and address known eye tracking challenges. We refine our techniques and setup based on a pilot user study (n=6) and find in a follow-up study (n=20) that gaze interaction can compete with hand-tracked interaction regarding effectiveness, efficiency, and task load for selection and cluster exploration tasks. However, time step analysis incurs higher answer times and task load. In general, we find the results and preferences to be user-dependent. Our work contributes to the field of immersive data exploration, underscoring the need for continued research on eye tracking interaction.
Cite
ISO 690
JOOS, Lucas, Karsten KLEIN, Maximilian T. FISCHER, Frederik L. DENNIG, Daniel A. KEIM, Michael KRONE, 2023. Exploring Trajectory Data in Augmented Reality : A Comparative Study of Interaction Modalities. 2023 IEEE International Symposium on Mixed and Augmented Reality (ISMAR). Sydney, Australia, Oct. 16, 2023 - Oct. 20, 2023. In: 2023 IEEE International Symposium on Mixed and Augmented Reality (ISMAR). Piscataway, NJ: IEEE, 2023, pp. 790-799. ISBN 979-8-3503-2838-7. Available at: doi: 10.1109/ismar59233.2023.00094
BibTeX
@inproceedings{Joos2023-10-16Explo-68930,
title={Exploring Trajectory Data in Augmented Reality : A Comparative Study of Interaction Modalities},
year={2023},
doi={10.1109/ismar59233.2023.00094},
isbn={979-8-3503-2838-7},
address={Piscataway, NJ},
publisher={IEEE},
booktitle={2023 IEEE International Symposium on Mixed and Augmented Reality (ISMAR)},
pages={790--799},
author={Joos, Lucas and Klein, Karsten and Fischer, Maximilian T. and Dennig, Frederik L. and Keim, Daniel A. and Krone, Michael}
}
RDF
<rdf:RDF
xmlns:dcterms="http://purl.org/dc/terms/"
xmlns:dc="http://purl.org/dc/elements/1.1/"
xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#"
xmlns:bibo="http://purl.org/ontology/bibo/"
xmlns:dspace="http://digital-repositories.org/ontologies/dspace/0.1.0#"
xmlns:foaf="http://xmlns.com/foaf/0.1/"
xmlns:void="http://rdfs.org/ns/void#"
xmlns:xsd="http://www.w3.org/2001/XMLSchema#">
<rdf:Description rdf:about="https://kops.uni-konstanz.de/server/rdf/resource/123456789/68930">
<dc:contributor>Joos, Lucas</dc:contributor>
<bibo:uri rdf:resource="https://kops.uni-konstanz.de/handle/123456789/68930"/>
<dspace:hasBitstream rdf:resource="https://kops.uni-konstanz.de/bitstream/123456789/68930/1/Joos_2-a4pa3isd8snc9.pdf"/>
<dcterms:hasPart rdf:resource="https://kops.uni-konstanz.de/bitstream/123456789/68930/1/Joos_2-a4pa3isd8snc9.pdf"/>
<dcterms:issued>2023-10-16</dcterms:issued>
<dc:creator>Keim, Daniel A.</dc:creator>
<dc:creator>Joos, Lucas</dc:creator>
<dc:date rdf:datatype="http://www.w3.org/2001/XMLSchema#dateTime"
>2024-01-04T13:01:56Z</dc:date>
<dcterms:title>Exploring Trajectory Data in Augmented Reality : A Comparative Study of Interaction Modalities</dcterms:title>
<dc:language>eng</dc:language>
<dc:creator>Dennig, Frederik L.</dc:creator>
<dc:contributor>Fischer, Maximilian T.</dc:contributor>
<dcterms:rights rdf:resource="https://rightsstatements.org/page/InC/1.0/"/>
<dc:contributor>Krone, Michael</dc:contributor>
<dc:contributor>Keim, Daniel A.</dc:contributor>
<dc:creator>Klein, Karsten</dc:creator>
<dc:contributor>Klein, Karsten</dc:contributor>
<dcterms:available rdf:datatype="http://www.w3.org/2001/XMLSchema#dateTime"
>2024-01-04T13:01:56Z</dcterms:available>
<dc:creator>Fischer, Maximilian T.</dc:creator>
<dcterms:abstract>The visual exploration of trajectory data is crucial in domains such as animal behavior, molecular dynamics, and transportation. With the emergence of immersive technology, trajectory data, which is often inherently three-dimensional, can be analyzed in stereoscopic 3D, providing new opportunities for perception, engagement, and understanding. However, the interaction with the presented data remains a key challenge. While most applications depend on hand tracking, we see eye tracking as a promising yet under-explored interaction modality, although challenges such as imprecision and inadvertently triggered actions need to be addressed. In this work, we explore the potential of eye gaze interaction for the visual exploration of trajectory data within an AR environment. We integrate hand- and eye-based interaction techniques specifically designed for three common use cases and address known eye tracking challenges. We refine our techniques and setup based on a pilot user study (n=6) and find in a follow-up study (n=20) that gaze interaction can compete with hand-tracked interaction regarding effectiveness, efficiency, and task load for selection and cluster exploration tasks. However, time step analysis incurs higher answer times and task load. In general, we find the results and preferences to be user-dependent. Our work contributes to the field of immersive data exploration, underscoring the need for continued research on eye tracking interaction.</dcterms:abstract>
<dc:contributor>Dennig, Frederik L.</dc:contributor>
<dcterms:isPartOf rdf:resource="https://kops.uni-konstanz.de/server/rdf/resource/123456789/36"/>
<dc:rights>terms-of-use</dc:rights>
<dspace:isPartOfCollection rdf:resource="https://kops.uni-konstanz.de/server/rdf/resource/123456789/36"/>
<dc:creator>Krone, Michael</dc:creator>
</rdf:Description>
</rdf:RDF>