Publication:

Exploring Trajectory Data in Augmented Reality : A Comparative Study of Interaction Modalities

Files

There are no files associated with this document.

Date

2023

Publication type
Conference proceedings contribution
Publication status
Published

Published in

2023 IEEE International Symposium on Mixed and Augmented Reality (ISMAR). Piscataway, NJ: IEEE, 2023, pp. 790-799. ISBN 979-8-3503-2838-7. Available under: doi: 10.1109/ismar59233.2023.00094

Abstract

The visual exploration of trajectory data is crucial in domains such as animal behavior, molecular dynamics, and transportation. With the emergence of immersive technology, trajectory data, which is often inherently three-dimensional, can be analyzed in stereoscopic 3D, providing new opportunities for perception, engagement, and understanding. However, the interaction with the presented data remains a key challenge. While most applications depend on hand tracking, we see eye tracking as a promising yet under-explored interaction modality, while challenges such as imprecision or inadvertently triggered actions need to be addressed. In this work, we explore the potential of eye gaze interaction for the visual exploration of trajectory data within an AR environment. We integrate hand- and eye-based interaction techniques specifically designed for three common use cases and address known eye tracking challenges. We refine our techniques and setup based on a pilot user study (n=6) and find in a follow-up study (n=20) that gaze interaction can compete with hand-tracked interaction regarding effectiveness, efficiency, and task load for selection and cluster exploration tasks. However, time step analysis comes with higher answer times and task load. In general, we find the results and preferences to be user-dependent. Our work contributes to the field of immersive data exploration, underscoring the need for continued research on eye tracking interaction.
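
The abstract names imprecision and inadvertently triggered actions as known eye-tracking challenges; the latter is often called the "Midas touch" problem and is commonly mitigated by requiring a minimum dwell time before a gaze hit counts as a selection. The short Python sketch below only illustrates that general idea; the class, the 0.6 s threshold, and the gaze-sample format are hypothetical and are not taken from the paper.

from dataclasses import dataclass
from typing import Optional

@dataclass
class GazeSample:
    """One gaze hit-test result (hypothetical format)."""
    target_id: Optional[str]  # object currently hit by the gaze ray, or None
    timestamp: float          # seconds

class DwellSelector:
    """Fire a selection only after gaze rests on the same target for `dwell_time`
    seconds, so brief glances do not trigger actions ('Midas touch' mitigation)."""

    def __init__(self, dwell_time: float = 0.6):
        self.dwell_time = dwell_time
        self._current: Optional[str] = None
        self._since: float = 0.0

    def update(self, sample: GazeSample) -> Optional[str]:
        """Feed one gaze sample; return the target id once the dwell threshold is reached."""
        if sample.target_id != self._current:
            # Gaze moved to a new target (or to empty space): restart the dwell timer.
            self._current = sample.target_id
            self._since = sample.timestamp
            return None
        if self._current is not None and sample.timestamp - self._since >= self.dwell_time:
            selected, self._current = self._current, None  # reset so it fires only once
            return selected
        return None

# Usage: three consecutive samples on the same trajectory complete one selection.
selector = DwellSelector(dwell_time=0.6)
for t in (0.0, 0.3, 0.7):
    hit = selector.update(GazeSample(target_id="trajectory_42", timestamp=t))
    if hit:
        print("selected:", hit)  # fires at t = 0.7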

Subject area (DDC)
004 Computer science

Keywords

Augmented reality, trajectories, gaze interaction, eye tracking, user evaluation, data analysis, immersive analytics, Human-centered computing, Human computer interaction (HCI), Interaction paradigms, Mixed/augmented reality

Conference

2023 IEEE International Symposium on Mixed and Augmented Reality (ISMAR), Oct. 16-20, 2023, Sydney, Australia


Cite

ISO 690
JOOS, Lucas, Karsten KLEIN, Maximilian T. FISCHER, Frederik L. DENNIG, Daniel A. KEIM, Michael KRONE, 2023. Exploring Trajectory Data in Augmented Reality : A Comparative Study of Interaction Modalities. 2023 IEEE International Symposium on Mixed and Augmented Reality (ISMAR). Sydney, Australia, Oct. 16-20, 2023. In: 2023 IEEE International Symposium on Mixed and Augmented Reality (ISMAR). Piscataway, NJ: IEEE, 2023, pp. 790-799. ISBN 979-8-3503-2838-7. Available under: doi: 10.1109/ismar59233.2023.00094
BibTeX
@inproceedings{Joos2023-10-16Explo-68930,
  year={2023},
  doi={10.1109/ismar59233.2023.00094},
  title={Exploring Trajectory Data in Augmented Reality : A Comparative Study of Interaction Modalities},
  isbn={979-8-3503-2838-7},
  publisher={IEEE},
  address={Piscataway, NJ},
  booktitle={2023 IEEE International Symposium on Mixed and Augmented Reality (ISMAR)},
  pages={790--799},
  author={Joos, Lucas and Klein, Karsten and Fischer, Maximilian T. and Dennig, Frederik L. and Keim, Daniel A. and Krone, Michael}
}
RDF
<rdf:RDF
    xmlns:dcterms="http://purl.org/dc/terms/"
    xmlns:dc="http://purl.org/dc/elements/1.1/"
    xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#"
    xmlns:bibo="http://purl.org/ontology/bibo/"
    xmlns:dspace="http://digital-repositories.org/ontologies/dspace/0.1.0#"
    xmlns:foaf="http://xmlns.com/foaf/0.1/"
    xmlns:void="http://rdfs.org/ns/void#"
    xmlns:xsd="http://www.w3.org/2001/XMLSchema#" > 
  <rdf:Description rdf:about="https://kops.uni-konstanz.de/server/rdf/resource/123456789/68930">
    <dc:creator>Krone, Michael</dc:creator>
    <dspace:isPartOfCollection rdf:resource="https://kops.uni-konstanz.de/server/rdf/resource/123456789/36"/>
    <dcterms:isPartOf rdf:resource="https://kops.uni-konstanz.de/server/rdf/resource/123456789/36"/>
    <dc:contributor>Dennig, Frederik L.</dc:contributor>
    <dcterms:abstract>The visual exploration of trajectory data is crucial in domains such as animal behavior, molecular dynamics, and transportation. With the emergence of immersive technology, trajectory data, which is often inherently three-dimensional, can be analyzed in stereoscopic 3D, providing new opportunities for perception, engagement, and understanding. However, the interaction with the presented data remains a key challenge. While most applications depend on hand tracking, we see eye tracking as a promising yet under-explored interaction modality, while challenges such as imprecision or inadvertently triggered actions need to be addressed. In this work, we explore the potential of eye gaze interaction for the visual exploration of trajectory data within an AR environment. We integrate hand- and eye-based interaction techniques specifically designed for three common use cases and address known eye tracking challenges. We refine our techniques and setup based on a pilot user study (n=6) and find in a follow-up study (n=20) that gaze interaction can compete with hand-tracked interaction regarding effectiveness, efficiency, and task load for selection and cluster exploration tasks. However, time step analysis comes with higher answer times and task load. In general, we find the results and preferences to be user-dependent. Our work contributes to the field of immersive data exploration, underscoring the need for continued research on eye tracking interaction.</dcterms:abstract>
    <dc:creator>Fischer, Maximilian T.</dc:creator>
    <foaf:homepage rdf:resource="http://localhost:8080/"/>
    <dcterms:available rdf:datatype="http://www.w3.org/2001/XMLSchema#dateTime">2024-01-04T13:01:56Z</dcterms:available>
    <dc:contributor>Klein, Karsten</dc:contributor>
    <dc:creator>Klein, Karsten</dc:creator>
    <dc:date rdf:datatype="http://www.w3.org/2001/XMLSchema#dateTime">2024-01-04T13:01:56Z</dc:date>
    <dc:contributor>Keim, Daniel A.</dc:contributor>
    <dc:contributor>Krone, Michael</dc:contributor>
    <void:sparqlEndpoint rdf:resource="http://localhost/fuseki/dspace/sparql"/>
    <dc:contributor>Fischer, Maximilian T.</dc:contributor>
    <dc:creator>Dennig, Frederik L.</dc:creator>
    <dc:language>eng</dc:language>
    <dcterms:title>Exploring Trajectory Data in Augmented Reality : A Comparative Study of Interaction Modalities</dcterms:title>
    <dc:creator>Joos, Lucas</dc:creator>
    <dc:creator>Keim, Daniel A.</dc:creator>
    <dcterms:issued>2023-10-16</dcterms:issued>
    <bibo:uri rdf:resource="https://kops.uni-konstanz.de/handle/123456789/68930"/>
    <dc:contributor>Joos, Lucas</dc:contributor>
  </rdf:Description>
</rdf:RDF>
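
The BibTeX and RDF records above can usually also be retrieved straight from the DOI via HTTP content negotiation, a documented doi.org feature; whether a record answers in a given format depends on its registration agency (typically Crossref for IEEE's 10.1109 prefix). The Python sketch below is only an illustration of that general mechanism and is not part of the repository record.

import urllib.request

# Ask the DOI resolver for BibTeX instead of the landing page.
# Other commonly supported formats: application/rdf+xml, text/x-bibliography.
DOI = "10.1109/ismar59233.2023.00094"

req = urllib.request.Request(
    f"https://doi.org/{DOI}",
    headers={"Accept": "application/x-bibtex"},
)
with urllib.request.urlopen(req, timeout=10) as resp:
    print(resp.read().decode("utf-8"))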

University bibliography
Yes