Eye movements track prioritized auditory features in selective attention to natural speech
Abstract
Over the last decades, cognitive neuroscience has identified a distributed set of brain regions that are critical for attention. Strong anatomical overlap with brain regions critical for oculomotor processes suggests a joint network for attention and eye movements. However, the role of this shared network in complex, naturalistic environments remains understudied. Here, we investigated eye movements in relation to (un)attended sentences of natural speech. Combining simultaneously recorded eye tracking and magnetoencephalographic data with temporal response functions, we show that gaze tracks attended speech, a phenomenon we termed ocular speech tracking. Ocular speech tracking even differentiates a target from a distractor in a multi-speaker context and is further related to intelligibility. Moreover, we provide evidence for its contribution to neural differences in speech processing, emphasizing the necessity to consider oculomotor activity in future research and in the interpretation of neural differences in auditory cognition.
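The method named in the abstract, temporal response functions (TRFs), maps a stimulus feature (such as the speech envelope) onto a recorded signal (here, gaze or MEG) via time-lagged regression. The following is a minimal illustrative sketch of that idea using ridge-regularized least squares in NumPy — not the authors' actual pipeline; all function names, parameters, and the toy data are hypothetical:

```python
import numpy as np

def lagged_design(stimulus, lags):
    """Build a time-lagged design matrix from a 1-D stimulus feature
    (e.g., a speech envelope): one column per lag, in samples."""
    n = len(stimulus)
    X = np.zeros((n, len(lags)))
    for j, lag in enumerate(lags):
        if lag >= 0:
            X[lag:, j] = stimulus[:n - lag]
        else:
            X[:lag, j] = stimulus[-lag:]
    return X

def fit_trf(stimulus, response, lags, alpha=1.0):
    """Ridge-regularized TRF: weights mapping the lagged stimulus
    to the recorded response (e.g., horizontal gaze position)."""
    X = lagged_design(stimulus, lags)
    XtX = X.T @ X + alpha * np.eye(X.shape[1])
    return np.linalg.solve(XtX, X.T @ response)

# Toy check: the "response" is the envelope delayed by 5 samples plus noise,
# so the recovered TRF should peak at lag 5.
rng = np.random.default_rng(0)
env = rng.standard_normal(2000)
resp = np.roll(env, 5) + 0.1 * rng.standard_normal(2000)
lags = np.arange(0, 20)
w = fit_trf(env, resp, lags, alpha=1e-2)
print(int(np.argmax(np.abs(w))))  # → 5 (the imposed delay)
```

In practice, TRF analyses of continuous speech typically cross-validate the regularization strength and evaluate prediction accuracy on held-out data; this sketch only shows the core lagged-regression step.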
Cite
ISO 690
GEHMACHER, Quirin, Juliane SCHUBERT, Fabian SCHMIDT, Thomas HARTMANN, Patrick REISINGER, Sebastian RÖSCH, Konrad SCHWARZ, Tzvetan G. POPOV, Maria CHAIT, Nathan WEISZ, 2024. Eye movements track prioritized auditory features in selective attention to natural speech. In: Nature Communications. Springer. 2024, 15(1), 3692. eISSN 2041-1723. Available at: doi: 10.1038/s41467-024-48126-2
BibTeX
@article{Gehmacher2024-05-01movem-71248,
  title   = {Eye movements track prioritized auditory features in selective attention to natural speech},
  author  = {Gehmacher, Quirin and Schubert, Juliane and Schmidt, Fabian and Hartmann, Thomas and Reisinger, Patrick and Rösch, Sebastian and Schwarz, Konrad and Popov, Tzvetan G. and Chait, Maria and Weisz, Nathan},
  journal = {Nature Communications},
  year    = {2024},
  volume  = {15},
  number  = {1},
  doi     = {10.1038/s41467-024-48126-2},
  note    = {Article Number: 3692}
}