Publication:

Cross-modal decoding of emotional expressions in fMRI : cross-session and cross-sample replication


Files

Wallenwein_2-dx4bzmwsolb63.pdf (3.93 MB, 14 downloads)

Date

2024

Open Access publication
Open Access Gold

Publication type
Journal article
Publication status
Published

Published in

Imaging Neuroscience. MIT Press. 2024, 2, pp. 1-15. eISSN 2837-6056. Available at: doi: 10.1162/imag_a_00289

Abstract

The theory of embodied simulation suggests a common neuronal representation for action and perception in mirror neurons (MN) that allows an automatic understanding of another person’s mental state. Multivariate pattern analysis (MVPA) of fMRI data enables a joint investigation of the MN properties of cross-modality and action specificity with high spatial sensitivity. In repeated measures and independent samples, we measured BOLD-fMRI activation during a social-cognitive paradigm, which included the imitation, execution, and observation of a facial expression of fear or anger. Using support vector machines in a region of interest and a searchlight-based within-subject approach, we classified the emotional content first within modalities and subsequently across modalities. Of main interest were regions of the MN and the emotional face processing system. A two-step permutation scheme served to evaluate the significance of classification accuracies. Additionally, we analyzed cross-session and cross-sample replicability. Classification of emotional content was significantly above chance within-modality in the execution and imitation conditions, with replication across sessions and across samples, but not in the observation condition. Cross-modal classification was possible when trained on the execution condition and tested on the imitation condition, with cross-session replication. The searchlight analysis revealed additional areas exhibiting action specificity and cross-modality, mainly in the prefrontal cortex. We demonstrate replicability of brain regions with action-specific and cross-modal representations of fear and anger for execution and imitation. Since we could not find a shared neural representation of emotions within the observation modality, our results only partially lend support to the embodied simulation theory. We conclude that activation in MN regions is less robust and less clearly distinguishable during observation than during motor tasks.
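The decoding approach described in the abstract — train a classifier on one modality, test on another, and assess significance by permuting training labels — can be sketched on synthetic data. This is an illustrative sketch only: it substitutes a nearest-centroid classifier for the paper's support vector machines, random Gaussian "voxel patterns" for fMRI data, and a single-step label permutation for the paper's two-step scheme.

```python
import random

random.seed(0)

def make_trials(n, dim, shift):
    # Synthetic "voxel patterns": fear trials centred at +shift, anger at -shift.
    trials = []
    for label in ("fear", "anger"):
        s = shift if label == "fear" else -shift
        for _ in range(n):
            trials.append(([random.gauss(s, 1.0) for _ in range(dim)], label))
    return trials

def centroid(patterns):
    dim = len(patterns[0])
    return [sum(p[i] for p in patterns) / len(patterns) for i in range(dim)]

def train(trials):
    # Nearest-centroid classifier: one centroid per emotion label.
    return {lab: centroid([p for p, l in trials if l == lab])
            for lab in ("fear", "anger")}

def accuracy(model, trials):
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    hits = sum(1 for p, l in trials
               if min(model, key=lambda lab: dist(p, model[lab])) == l)
    return hits / len(trials)

# Cross-modal decoding: train on "execution" trials, test on "imitation" trials.
execution = make_trials(20, 10, 0.8)
imitation = make_trials(20, 10, 0.8)
model = train(execution)
acc = accuracy(model, imitation)

# Permutation test: shuffle the training labels to build a null distribution
# of cross-modal accuracies, then compare the observed accuracy against it.
null = []
for _ in range(200):
    labels = [l for _, l in execution]
    random.shuffle(labels)
    shuffled = [(p, l) for (p, _), l in zip(execution, labels)]
    null.append(accuracy(train(shuffled), imitation))
p_value = sum(1 for a in null if a >= acc) / len(null)
print(f"cross-modal accuracy={acc:.2f}, p={p_value:.3f}")
```

With clearly separated synthetic classes the observed cross-modal accuracy sits far above the permutation null, which is the logic behind the above-chance classification reported for the execution-to-imitation transfer.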

Subject (DDC)
150 Psychology

Keywords

mirror neuron system, machine learning, multivariate pattern analysis, social cognition, imitation, faces

Cite

ISO 690
WALLENWEIN, Lara A., Stephanie N. L. SCHMIDT, Joachim HASS, Daniela MIER, 2024. Cross-modal decoding of emotional expressions in fMRI : cross-session and cross-sample replication. In: Imaging Neuroscience. MIT Press. 2024, 2, pp. 1-15. eISSN 2837-6056. Available at: doi: 10.1162/imag_a_00289
BibTeX
@article{Wallenwein2024-09-23Cross-70824,
  year={2024},
  doi={10.1162/imag_a_00289},
  title={Cross-modal decoding of emotional expressions in fMRI : cross-session and cross-sample replication},
  volume={2},
  journal={Imaging Neuroscience},
  pages={1--15},
  author={Wallenwein, Lara A. and Schmidt, Stephanie N. L. and Hass, Joachim and Mier, Daniela}
}
RDF
<rdf:RDF
    xmlns:dcterms="http://purl.org/dc/terms/"
    xmlns:dc="http://purl.org/dc/elements/1.1/"
    xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#"
    xmlns:bibo="http://purl.org/ontology/bibo/"
    xmlns:dspace="http://digital-repositories.org/ontologies/dspace/0.1.0#"
    xmlns:foaf="http://xmlns.com/foaf/0.1/"
    xmlns:void="http://rdfs.org/ns/void#"
    xmlns:xsd="http://www.w3.org/2001/XMLSchema#" > 
  <rdf:Description rdf:about="https://kops.uni-konstanz.de/server/rdf/resource/123456789/70824">
    <dc:language>eng</dc:language>
    <dc:contributor>Hass, Joachim</dc:contributor>
    <dcterms:hasPart rdf:resource="https://kops.uni-konstanz.de/bitstream/123456789/70824/1/Wallenwein_2-dx4bzmwsolb63.pdf"/>
    <dc:rights>Attribution 4.0 International</dc:rights>
    <dcterms:available rdf:datatype="http://www.w3.org/2001/XMLSchema#dateTime">2024-09-19T09:42:22Z</dcterms:available>
    <dcterms:abstract>The theory of embodied simulation suggests a common neuronal representation for action and perception in mirror neurons (MN) that allows an automatic understanding of another person’s mental state. Multivariate pattern analysis (MVPA) of fMRI data enables a joint investigation of the MN properties cross-modality and action specificity with high spatial sensitivity. In repeated measures and independent samples, we measured BOLD-fMRI activation during a social-cognitive paradigm, which included the imitation, execution, and observation of a facial expression of fear or anger. Using support vector machines in a region of interest and a searchlight-based within-subject approach, we classified the emotional content first within modalities and subsequently across modalities. Of main interest were regions of the MN and the emotional face processing system. A two-step permutation scheme served to evaluate significance of classification accuracies. Additionally, we analyzed cross-session and cross-sample replicability. Classification of emotional content was significantly above chance within-modality in the execution and imitation condition with replication across session and across samples, but not in the observation condition. Cross-modal classification was possible when trained on the execution condition and tested on the imitation condition with cross-session replication. The searchlight analysis revealed additional areas exhibiting action specificity and cross-modality, mainly in the prefrontal cortex. We demonstrate replicability of brain regions with action specific and cross-modal representations of fear and anger for execution and imitation. Since we could not find a shared neural representation of emotions within the observation modality, our results only partially lend support to the embodied simulation theory. 
We conclude that activation in MN regions is less robust and less clearly distinguishable during observation than motor tasks.</dcterms:abstract>
    <dcterms:issued>2024-09-23</dcterms:issued>
    <dc:creator>Hass, Joachim</dc:creator>
    <dc:date rdf:datatype="http://www.w3.org/2001/XMLSchema#dateTime">2024-09-19T09:42:22Z</dc:date>
    <dspace:isPartOfCollection rdf:resource="https://kops.uni-konstanz.de/server/rdf/resource/123456789/43"/>
    <dc:creator>Mier, Daniela</dc:creator>
    <dc:contributor>Schmidt, Stephanie N. L.</dc:contributor>
    <dcterms:title>Cross-modal decoding of emotional expressions in fMRI : cross-session and cross-sample replication</dcterms:title>
    <bibo:uri rdf:resource="https://kops.uni-konstanz.de/handle/123456789/70824"/>
    <dspace:hasBitstream rdf:resource="https://kops.uni-konstanz.de/bitstream/123456789/70824/1/Wallenwein_2-dx4bzmwsolb63.pdf"/>
    <dc:creator>Schmidt, Stephanie N. L.</dc:creator>
    <dc:creator>Wallenwein, Lara A.</dc:creator>
    <dc:contributor>Wallenwein, Lara A.</dc:contributor>
    <dcterms:isPartOf rdf:resource="https://kops.uni-konstanz.de/server/rdf/resource/123456789/43"/>
    <dc:contributor>Mier, Daniela</dc:contributor>
    <dcterms:rights rdf:resource="http://creativecommons.org/licenses/by/4.0/"/>
  </rdf:Description>
</rdf:RDF>
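The RDF above can be consumed with any RDF-aware tool; as a minimal stdlib-only illustration, the following parses an excerpt of the record's Dublin Core fields with `xml.etree.ElementTree` (the embedded string repeats values from the record above; no new data is introduced):

```python
import xml.etree.ElementTree as ET

# Namespace prefixes as declared in the record's RDF.
NS = {
    "rdf": "http://www.w3.org/1999/02/22-rdf-syntax-ns#",
    "dc": "http://purl.org/dc/elements/1.1/",
    "dcterms": "http://purl.org/dc/terms/",
}

rdf_xml = """<rdf:RDF xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#"
    xmlns:dc="http://purl.org/dc/elements/1.1/"
    xmlns:dcterms="http://purl.org/dc/terms/">
  <rdf:Description rdf:about="https://kops.uni-konstanz.de/server/rdf/resource/123456789/70824">
    <dcterms:title>Cross-modal decoding of emotional expressions in fMRI : cross-session and cross-sample replication</dcterms:title>
    <dc:creator>Wallenwein, Lara A.</dc:creator>
    <dc:creator>Schmidt, Stephanie N. L.</dc:creator>
    <dc:creator>Hass, Joachim</dc:creator>
    <dc:creator>Mier, Daniela</dc:creator>
    <dcterms:issued>2024-09-23</dcterms:issued>
  </rdf:Description>
</rdf:RDF>"""

# Namespace-aware lookups: pass the prefix map to find()/findall().
desc = ET.fromstring(rdf_xml).find("rdf:Description", NS)
title = desc.find("dcterms:title", NS).text
creators = [c.text for c in desc.findall("dc:creator", NS)]
issued = desc.find("dcterms:issued", NS).text
print(title)
print(", ".join(creators), "-", issued)
```

Repeated `dc:creator` elements map naturally to a list, which is why the record can carry all four authors without any ordering markup beyond element order.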

University bibliography
Yes
Peer-reviewed
Yes