Shared and modality-specific brain regions that mediate auditory and visual word comprehension

Anne Keitel¹ (corresponding author), Joachim Gross²,³ and Christoph Kayser⁴

  1. Psychology, University of Dundee, United Kingdom
  2. Institute of Neuroscience and Psychology, University of Glasgow, United Kingdom
  3. Institute for Biomagnetism and Biosignalanalysis, University of Münster, Germany
  4. Department for Cognitive Neuroscience, Faculty of Biology, Bielefeld University, Germany
7 figures, 2 tables and 2 additional files

Figures

Figure 1 with 2 supplements
Trial structure and behavioural performance.

(A) Trial structure was identical in the auditory and visual conditions. Participants listened to stereotypical sentences while a fixation dot was presented (auditory condition) or watched videos of …

Figure 1—figure supplement 1
Explorative representational similarity analysis (RSA) of the behavioural data (n = 20).

Density plots show the distribution of within-participant correlations between behavioural representational dissimilarity matrices (RDMs) and RDMs obtained from phonological and semantic …
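As an illustration of this comparison step, the sketch below correlates the upper triangles of two RDMs with Spearman's rho, as is standard in RSA. The 18 × 18 random matrices are placeholders for one participant's behavioural RDM and a phonological model RDM (assuming one entry per target word); the function name and data are ours, not the authors'.

```python
# Hypothetical sketch of the RDM comparison step in an RSA; the RDMs here
# are random symmetric placeholders, not the study's data.
import numpy as np
from scipy.stats import spearmanr

def rdm_correlation(rdm_a, rdm_b):
    """Spearman correlation between the upper triangles of two RDMs."""
    iu = np.triu_indices_from(rdm_a, k=1)  # off-diagonal upper triangle
    rho, _ = spearmanr(rdm_a[iu], rdm_b[iu])
    return rho

rng = np.random.default_rng(0)
behav = rng.random((18, 18)); behav = (behav + behav.T) / 2
phono = rng.random((18, 18)); phono = (phono + phono.T) / 2
print(rdm_correlation(behav, phono))
```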

Figure 1—figure supplement 2
Data preparation and classification procedures.

Data preparation: Raw data were first de-noised and SQUID jumps were removed (preprocessing). Eye and heart artefacts and noisy channels were removed via visual inspection and ICA. Clean data were …
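The ICA-based artefact removal described here could be sketched in MNE-Python roughly as follows; the file name, component count and excluded component indices are placeholders, and the preceding denoising and SQUID-jump steps are omitted.

```python
# Minimal sketch of ICA artefact removal, assuming MNE-Python and a
# hypothetical raw file; excluded components would come from inspection.
import mne

raw = mne.io.read_raw_fif("participant01_raw.fif", preload=True)
raw.filter(l_freq=1.0, h_freq=None)  # high-pass; helps ICA convergence

ica = mne.preprocessing.ICA(n_components=30, random_state=97)
ica.fit(raw)
ica.exclude = [0, 3]           # e.g. eye and heart components
clean = ica.apply(raw.copy())  # reconstruct data without those components
```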

Figure 2 with 3 supplements
Word classification based on MEG activity regardless of behavioural performance (‘stimulus classification’).

Surface projections show areas with significant classification performance at the group level (n = 18; cluster-based permutation statistics, corrected at p<0.001 FWE). Results show strongest …
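For readers unfamiliar with the approach, a generic single-trial word classification (not the authors' exact pipeline) can be sketched as follows; the data are random placeholders shaped like trials × features.

```python
# Illustrative cross-validated word classification; X and y are placeholders.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)
X = rng.standard_normal((180, 50))   # e.g. 180 trials x 50 MEG features
y = np.repeat(np.arange(18), 10)     # e.g. 18 target words, 10 trials each

acc = cross_val_score(LinearDiscriminantAnalysis(), X, y, cv=5).mean()
print(f"decoding accuracy: {acc:.1%} (chance here: {1/18:.1%})")
```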

Figure 2—figure supplement 1
Whole-brain statistical maps for the comparison between auditory and visual word classification (Figure 2).

(A) Results of a cluster-based permutation analysis (n = 18; 3000 within-subject permutations, corrected at p<0.05 FWE). Shown are only those grid points that exhibit significant word classification …
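A generic analogue of such a group-level cluster-based permutation test is available in MNE-Python; the sketch below uses random difference maps (participants × grid points) and, for brevity, omits the spatial adjacency structure a real source-space analysis would supply.

```python
# Sketch of a one-sample cluster-based permutation test; data are random
# placeholders, and a real analysis would pass a spatial adjacency matrix.
import numpy as np
from mne.stats import permutation_cluster_1samp_test

rng = np.random.default_rng(2)
X = rng.standard_normal((18, 1000))  # 18 participants x 1000 grid points

t_obs, clusters, cluster_pv, _ = permutation_cluster_1samp_test(
    X, n_permutations=3000, tail=0
)
print((cluster_pv < 0.05).sum(), "significant clusters")
```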

Figure 2—figure supplement 2
Results of the audiovisual condition.

(A) Behavioural performance of all 20 participants. Scaling of the figure is identical to Figure 2. Dots represent individual participants, boxes denote median and interquartile ranges, whiskers …

Figure 2—figure supplement 3
Cross-classification between the auditory, visual and audiovisual conditions (n = 18).

(A) Results of a group-level t-test based on cluster-based permutation. Left panel: No significant cross-classification performance between the auditory and visual conditions was found (n = 18; …
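Cross-classification here means training a classifier on trials from one condition and testing it on another; a minimal sketch with placeholder data and labels:

```python
# Train on (placeholder) auditory trials, test on visual trials of the
# same words; above-chance transfer would indicate a shared representation.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(3)
X_aud, X_vis = rng.standard_normal((2, 180, 50))  # trials x features
y = np.repeat(np.arange(18), 10)                  # word labels

clf = LinearDiscriminantAnalysis().fit(X_aud, y)
print("auditory -> visual accuracy:", clf.score(X_vis, y))
```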

Figure 3 with 2 supplements
Cortical areas in which neural word representations predict participants’ responses.

Coloured areas denote significant group-level effects (surface projection of the cluster-based permutation statistics, corrected at p<0.05 FWE). (A) In the auditory condition (n = 18), we found five …
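The logic of such a neuro-behavioural analysis, as we read it, is to regress single-trial behaviour on a neural measure within each participant and then test the resulting betas against zero across participants. Below is a hedged sketch with simulated data, using a simple least-squares slope in place of the authors' exact regression model:

```python
# Per-participant regression of behavioural outcome on a neural predictor,
# followed by a group-level one-sample t-test on the betas (simulated data).
import numpy as np
from scipy.stats import ttest_1samp

rng = np.random.default_rng(4)
betas = []
for _ in range(18):                                    # 18 participants
    evidence = rng.standard_normal(180)                # neural measure
    correct = (rng.random(180) < 0.8).astype(float)    # trial outcomes
    betas.append(np.polyfit(evidence, correct, 1)[0])  # regression slope

t, p = ttest_1samp(betas, popmean=0.0)
print(f"group-level t = {t:.2f}, p = {p:.3f}")
```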

Figure 3—figure supplement 1
Whole-brain statistical maps for the comparison between auditory and visual neurobehavioural prediction (Figure 3).

(A) Results of a cluster-based permutation analysis (n = 18; 3000 within-subject permutations, corrected at p<0.05 FWE). Shown are only those grid points that exhibit significant word classification …

Figure 3—figure supplement 2
Correlations between word classification and behavioural indices.

(A) Surface projection of rho-values from correlations between neural classification and behaviour. No significant clusters were found at an alpha-level of 0.05, supporting that stimulus …

Figure 4
Largely distinct regions provide strong stimulus classification and mediate behavioural relevance.

(A) Areas with significant stimulus classification (from Figure 2) are shown in yellow, those with significant neuro-behavioural results (from Figure 3) in green, and the overlap in blue. The …

Author response image 1
Author response image 2
Word classification performance with and without a spatial searchlight.

Top panel: original results including a 1.2-cm searchlight (as in Figure 2 in the manuscript). Middle panel: classification results without a searchlight. Bottom panel: Bayes factors of a group-level …
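A spatial searchlight in this context pools grid points within a fixed radius around each location before classifying; a minimal sketch assuming a 1.2-cm radius (as in the manuscript) with placeholder positions and data:

```python
# Searchlight classification: for each grid point, classify using all grid
# points within 1.2 cm; positions and MEG features are placeholders.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(5)
pos = rng.uniform(-7, 7, (200, 3))    # grid-point coordinates in cm
X = rng.standard_normal((180, 200))   # trials x grid points
y = np.repeat(np.arange(18), 10)      # word labels

acc = np.empty(len(pos))
for i, centre in enumerate(pos):
    sphere = np.linalg.norm(pos - centre, axis=1) <= 1.2
    acc[i] = cross_val_score(
        LinearDiscriminantAnalysis(), X[:, sphere], y, cv=5
    ).mean()
```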

Author response image 3
Classification performance and neurobehavioural prediction over time.

Top panels represent cumulative whole-brain histograms of the significant grid points across all 7 epochs and bottom panels represent the epoch-specific number of grid points that are significant in …

Tables

Table 1
Peak effects of stimulus classification performance based on MEG activity.

Labels are taken from the AAL atlas (Tzourio-Mazoyer et al., 2002). For each peak, MNI coordinates and classification performance (mean and SEM) are presented. Chance level for classification was …

Atlas label                      MNI coordinates (X, Y, Z)    Classification % (SEM)

Auditory peaks
Rolandic Oper R (RO)             41, −14, 20                  28.89 (0.78)
Postcentral L (POST)             −48, −21, 25                 29.04 (1.00)

Visual peaks
Calcarine L (OCC)                −5, −101, −7                 33.92 (1.53)
Frontal Inf Tri L (IFG)          −48, 23, 1                   26.70 (0.83)
Postcentral L (POST)             −51, −24, 47                 26.85 (1.02)

Peak of overlap
Postcentral L (POST)             −47, −15, 52                 26.50 (0.67)
Table 2
Peak effects for the neuro-behavioural analysis.

Labels are taken from the AAL atlas (Tzourio-Mazoyer et al., 2002). For each local peak, MNI coordinates, regression beta (mean and SEM across participants) and corresponding t-value are presented. …

Atlas label                                   MNI coordinates (X, Y, Z)    Beta (SEM)       t-value

Auditory
Temporal Inf L (ITG)                          −41, −23, −26                0.106 (0.024)    4.40
Frontal Inf Orb L (IFG)                       −28, 25, −9                  0.082 (0.031)    2.66
Occipital Mid L, Occipital Inf L (MOG)        −46, −83, −4                 0.079 (0.029)    2.75
Supp Motor Area R (SMA)                       3, 11, 52                    0.089 (0.027)    3.33
Angular R (AG)                                49, −67, 40                  0.079 (0.027)    2.87

Visual
Frontal Inf Tri L (IFG)                       −57, 30, 4                   0.075 (0.017)    4.34
Frontal Sup Medial R, Cingulum Ant R (SFG)    9, 47, 15                    0.080 (0.028)    2.86
Temporal Sup R (STG)                          38, −30, 10                  0.086 (0.023)    3.77
Angular R (AG)                                60, −55, 34                  0.073 (0.020)    3.55
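Note that each tabulated t-value is consistent with a one-sample test of the mean beta against zero, t ≈ mean(beta)/SEM; for example, for the ITG peak, 0.106/0.024 ≈ 4.4, matching the tabulated 4.40.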

Additional files

Supplementary file 1

Target words used in this study (nine adjectives and nine numbers, each presented in 10 different sentences).

Note that adjectives were comparable with regard to their positive valence (Scott et al., 2019).

https://cdn.elifesciences.org/articles/56972/elife-56972-supp1-v2.docx
Transparent reporting form
https://cdn.elifesciences.org/articles/56972/elife-56972-transrepform-v2.docx
