The representational dynamics of task and object processing in humans

  1. Martin N Hebart (corresponding author)
  2. Brett B Bankson
  3. Assaf Harel
  4. Chris I Baker
  5. Radoslaw M Cichy
  1. National Institute of Mental Health, National Institutes of Health, United States
  2. Wright State University, United States
  3. Free University of Berlin, Germany
  4. Humboldt Universität zu Berlin, Germany
  5. Charité Universitätsmedizin, Germany
8 figures and 2 additional files

Figures

Experimental paradigm.

On each trial (procedure depicted in panel C), participants were presented with a stimulus from one of eight object classes (panel B) embedded in one of four task contexts (panel A, top) …

https://doi.org/10.7554/eLife.32816.002
Schematic for multivariate analyses of MEG data.

All multivariate analyses were carried out in a time-resolved manner on principal components (PCs) based on MEG sensor patterns (see Materials and methods for transformation of sensor patterns to …

https://doi.org/10.7554/eLife.32816.003
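The pipeline this caption summarizes, reducing sensor patterns to principal components and then classifying at each time point, can be illustrated with a minimal Python sketch on toy data. The function name, the toy data, and the leave-one-out nearest-centroid classifier are illustrative assumptions standing in for the paper's actual preprocessing and SVM-based decoding scheme, not the authors' implementation (their Matlab scripts are in Source code 1).

```python
import numpy as np

rng = np.random.default_rng(0)

def timeresolved_decoding(X, y, n_pcs=10):
    """Decode two classes at each time point on PCA-reduced patterns.

    X: (trials, sensors, times); y: (trials,) with labels 0/1.
    Uses PCA via SVD plus leave-one-out nearest-centroid classification,
    a simple stand-in for the paper's SVM-based scheme.
    """
    n_trials, _, n_times = X.shape
    acc = np.zeros(n_times)
    for t in range(n_times):
        # center the sensor patterns at this time point, then project
        # onto the top principal components obtained from the SVD
        Xc = X[:, :, t] - X[:, :, t].mean(axis=0)
        _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
        Z = Xc @ Vt[:n_pcs].T
        correct = 0
        for i in range(n_trials):      # leave-one-out cross-validation
            train = np.delete(np.arange(n_trials), i)
            m0 = Z[train][y[train] == 0].mean(axis=0)
            m1 = Z[train][y[train] == 1].mean(axis=0)
            pred = int(np.linalg.norm(Z[i] - m1) < np.linalg.norm(Z[i] - m0))
            correct += pred == y[i]
        acc[t] = correct / n_trials
    return acc

# Toy data: a class difference appears only at the last two time points
y = np.repeat([0, 1], 20)
X = rng.standard_normal((40, 30, 5))
X[y == 1, :5, 3:] += 2.0
acc = timeresolved_decoding(X, y)
```

On such data, accuracy sits near chance at early time points and rises well above chance once the class difference appears, which is the basic shape of the time-resolved decoding curves in Figure 3.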
Time-resolved MEG decoding of task and objects across the trial.

After onset of the task cue (Task Cue Period), task-related accuracy increased rapidly, followed by a decay toward chance and significant above-chance decoding ~200 ms prior to object onset. After …

https://doi.org/10.7554/eLife.32816.004
Figure 3—source data 1

Per subject time courses of mean classification accuracy for task and object.

https://doi.org/10.7554/eLife.32816.005
Figure 4 with 2 supplements
Results of temporal generalization analysis of task.

(A) Temporal cross-classification matrix. The y-axis reflects the classifier training time relative to task cue onset, the x-axis the classifier generalization time, and the color codes the …

https://doi.org/10.7554/eLife.32816.006
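Temporal generalization, as described in panel A, means training a classifier at one time point and testing it at every other time point, yielding a training-time by testing-time accuracy matrix. The following Python sketch shows the idea on toy data; the split-half nearest-centroid classifier is an illustrative simplification, not the paper's cross-validated SVM analysis.

```python
import numpy as np

rng = np.random.default_rng(1)

def temporal_generalization(X, y):
    """Train at each time point, test at every time point.

    X: (trials, features, times); y: (trials,) with labels 0/1.
    Returns a (training time x testing time) accuracy matrix, using a
    split-half nearest-centroid classifier as a simple stand-in for
    cross-validated SVM decoding.
    """
    n_trials, _, n_times = X.shape
    train = np.arange(n_trials) % 2 == 0
    test = ~train
    acc = np.zeros((n_times, n_times))
    for tr in range(n_times):
        # class centroids estimated at the training time point
        m0 = X[train & (y == 0), :, tr].mean(axis=0)
        m1 = X[train & (y == 1), :, tr].mean(axis=0)
        for te in range(n_times):
            # apply the same centroids to patterns from the testing time point
            Z = X[test, :, te]
            pred = (np.linalg.norm(Z - m1, axis=1)
                    < np.linalg.norm(Z - m0, axis=1)).astype(int)
            acc[tr, te] = np.mean(pred == y[test])
    return acc

# Toy data: a class pattern that is sustained over the last two time points
y = np.repeat([0, 1], 20)
X = rng.standard_normal((40, 20, 4))
X[y == 1, :4, 2:] += 2.0
G = temporal_generalization(X, y)
```

A sustained representational format shows up as above-chance accuracy off the diagonal (training and testing times differ), whereas a transient format decodes only on or near the diagonal.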
Figure 4—source data 1

Per subject matrices of temporal cross-classification analysis of task.

https://doi.org/10.7554/eLife.32816.009
Figure 4—figure supplement 1
Results of temporal generalization analysis of task separated by task type.

Each map reflects the average of all pairwise classifications of a given task with all other tasks (e.g. color vs. tilt, color vs. content, color vs. size). The y-axis reflects the classifier …

https://doi.org/10.7554/eLife.32816.007
Figure 4—figure supplement 2
Results of temporal generalization analysis of objects.

The y-axis reflects the classifier training time relative to task cue onset, the x-axis the classifier generalization time, and the color codes the cross-classification accuracy for each combination …

https://doi.org/10.7554/eLife.32816.008
Comparison of object decoding for different task types (p<0.05, cluster-corrected sign permutation test).

Error bars reflect standard error of the difference of the means. (A) Object decoding separated by perceptual and conceptual task types. Initially, object decoding for conceptual and perceptual …

https://doi.org/10.7554/eLife.32816.010
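The sign permutation test named in the caption treats the sign of each subject's difference score as exchangeable under the null hypothesis. A minimal Python sketch on toy data follows; it shows the uncorrected single-test version only, omitting the cluster correction across time points that the paper applies, and the function and data are illustrative, not the authors' code.

```python
import numpy as np

rng = np.random.default_rng(3)

def sign_permutation_test(diff, n_perm=1000, rng=rng):
    """Two-sided one-sample sign permutation test.

    diff: (subjects,) per-subject differences (e.g. in decoding accuracy).
    Under the null the sign of each subject's difference is exchangeable,
    so signs are flipped at random and the observed mean is compared
    against the resulting permutation distribution. (Cluster correction
    across time, as used in the paper, is omitted here.)
    """
    observed = abs(diff.mean())
    perm = np.empty(n_perm)
    for i in range(n_perm):
        signs = rng.choice([-1.0, 1.0], size=len(diff))
        perm[i] = (signs * diff).mean()
    # add-one correction so the p value is never exactly zero
    return (np.sum(np.abs(perm) >= observed) + 1) / (n_perm + 1)

# Toy data: a consistent positive accuracy difference across 16 subjects
diff = 0.05 + 0.02 * rng.standard_normal(16)
p = sign_permutation_test(diff)
```

A consistent across-subject difference yields a small p value, while a difference centered on zero does not; in the time-resolved setting this test is run at every time point before cluster correction.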
Figure 5—source data 1

Per subject time courses of mean classification accuracy for task separated by task type and cross-classification accuracy between task types.

https://doi.org/10.7554/eLife.32816.011
Figure 6 with 2 supplements
Model-based MEG-fMRI fusion procedure and results.

(A) Model-based MEG-fMRI fusion in the current formulation reflects the shared variance (commonality) between three dissimilarity matrices: (1) an fMRI RDM generated from voxel patterns of a given …

https://doi.org/10.7554/eLife.32816.012
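The commonality described in panel A is the variance in the MEG RDM that the fMRI RDM and the model RDM jointly explain. The sketch below applies the standard two-predictor commonality formula to rank-transformed RDM entries on toy data; the helper names and the exact regression formulation are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

rng = np.random.default_rng(2)

def rankdata(v):
    """Simple ranks (no tie handling), for a Spearman-style analysis."""
    ranks = np.empty(len(v))
    ranks[np.argsort(v)] = np.arange(1, len(v) + 1)
    return ranks

def r2(y, X):
    """Squared multiple correlation of y on the predictor column(s) X."""
    A = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(A, y, rcond=None)
    return 1.0 - (y - A @ beta).var() / y.var()

def commonality(meg, fmri, model):
    """Variance in the MEG RDM shared by the fMRI and model RDMs:
    C = R2(MEG~fMRI) + R2(MEG~model) - R2(MEG~fMRI,model)."""
    y, f, m = rankdata(meg), rankdata(fmri), rankdata(model)
    return r2(y, f) + r2(y, m) - r2(y, np.column_stack([f, m]))

# Toy RDMs (flattened lower-triangle entries) sharing a common signal
n = 100
signal = rng.standard_normal(n)
meg = signal + 0.3 * rng.standard_normal(n)
fmri = signal + 0.3 * rng.standard_normal(n)
model = signal + 0.3 * rng.standard_normal(n)
c_shared = commonality(meg, fmri, model)
c_noise = commonality(rng.standard_normal(n), fmri, model)
```

When all three RDMs carry a common signal the commonality is large; when the MEG RDM is unrelated noise it drops to near zero. Computing this at every MEG time point for a given fMRI ROI yields the fusion time courses in Figure 6.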
Figure 6—source data 1

Mean representational dissimilarity matrices for all combinations of task and object, both for all five fMRI ROIs and each MEG time point, including pre-calculated permutations used for permutation testing.

https://doi.org/10.7554/eLife.32816.014
Figure 6—figure supplement 1
fMRI representational dissimilarity matrices (RDMs) for the five regions of interest: posterior parietal cortex (PPC), lateral prefrontal cortex (lPFC), early visual cortex (EVC), object-selective lateral occipital cortex (LO), and posterior fusiform sulcus (pFS).

Since RDMs are compared to MEG data using Spearman r, rank-transformed dissimilarities are plotted.

https://doi.org/10.7554/eLife.32816.013
Figure 6—video 1
Movie of time-resolved MEG representational dissimilarity matrices, scaled using the rank transform across dissimilarities.
https://doi.org/10.7554/eLife.32816.015

Additional files

Source code 1

Matlab scripts including helper functions to produce Figures 3–6 based on available source data.

https://doi.org/10.7554/eLife.32816.016
Transparent reporting form
https://doi.org/10.7554/eLife.32816.017
