Neural representation of abstract task structure during generalization
Abstract
Cognitive models in psychology and neuroscience widely assume that the human brain maintains an abstract representation of tasks. This assumption is fundamental to theories explaining how we learn quickly, think creatively, and act flexibly. However, neural evidence for a verifiably generative abstract task representation has been lacking. Here, we report an experimental paradigm that requires forming such a representation to act adaptively in novel conditions without feedback. Using functional magnetic resonance imaging, we observed that abstract task structure was represented within left mid-lateral prefrontal cortex, bilateral precuneus and inferior parietal cortex. These results provide support for the neural instantiation of the long-supposed abstract task representation in a setting where we can verify its influence. Such a representation can afford massive expansions of behavioral flexibility without additional experience, a vital characteristic of human cognition.
Data availability
Complete behavioral data from all participants who completed all three sessions of this experiment, un-thresholded statistical maps for whole-brain analyses and beta coefficients for ROI-level analyses have been deposited on the project site for this experiment on the Open Science Framework.
Article and author information
Author details
Funding
Office of Naval Research (N00014-16-1-2832)
- David Badre
National Institute of General Medical Sciences (R25GM125500)
- Johanny Castillo
- David Badre
National Institute of Mental Health (F32MH116592)
- Avinash Rao Vaidya
The funders had no role in study design, data collection and interpretation, or the decision to submit the work for publication.
Ethics
Human subjects: All participants gave their written informed consent to participate in this study, as approved by the Human Research Protections Office at Brown University, and were compensated for their participation.
Copyright
© 2021, Vaidya et al.
This article is distributed under the terms of the Creative Commons Attribution License permitting unrestricted use and redistribution provided that the original author and source are credited.
Metrics
- 3,925 views
- 643 downloads
- 43 citations
Views, downloads and citations are aggregated across all versions of this paper published by eLife.