Neuropsychological evidence of multi-domain network hubs in the human thalamus
Abstract
Hubs in the human brain support behaviors that arise from brain network interactions. Previous studies have identified hub regions in the human thalamus that are connected with multiple functional networks. However, the behavioral significance of thalamic hubs has yet to be established. Our framework predicts that thalamic subregions with strong hub properties are broadly involved in functions across multiple cognitive domains. To test this prediction, we studied human patients with focal thalamic lesions in conjunction with network analyses of the human thalamocortical functional connectome. In support of our prediction, lesions to thalamic subregions with stronger hub properties were associated with widespread deficits in executive, language, and memory functions, whereas lesions to thalamic subregions with weaker hub properties were associated with more limited deficits. These results highlight how a large-scale network model can broaden our understanding of thalamic function for human cognition.
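The "hub properties" referenced above are typically quantified with graph measures such as the participation coefficient, which indexes how evenly a region's functional connections are distributed across networks. The abstract does not specify the exact metric or pipeline used in this study, so the sketch below is illustrative only: a minimal Python implementation of the standard participation coefficient (Guimerà and Amaral, 2005), with a hypothetical function name and toy connectivity matrix.

```python
import numpy as np

def participation_coefficient(W, communities):
    """Participation coefficient for each node of a weighted, undirected
    connectivity matrix W, given a vector assigning each node to a
    functional network (community).

    PC_i = 1 - sum_s (k_is / k_i)^2, where k_i is node i's total connection
    strength and k_is is its strength of connections to network s. Values
    near 1 mark hub-like nodes whose connections are spread across many
    networks; values near 0 mark nodes confined to a single network.
    """
    W = np.asarray(W, dtype=float)
    communities = np.asarray(communities)
    k = W.sum(axis=1)                              # total strength per node
    pc = np.ones_like(k)
    for s in np.unique(communities):
        k_s = W[:, communities == s].sum(axis=1)   # strength to network s
        with np.errstate(invalid="ignore", divide="ignore"):
            frac = np.where(k > 0, k_s / k, 0.0)
        pc -= frac ** 2
    pc[k == 0] = 0.0   # disconnected nodes: participation undefined, set to 0
    return pc

# Hypothetical toy example: five nodes, two networks. Node 0 connects to
# both networks evenly and so receives the highest participation coefficient.
W = np.array([[0.0, 1.0, 1.0, 1.0, 1.0],
              [1.0, 0.0, 1.0, 0.0, 0.0],
              [1.0, 1.0, 0.0, 0.0, 0.0],
              [1.0, 0.0, 0.0, 0.0, 2.0],
              [1.0, 0.0, 0.0, 2.0, 0.0]])
communities = np.array([0, 0, 0, 1, 1])
print(participation_coefficient(W, communities))  # node 0 scores highest (0.50)
```

In this framing, a lesion to a thalamic voxel with a high participation coefficient would be expected to disrupt multiple cortical networks at once, which is the logic behind the prediction of multi-domain behavioral deficits.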
Data availability
We have made all code and lesion-derived measures used in the manuscript freely available on GitHub (https://github.com/kaihwang/LTH), including neuropsychological assessment outcomes, derivatives from the lesion analyses, data used for the functional connectivity analyses, and the mRNA expression analyses. The functional connectivity analyses used publicly available datasets (Holmes et al., 2015; Nooner et al., 2012). The only data we cannot post without restriction are each patient's clinical MRI and lesion data. Patients were enrolled in the Iowa Lesion Patient Registry over the past few decades, and most did not consent to having their clinical MRI data posted publicly. To gain access to those data, interested parties should contact the PI of the lesion registry, Dr. Dan Tranel, and the corresponding author of this project, Dr. Kai Hwang, and will be required to sign a data use agreement. This institutional policy is designed to ensure that the data are used for academic rather than commercial purposes. A study plan of the proposed research must be submitted, and we will work with the interested party to obtain the necessary IRB approvals from both institutions.
- Brain Genomics Superstruct Project: initial data release with structural, functional, and behavioral measures.
- NKI-Rockland Sample: the enhanced Nathan Kline Institute-Rockland Sample (NKI-RS).
Article and author information
Author details
Funding
National Institutes of Health (R01MH122613)
- Kai Hwang
- Daniel Tranel
- Aaron Boes
National Institutes of Health (R01MH117772)
- James M Shine
National Institutes of Health (P50MH094258)
- Daniel Tranel
Kiwanis Neuroscience Research Foundation
- Daniel Tranel
National Institutes of Health (R01NS114405)
- Aaron Boes
National Institutes of Health (R21MH120441)
- Aaron Boes
National Health and Medical Research Council (GNT1156536)
- James M Shine
The funders had no role in study design, data collection and interpretation, or the decision to submit the work for publication.
Ethics
Human subjects: All participants gave written informed consent, and the study was approved by the University of Iowa Institutional Review Board (protocol #200105018).
Copyright
© 2021, Hwang et al.
This article is distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use and redistribution provided that the original author and source are credited.
Metrics
- 1,666 views
- 286 downloads
- 36 citations
Views, downloads, and citations are aggregated across all versions of this paper published by eLife.