Cortico-hippocampal network connections support the multidimensional quality of episodic memory
Episodic memories reflect a bound representation of multimodal features that can be reinstated with varying precision. Yet little is known about how brain networks involved in memory, including the hippocampus and posterior-medial (PM) and anterior-temporal (AT) systems, interact to support the quality and content of recollection. Participants learned color, spatial, and emotion associations of objects, later reconstructing the visual features using a continuous color spectrum and 360-degree panorama scenes. Behaviorally, dependencies in memory were observed for the gist but not precision of event associations. Supporting this integration, hippocampus, AT, and PM regions showed increased connectivity and reduced modularity during retrieval compared to encoding. These inter-network connections tracked a multidimensional, objective measure of memory quality. Moreover, distinct patterns of connectivity tracked item color and spatial memory precision. These findings demonstrate how hippocampal-cortical connections reconfigure during episodic retrieval, and how such dynamic interactions might flexibly support the multidimensional quality of remembered events.
Data and code have been made available via GitHub: https://github.com/memobc/paper-orbitfmri
A functional neuroimaging study of item and spatial context memory precision. GitHub: memobc/paper-orbitfmri.
Recognizing Scene Viewpoint using Panoramic Place Representation. people.csail.mit.edu/jxiao, SUN360.
Article and author information
National Institutes of Health (R00MH103401)
- Maureen Ritchey
The funders had no role in study design, data collection and interpretation, or the decision to submit the work for publication.
Human subjects: Informed consent was obtained from all participants prior to the experiment. Procedures were approved by the Boston College Institutional Review Board (17.026).
- Muireann Irish, University of Sydney, Australia
- Received: January 28, 2019
- Accepted: March 22, 2019
- Accepted Manuscript published: March 22, 2019 (version 1)
- Version of Record published: April 5, 2019 (version 2)
© 2019, Cooper & Ritchey
This article is distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use and redistribution provided that the original author and source are credited.