Hippocampal sharp wave-ripples and the associated sequence replay emerge from structured synaptic interactions in a network model of area CA3
Abstract
Hippocampal place cells are activated sequentially as an animal explores its environment. These activity sequences are internally recreated ('replayed'), either in the same or reversed order, during bursts of activity (sharp wave-ripples; SWRs) that occur in sleep and awake rest. SWR-associated replay is thought to be critical for the creation and maintenance of long-term memory. In order to identify the cellular and network mechanisms of SWRs and replay, we constructed and simulated a data-driven model of area CA3 of the hippocampus. Our results show that the chain-like structure of recurrent excitatory interactions established during learning not only determines the content of replay, but is essential for the generation of the SWRs as well. We find that bidirectional replay requires the interplay of the experimentally confirmed, temporally symmetric plasticity rule, and cellular adaptation. Our model provides a unifying framework for diverse phenomena involving hippocampal plasticity, representations, and dynamics, and suggests that the structured neural codes induced by learning may have greater influence over cortical network states than previously appreciated.
Data availability
The source code to build, run and analyze our model is publicly available on GitHub: https://github.com/KaliLab/ca3net
Article and author information
Author details
Funding
Hungarian Scientific Research Fund (K83251)
- Maria Rita Karlocai
- Attila I Gulyás
- Szabolcs Káli
Hungarian Scientific Research Fund (K85659)
- Orsolya Steinbach-Németh
- Norbert Hájos
Hungarian Scientific Research Fund (K115441)
- Attila I Gulyás
- Szabolcs Káli
Hungarian Brain Research Program (2017-1.2.1-NKP-2017-00002)
- Norbert Hájos
European Commission (ERC 2011 ADG 294313)
- Tamás Freund
- Attila I Gulyás
- Szabolcs Káli
European Commission (FP7 no. 604102, H2020 no. 720270, no. 785907 (Human Brain Project))
- Tamás Freund
- Attila I Gulyás
- Szabolcs Káli
Hungarian Ministry of Innovation and Technology NRDI Office (Artificial Intelligence National Laboratory)
- Szabolcs Káli
The funders had no role in study design, data collection and interpretation, or the decision to submit the work for publication.
Ethics
Animal experimentation: Experiments were approved by the Committee for the Scientific Ethics of Animal Research (22.1/4027/003/2009) and were performed according to the guidelines of the institutional ethical code and the Hungarian Act of Animal Care and Experimentation. Experiments were performed in acute brain slices; no animal suffering was involved as mice were deeply anaesthetized with isoflurane and decapitated before slice preparation. Data recorded in the context of other studies were used for model fitting, and therefore no additional animals were used for the purpose of this study.
Copyright
© 2022, Ecker et al.
This article is distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use and redistribution provided that the original author and source are credited.
Metrics
- 4,594 views
- 612 downloads
- 33 citations
Views, downloads and citations are aggregated across all versions of this paper published by eLife.