Can sleep protect memories from catastrophic forgetting?
Abstract
Continual learning remains an unsolved problem in artificial neural networks. The brain has evolved mechanisms to prevent catastrophic forgetting of old knowledge during new training. Building on data suggesting the importance of sleep in learning and memory, we tested the hypothesis that sleep protects old memories from forgetting. In a thalamocortical model, training a new memory interfered with previously learned old memories, leading to degradation and forgetting of the old memory traces. Simulating sleep immediately after new learning reversed the damage and enhanced all memories. We found that when a new memory competed for previously allocated neuronal/synaptic resources, sleep replay changed the synaptic footprint of the old memory to allow overlapping neuronal populations to store multiple memories. Our study predicts that memory storage is dynamic, and that sleep enables continual learning by combining consolidation of new memory traces with reconsolidation of old memory traces to minimize interference.
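The core idea can be illustrated outside the paper's spiking thalamocortical model with a deliberately minimal sketch: a linear associative network trained with the delta rule, in which "sleep replay" is approximated by rehearsing the network's own recall of the old memory (rather than the original training data) while the new memory is consolidated. Everything below (population size, overlap level, learning rate) is an illustrative assumption, not the authors' implementation.

```python
# Minimal sketch (not the authors' thalamocortical model): sequential
# learning of two overlapping memories in a linear associative network,
# with and without a replay phase that rehearses the network's own
# recall of the old memory.
import numpy as np

rng = np.random.default_rng(1)
n = 50  # neurons per population (illustrative)

def train(W, pairs, lr=0.02, steps=500):
    """Delta-rule (LMS) updates on a list of (input, target) associations."""
    for _ in range(steps):
        for x, t in pairs:
            W += lr * np.outer(t - W @ x, x)
    return W

def recall_quality(W, x, target):
    """Cosine similarity between the recalled pattern W @ x and its target."""
    r = W @ x
    return float(r @ target / (np.linalg.norm(r) * np.linalg.norm(target) + 1e-12))

# Old memory A and a new memory B whose input patterns overlap:
# roughly 70% of A's input neurons keep their state in B.
a_in = rng.choice([0.0, 1.0], size=n)
b_in = a_in.copy()
flip = rng.random(n) < 0.3
b_in[flip] = 1.0 - b_in[flip]
a_out = rng.normal(size=n)
b_out = rng.normal(size=n)

# 1) Store the old memory A.
W = train(np.zeros((n, n)), [(a_in, a_out)])
print("recall of A after learning A:         ", recall_quality(W, a_in, a_out))

# 2) Train the new memory B alone: the shared input neurons drag the
#    weights away from A's trace (interference / forgetting).
W_awake = train(W.copy(), [(b_in, b_out)])
print("recall of A after learning B, no replay:", recall_quality(W_awake, a_in, a_out))

# 3) "Sleep replay": rehearse the network's self-generated recall of A
#    alongside B, so both memories settle into the shared weights.
a_replay = W @ a_in  # replay target comes from the network, not the data
W_sleep = train(W.copy(), [(b_in, b_out), (a_in, a_replay)])
print("recall of A after learning B + replay: ", recall_quality(W_sleep, a_in, a_out))
print("recall of B after learning B + replay: ", recall_quality(W_sleep, b_in, b_out))
```

The design point mirrors the abstract's claim: replay rehearses the old memory as currently stored rather than restoring it from the original data, so the synaptic footprint of the old memory is reshaped to coexist with the new one in the same overlapping population.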
Data availability
This study used computational models exclusively. The model is fully described in the Methods section, and the code has been deposited at https://github.com/o2gonzalez/sequenceLearningSleepCode.
Article and author information
Author details
Funding
- Defense Advanced Research Projects Agency (HR0011-18-2-0021): Maxim Bazhenov
- Office of Naval Research (MURI: N00014-16-1-2829): Maxim Bazhenov
The funders had no role in study design, data collection and interpretation, or the decision to submit the work for publication.
Copyright
© 2020, González et al.
This article is distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use and redistribution provided that the original author and source are credited.
Metrics
- 5,653 views
- 787 downloads
- 40 citations
Views, downloads and citations are aggregated across all versions of this paper published by eLife.