Theory of systems memory consolidation via recall-gated plasticity

  1. Zuckerman Mind Brain Behavior Institute, Columbia University, New York, NY 10027, USA

Peer review process

Not revised: This Reviewed Preprint includes the authors’ original preprint (without revision), an eLife assessment, and public reviews.


Editors

  • Reviewing Editor
    Julijana Gjorgjieva
    Technical University of Munich, Freising, Germany
  • Senior Editor
    Michael Frank
    Brown University, Providence, United States of America

Reviewer #1 (Public Review):

Summary:

The authors develop a memory consolidation theory in which the recall quality of the short-term memory (STM) system determines what is consolidated into long-term memory (LTM). The theory builds on a set of previously proposed models that identify memories with synaptic weights (without modeling neuronal activity), with the addition of a second set of weights responsible for long-term storage. Rigorous analysis and numerical experiments show that, under some assumptions, the long-term system achieves a high signal-to-noise ratio, in particular much higher than an LTM that learns concurrently with the STM or is localized in the same synapses.

Strengths:

The authors take on the important problem of designing a robust memory consolidation mechanism that fits numerous experimental observations, and to a large extent they succeed. The proposed solution is general and extends to multiple contexts. The mathematical treatment is solid and convincing.

Weaknesses:

The presented model seems to be tuned for learning repetitive events. However, single-shot learning, for example under fear conditioning or when a presented stimulus is highly surprising, seems to contradict the proposed framework. I would assume that part of the load could be taken by a replay system that vigorously replays more surprising events, but even this does not seem to match the proposed scheme exactly.

For context, I would like to see a comparison with, and discussion of, the wide range of models of synaptic tagging in which consolidation is triggered by various types of signals. Notable examples are studies from Wulfram Gerstner's group (e.g., Brea, J., Clayton, N. S., & Gerstner, W. (2023). Computational models of episodic-like memory in food-caching birds. Nature Communications, 14(1); and studies on surprise).

The models taken for comparison, in which the LTM is slower but otherwise identical to the STM, could be incapable by design. Reducing the probability of synaptic switching independently of previous presentations does not make the system "slow"; instead, a slow system should integrate previous signals (and thus gradually average out independent noise).

The use of terminology and the streamlining of the writing could be improved for better understanding.

Reviewer #2 (Public Review):

Summary:

In the manuscript "Recall-Gated Consolidation: A Model for Learning and Memory in Neural Systems," the authors suggest a computational mechanism called recall-gated consolidation, which prioritizes the long-term storage of synaptic updates corresponding to previously experienced events. The authors investigate the mechanism in different types of learning problems, including supervised learning, reinforcement learning, and unsupervised auto-associative memory. They rigorously analyse the general mechanism and provide valuable insights into its benefits.

Strengths:

The authors establish a general theoretical framework, which they translate into three concrete learning problems. For each, they define an individual mathematical formulation. Finally, they extensively analyse the suggested mechanism in terms of memory recall, consolidation dynamics, and learnable timescales.

The presented model of recall-gated consolidation covers various aspects of synaptic plasticity, memory recall, and the influence of gating functions on memory storage and retrieval. The model's predictions align with observed spaced learning effects.

The authors conduct simulations to validate the recall-gated consolidation model's predictions, and their simulated results align with theoretical predictions. These simulations demonstrate the model's advantages over consolidating every memory indiscriminately and showcase its potential application to various learning tasks.

The suggestion of a novel consolidation mechanism provides a good starting point to investigate memory consolidation in diverse neural systems and may inspire artificial learning algorithms.

Weaknesses:

I appreciate that the authors devoted a specific section to the model's predictions, and point out how the model connects to experimental findings in various model organisms. However, the connection is rather weak and the model needs to make more specific predictions to be distinguishable from other theories of memory consolidation (e.g. those that the authors discuss) and verifiable by experimental data.

While the article extensively discusses the strengths and advantages of the recall-gated consolidation model, it provides a limited discussion of potential limitations or shortcomings of the model, such as the missing feature of generalization, which is part of previous consolidation models. The model is not compared to other consolidation models in terms of performance and how much it increases the signal-to-noise ratio. It is only compared to a simple STM or a parallel LTM, which I understand to be essentially the same as the STM but with a different timescale (so not really an alternative consolidation model). It would be nice to compare the model to an actual or more sophisticated existing consolidation model to allow for a fairer comparison.

The article is lengthy and dense, and it could be clearer. Some sections are highly technical and may be challenging to follow. The article could benefit from more concise summaries and visual aids to help convey key points.

Reviewer #3 (Public Review):

Summary:

In their article "Theory of systems memory consolidation via recall-gated plasticity", Jack Lindsey and Ashok Litwin-Kumar describe a new model for systems memory consolidation. Their idea is that a short-term memory acts not as a teacher for a long-term memory, as is common in most complementary learning systems, but as a selection module that determines which memories are eligible for long-term storage. The criterion for the consolidation of a given memory is a sufficient strength of recall in the short-term memory.

The authors provide an in-depth analysis of the suggested mechanism. They demonstrate that it allows substantially higher signal-to-noise ratios (SNRs) than previous synaptic consolidation models, provide an extensive mathematical treatment of the suggested mechanism, show that the required recall strength can be computed in a biologically plausible way for three different learning paradigms, and illustrate how the mechanism can explain spaced training effects.
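To make the gating idea concrete, the following is a minimal illustrative sketch of recall-gated consolidation as summarized above. It is not the authors' code: the binary memory patterns, exponential-moving-average updates, learning rates, and the recall threshold `theta` are assumptions chosen only to make the qualitative behaviour visible.

```python
import numpy as np

# Minimal sketch of recall-gated consolidation (illustrative, not the authors' code).
rng = np.random.default_rng(0)
N = 1000          # number of synapses
T = 5000          # number of memory presentations
p_repeat = 0.2    # probability that a presentation repeats the recurring memory
theta = 0.1       # recall threshold gating consolidation (assumed value)
lr_stm, lr_ltm = 0.1, 0.1

w_stm = np.zeros(N)   # fast, short-term weights
w_ltm = np.zeros(N)   # slow, long-term weights

target = rng.choice([-1.0, 1.0], size=N)   # the recurring memory to be consolidated

for t in range(T):
    # Candidate synaptic update: either the recurring memory or an unrelated one.
    dw = target if rng.random() < p_repeat else rng.choice([-1.0, 1.0], size=N)

    # Recall strength: overlap of the candidate update with the current STM.
    recall = w_stm @ dw / N

    # The STM always learns (and forgets quickly through its decay term).
    w_stm = (1 - lr_stm) * w_stm + lr_stm * dw

    # The LTM learns only when the STM already recalls the memory well enough.
    if recall > theta:
        w_ltm = (1 - lr_ltm) * w_ltm + lr_ltm * dw

# Overlap with the recurring memory serves as a simple recall/SNR proxy.
print("STM overlap with recurring memory:", w_stm @ target / N)
print("LTM overlap with recurring memory:", w_ltm @ target / N)
```

The point of the sketch is that the long-term weights are updated only when the short-term weights already overlap strongly with the incoming update, so rarely repeated (noise) memories seldom pass the gate and the long-term store ends up dominated by the recurring memory.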

Strengths:

The suggested consolidation mechanism is novel and provides a very interesting alternative to the classical view of complementary learning systems. The analysis is thorough and convincing.

Weaknesses:

The main weakness of the paper is the equating of recall strength with the synaptic changes brought about by the presentation of a stimulus. In most models of learning, synaptic changes are driven by an error signal and hence cease once the task has been learned. The suggested consolidation mechanism would therefore stop at that point, even though recall is still intact. The authors should discuss other notions of recall strength that would allow memory consolidation to continue after the initial learning phase. Aside from that, I have only a few technical comments that I am sure the authors can address with a reasonable amount of work.
