Emotional learning retroactively promotes memory integration through rapid neural reactivation and reorganization

  1. Yannan Zhu
  2. Yimeng Zeng
  3. Jingyuan Ren
  4. Lingke Zhang
  5. Changming Chen
  6. Guillén Fernández
  7. Shaozheng Qin (corresponding author)
  1. Beijing Normal University, China
  2. Radboud University Nijmegen Medical Centre, Netherlands
  3. Xinyang Normal University, China

Abstract

Neutral events that precede emotional experiences can be better remembered, likely because they are tagged as significant to guide possible future use. Yet the neurobiological mechanisms by which emotional learning enhances memory for past mundane events remain unclear. Using two behavioral studies and one functional magnetic resonance imaging (fMRI) study with an adapted sensory preconditioning paradigm, we show that rapid neural reactivation and connectivity changes underlie this emotion-charged retroactive memory enhancement. Behaviorally, emotional learning enhanced initial memory for neutral associations across all three studies. Neurally, emotional learning potentiated trial-specific reactivation of overlapping neural traces in the hippocampus and stimulus-relevant neocortex. It further induced rapid hippocampal-neocortical functional reorganization supporting this retroactive memory benefit, characterized by enhanced hippocampal-neocortical coupling modulated by the amygdala during emotional learning, and by a shift of hippocampal connectivity from stimulus-relevant neocortex to transmodal prefrontal-parietal areas during post-learning rest. Together, these findings indicate that emotional learning retroactively promotes memory integration for past neutral events by stimulating trial-specific reactivation of overlapping representations and by reorganizing the associated memories into an integrated network, thereby prioritizing them for future use.

Data availability

All fMRI data collected in this study are available on OpenNeuro under accession number ds004109 (https://openneuro.org/datasets/ds004109/versions/1.0.0). All code used for analysis is available on GitHub (https://github.com/QinBrainLab/2017_EmotionLearning.git).
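
For readers who want to fetch the dataset programmatically, the following is a minimal sketch using the third-party openneuro-py client; the choice of client is our assumption, and DataLad or the OpenNeuro web interface are equivalent alternatives:

    # Minimal sketch: download the fMRI dataset from OpenNeuro.
    # Assumes the third-party openneuro-py package (pip install openneuro-py);
    # any OpenNeuro-compatible downloader (e.g., DataLad) works as well.
    import openneuro

    # Fetch the full ds004109 dataset into a local directory.
    openneuro.download(dataset="ds004109", target_dir="ds004109/")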

Article and author information

Author details

  1. Yannan Zhu

    State Key Laboratory of Cognitive Neuroscience and Learning, Beijing Normal University, Beijing, China
    Competing interests
    The authors declare that no competing interests exist.
  2. Yimeng Zeng

    State Key Laboratory of Cognitive Neuroscience and Learning, Beijing Normal University, Beijing, China
    Competing interests
    The authors declare that no competing interests exist.
  3. Jingyuan Ren

    Donders Institute for Brain, Cognition and Behaviour, Radboud University Nijmegen Medical Centre, Nijmegen, Netherlands
    Competing interests
    The authors declare that no competing interests exist.
  4. Lingke Zhang

    State Key Laboratory of Cognitive Neuroscience and Learning, Beijing Normal University, Beijing, China
    Competing interests
    The authors declare that no competing interests exist.
  5. Changming Chen

    Department of Psychology, Xinyang Normal University, Xinyang, China
    Competing interests
    The authors declare that no competing interests exist.
  6. Guillén Fernández

    Donders Institute for Brain, Cognition and Behaviour, Radboud University Nijmegen Medical Centre, Nijmegen, Netherlands
    Competing interests
    The authors declare that no competing interests exist.
  7. Shaozheng Qin

    State Key Laboratory of Cognitive Neuroscience and Learning, Beijing Normal University, Beijing, China
    For correspondence
    szqin@bnu.edu.cn
    Competing interests
    The authors declare that no competing interests exist.
    ORCID iD: 0000-0002-1859-2150

Funding

National Natural Science Foundation of China (32130045)

  • Shaozheng Qin

National Natural Science Foundation of China (31522028)

  • Shaozheng Qin

National Natural Science Foundation of China (81571056)

  • Shaozheng Qin

Open Research Fund of the State Key Laboratory of Cognitive Neuroscience and Learning (CNLZD1503)

  • Shaozheng Qin

Chinese Scholarship Council (201806040186)

  • Yannan Zhu

The funders had no role in study design, data collection and interpretation, or the decision to submit the work for publication.

Ethics

Human subjects: Informed written consent was obtained from each participant before the experiment. The Institutional Review Boards for Human Subjects at Beijing Normal University (ICBIR_A_0098_002), Xinyang Normal University (same protocol number), and Peking University (IRB#2015-09-04) approved the procedures for Studies 1, 2, and 3, respectively.

Reviewing Editor

  1. Thorsten Kahnt, National Institute on Drug Abuse Intramural Research Program, United States

Version history

  1. Received: June 18, 2020
  2. Preprint posted: September 9, 2020
  3. Accepted: December 6, 2022
  4. Accepted Manuscript published: December 8, 2022 (version 1)
  5. Accepted Manuscript updated: December 9, 2022 (version 2)
  6. Version of Record published: January 5, 2023 (version 3)

Copyright

© 2022, Zhu et al.

This article is distributed under the terms of the Creative Commons Attribution License permitting unrestricted use and redistribution provided that the original author and source are credited.


Cite this article

  1. Yannan Zhu
  2. Yimeng Zeng
  3. Jingyuan Ren
  4. Lingke Zhang
  5. Changming Chen
  6. Guillén Fernández
  7. Shaozheng Qin
(2022)
Emotional learning retroactively promotes memory integration through rapid neural reactivation and reorganization
eLife 11:e60190.
https://doi.org/10.7554/eLife.60190

Further reading

    1. Neuroscience
    Max L Sterling, Ruben Teunisse, Bernhard Englitz
    Tools and Resources

    Ultrasonic vocalizations (USVs) fulfill an important role in communication and navigation in many species. Because of their social and affective significance, rodent USVs are increasingly used as a behavioral measure in neurodevelopmental and neurolinguistic research. Reliably attributing USVs to their emitter during close interactions has emerged as a difficult, key challenge. If addressed, all subsequent analyses gain substantial confidence. We present a hybrid ultrasonic tracking system, Hybrid Vocalization Localizer (HyVL), that synergistically integrates a high-resolution acoustic camera with high-quality ultrasonic microphones. HyVL is the first to achieve millimeter precision (~3.4–4.8 mm, 91% assigned) in localizing USVs, ~3× better than other systems, approaching the physical limits (mouse snout ~10 mm). We analyze mouse courtship interactions and demonstrate that males and females vocalize in starkly different relative spatial positions, and that the fraction of female vocalizations has likely been overestimated previously due to imprecise localization. Further, we find that when two male mice interact with one female, one of the males takes a dominant role in the interaction both in terms of the vocalization rate and the location relative to the female. HyVL substantially improves the precision with which social communication between rodents can be studied. It is also affordable, open-source, easy to set up, can be integrated with existing setups, and reduces the required number of experiments and animals.

    1. Neuroscience
    Federico G Segala, Aurelio Bruno ... Daniel H Baker
    Research Article

    How does the human brain combine information across the eyes? It has been known for many years that cortical normalization mechanisms implement ‘ocularity invariance’: equalizing neural responses to spatial patterns presented either monocularly or binocularly. Here, we used a novel combination of electrophysiology, psychophysics, pupillometry, and computational modeling to ask whether this invariance also holds for flickering luminance stimuli with no spatial contrast. We find dramatic violations of ocularity invariance for these stimuli, both in the cortex and also in the subcortical pathways that govern pupil diameter. Specifically, we find substantial binocular facilitation in both pathways with the effect being strongest in the cortex. Near-linear binocular additivity (instead of ocularity invariance) was also found using a perceptual luminance matching task. Ocularity invariance is, therefore, not a ubiquitous feature of visual processing, and the brain appears to repurpose a generic normalization algorithm for different visual functions by adjusting the amount of interocular suppression.