Neuronal reactivation during post-learning sleep consolidates long-term memory in Drosophila

  1. Ugur Dag
  2. Zhengchang Lei
  3. Jasmine Q Le
  4. Allan Wong
  5. Daniel Bushey
  6. Krystyna Keleman (corresponding author)
  1. Janelia Research Campus, Howard Hughes Medical Institute, United States

Abstract

Animals consolidate some, but not all, learning experiences into long-term memory. Across the animal kingdom, sleep has been found to benefit the consolidation of recently formed memories into long-term storage. However, the mechanisms underlying sleep-dependent memory consolidation are poorly understood. Here, we show that consolidation of courtship long-term memory in Drosophila is mediated by the reactivation, during sleep, of dopaminergic neurons that were earlier involved in memory acquisition. We identify specific fan-shaped body neurons that induce sleep after the learning experience and activate dopaminergic neurons for memory consolidation. Thus, we provide a direct link between sleep, reactivation of dopaminergic neurons, and memory consolidation.

Data availability

Source data files have been provided for Figure 1-figure supplements 1 and 2, Figure 2, Figure 2-figure supplements 1 and 2, Figure 5, and Figure 5-figure supplement.

Article and author information

Author details

  1. Ugur Dag

    Janelia Research Campus, Howard Hughes Medical Institute, Ashburn, United States
    Competing interests
    The authors declare that no competing interests exist.
    ORCID iD: 0000-0001-6937-5722
  2. Zhengchang Lei

    Janelia Research Campus, Howard Hughes Medical Institute, Ashburn, United States
    Competing interests
    The authors declare that no competing interests exist.
    ORCID iD: 0000-0002-6475-5010
  3. Jasmine Q Le

    Janelia Research Campus, Howard Hughes Medical Institute, Ashburn, United States
    Competing interests
    The authors declare that no competing interests exist.
    ORCID iD: 0000-0003-4159-8830
  4. Allan Wong

    Janelia Research Campus, Howard Hughes Medical Institute, Ashburn, United States
    Competing interests
    The authors declare that no competing interests exist.
    ORCID iD: 0000-0002-8492-2162
  5. Daniel Bushey

    Janelia Research Campus, Howard Hughes Medical Institute, Ashburn, United States
    Competing interests
    The authors declare that no competing interests exist.
  6. Krystyna Keleman

    Janelia Research Campus, Howard Hughes Medical Institute, Ashburn, United States
    For correspondence
    kelemank@janelia.hhmi.org
    Competing interests
    The authors declare that no competing interests exist.
    ORCID iD: 0000-0003-2044-1981

Funding

Howard Hughes Medical Institute (N/A)

  • Krystyna Keleman

The funders had no role in study design, data collection and interpretation, or the decision to submit the work for publication.

Copyright

© 2019, Dag et al.

This article is distributed under the terms of the Creative Commons Attribution License permitting unrestricted use and redistribution provided that the original author and source are credited.

Metrics

  • 6,465 views
  • 1,020 downloads
  • 65 citations

Views, downloads and citations are aggregated across all versions of this paper published by eLife.

Cite this article

  1. Ugur Dag
  2. Zhengchang Lei
  3. Jasmine Q Le
  4. Allan Wong
  5. Daniel Bushey
  6. Krystyna Keleman
(2019)
Neuronal reactivation during post-learning sleep consolidates long-term memory in Drosophila
eLife 8:e42786.
https://doi.org/10.7554/eLife.42786

Further reading

    1. Developmental Biology
    2. Neuroscience
    Taro Ichimura, Taishi Kakizuka ... Takeharu Nagai
    Tools and Resources

    We established a volumetric trans-scale imaging system with an ultra-large field-of-view (FOV) that enables simultaneous observation of the dynamics of millions of cells in centimeter-wide three-dimensional (3D) tissues and embryos. Using a custom-made giant lens system with a magnification of ×2 and a numerical aperture (NA) of 0.25, and a CMOS camera with more than 100 megapixels, we built the trans-scale scope AMATERAS-2 and realized fluorescence imaging with a transverse spatial resolution of approximately 1.1 µm across an FOV of approximately 1.5×1.0 cm². The 3D resolving capability was realized through a combination of optical and computational sectioning techniques tailored for our low-power imaging system. We applied the imaging technique to a 1.2 cm-wide section of mouse brain and successfully observed various regions of the brain with sub-cellular resolution in a single FOV. We also performed time-lapse imaging of a 1-cm-wide vascular network during quail embryo development for over 24 hr, visualizing the movement of over 4.0×10⁵ vascular endothelial cells and quantitatively analyzing their dynamics. Our results demonstrate the potential of this technique in accelerating the production of comprehensive reference maps of all cells in organisms and tissues, which contributes to understanding developmental processes, brain functions, and pathogenesis of disease, as well as high-throughput quality checks of tissues used for transplantation medicine.
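    As a rough consistency check (an illustrative calculation, not taken from the paper, and assuming an emission wavelength near 450 nm), the stated NA of 0.25 is compatible with the reported ~1.1 µm transverse resolution via the Rayleigh criterion:

    \[ r \approx \frac{0.61\,\lambda}{\mathrm{NA}} \approx \frac{0.61 \times 0.45\ \mu\mathrm{m}}{0.25} \approx 1.1\ \mu\mathrm{m} \]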

    1. Neuroscience
    Sean M Perkins, Elom A Amematsro ... Mark M Churchland
    Research Article

    Decoders for brain-computer interfaces (BCIs) assume constraints on neural activity, chosen to reflect scientific beliefs while yielding tractable computations. Recent scientific advances suggest that the true constraints on neural activity, especially its geometry, may be quite different from those assumed by most decoders. We designed a decoder, MINT, to embrace statistical constraints that are potentially more appropriate. If those constraints are accurate, MINT should outperform standard methods that explicitly make different assumptions. Additionally, MINT should be competitive with expressive machine learning methods that can implicitly learn constraints from data. MINT performed well across tasks, suggesting its assumptions are well-matched to the data. MINT outperformed other interpretable methods in every comparison we made. MINT outperformed expressive machine learning methods in 37 of 42 comparisons. MINT’s computations are simple, scale favorably with increasing neuron counts, and yield interpretable quantities such as data likelihoods. MINT’s performance and simplicity suggest it may be a strong candidate for many BCI applications.