  1. Computational and Systems Biology
  2. Neuroscience

Fast and flexible sequence induction in spiking neural networks via rapid excitability changes

  1. Rich Pang (corresponding author)
  2. Adrienne L Fairhall
  1. University of Washington, United States
Research Article
Cite this article as: eLife 2019;8:e44324 doi: 10.7554/eLife.44324

Abstract

Cognitive flexibility likely depends on modulation of the dynamics underlying how biological neural networks process information. While dynamics can be reshaped by gradually modifying connectivity, less is known about mechanisms operating on faster timescales. A compelling entry point to this problem is the observation that exploratory behaviors can rapidly cause selective hippocampal sequences to 'replay' during rest. Using a spiking network model, we asked whether simplified replay could arise from three biological components: fixed recurrent connectivity; stochastic 'gating' inputs; and rapid gating input scaling via long-term potentiation of intrinsic excitability (LTP-IE). Indeed, these enabled both forward and reverse replay of recent sensorimotor-evoked sequences, despite unchanged recurrent weights. LTP-IE 'tags' specific neurons with increased spiking probability under gating input, and ordering is reconstructed from recurrent connectivity. We further show how LTP-IE can implement temporary stimulus-response mappings. This elucidates a novel combination of mechanisms that might play a role in rapid cognitive flexibility.
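
A minimal toy sketch of this mechanism, included here purely for illustration: the recurrent weights are fixed and encode neighbour relations along a linear track, every cell receives a stochastic gating input, and LTP-IE is reduced to a per-neuron gain on that input. All parameter values and the simple threshold dynamics below are assumptions of this sketch, not the paper's spiking network model.

```python
import numpy as np

# Toy illustration of replay via rapid excitability changes (LTP-IE).
# All parameters and the threshold dynamics are hypothetical; this sketches the
# idea in the abstract and is not the paper's actual spiking network model.

rng = np.random.default_rng(0)

N = 20                                      # neurons ordered along a 1-D "track"
W = np.zeros((N, N))
for i in range(N - 1):                      # fixed, symmetric nearest-neighbour weights:
    W[i, i + 1] = W[i + 1, i] = 1.0         # the ordering lives here and never changes

tagged = np.arange(5, 15)                   # cells active during the recent episode
gain = np.ones(N)
gain[tagged] = 2.0                          # LTP-IE "tag": stronger response to gating input

def replay(seed_cell, T=80, g_rate=0.8, w_rec=1.5, thresh=2.8, decay=0.8, refrac=8):
    """Rest-period dynamics: stochastic gating input plus fixed recurrence."""
    spikes = np.zeros((T, N), dtype=bool)
    spikes[0, seed_cell] = True             # one spike seeds the replay event
    trace = np.zeros(N)                     # short-lived synaptic trace of recent spikes
    cooldown = np.zeros(N, dtype=int)
    cooldown[seed_cell] = refrac
    for t in range(1, T):
        trace = decay * trace + spikes[t - 1]
        gate = (rng.random(N) < g_rate).astype(float)    # stochastic gating input
        drive = gain * gate + w_rec * (W @ trace)
        cooldown = np.maximum(cooldown - 1, 0)
        spikes[t] = (drive > thresh) & (cooldown == 0)   # only tagged, gated cells with
        cooldown[spikes[t]] = refrac                     # recurrent drive cross threshold
    return spikes

# Seeding one end of the tagged stretch tends to give forward replay, seeding the
# other end reverse replay, with identical recurrent weights W in both cases.
# (Events can terminate early because the gating input is stochastic.)
forward = replay(seed_cell=5)
reverse = replay(seed_cell=14)
print(np.argwhere(forward)[:, 1])           # spike order typically runs from low to high index
print(np.argwhere(reverse)[:, 1])           # spike order typically runs from high to low index
```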

Article and author information

Author details

  1. Rich Pang

    Physiology and Biophysics Department, University of Washington, Seattle, United States
    For correspondence
    rpang@uw.edu
    Competing interests
    The authors declare that no competing interests exist.
    ORCID iD: 0000-0002-2644-6110
  2. Adrienne L Fairhall

    Physiology and Biophysics Department, University of Washington, Seattle, United States
    Competing interests
    The authors declare that no competing interests exist.

Funding

National Institutes of Health (R01DC013693)

  • Adrienne L Fairhall

Simons Foundation (Collaboration for the Global Brain)

  • Adrienne L Fairhall

University of Washington (Computational Neuroscience Training Grant)

  • Rich Pang

Washington Research Foundation (UW Institute for Neuroengineering)

  • Adrienne L Fairhall

National Institutes of Health (R01NS104925)

  • Adrienne L Fairhall

The funders had no role in study design, data collection and interpretation, or the decision to submit the work for publication.

Reviewing Editor

  1. Emilio Salinas, Wake Forest School of Medicine, United States

Publication history

  1. Received: December 12, 2018
  2. Accepted: May 11, 2019
  3. Accepted Manuscript published: May 13, 2019 (version 1)
  4. Version of Record published: May 28, 2019 (version 2)

Copyright

© 2019, Pang & Fairhall

This article is distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use and redistribution provided that the original author and source are credited.

Metrics

  • Page views: 2,055
  • Downloads: 306
  • Citations: 2

Article citation count generated by polling the highest count across the following sources: Crossref, PubMed Central, Scopus.


Further reading

    1. Computational and Systems Biology
    2. Neuroscience
    Davide Spalla et al.
    Research Article

    Episodic memory has a dynamic nature: when we recall past episodes, we retrieve not only their content, but also their temporal structure. The phenomenon of replay, in the hippocampus of mammals, offers a remarkable example of these dynamics. However, most quantitative models of memory treat memories as static configurations, neglecting the temporal unfolding of the retrieval process. Here, we introduce a continuous attractor network model with a memory-dependent asymmetric component in the synaptic connectivity, which spontaneously breaks the equilibrium of the memory configurations and produces dynamic retrieval. Detailed analysis of the model, with analytical calculations and numerical simulations, shows that it can robustly retrieve multiple dynamical memories, and that this feature is largely independent of the details of its implementation. By calculating the storage capacity, we show that the dynamic component does not impair memory capacity, and can even enhance it in certain regimes.
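
    A generic ring-network sketch of the idea summarised above, written for illustration: a small asymmetric component added to otherwise symmetric connectivity turns a static activity bump into a travelling one. The connectivity kernel, the sharpening update, and all parameter values below are assumptions of this sketch and are not taken from the study.

```python
import numpy as np

# Illustrative ring network (not the study's model): an asymmetric component in
# otherwise symmetric connectivity turns static recall (a fixed activity bump)
# into dynamic retrieval (a bump that travels around the ring).

N = 120
theta = 2 * np.pi * np.arange(N) / N
diff = theta[:, None] - theta[None, :]          # pairwise angular differences

def run(asym, steps=50, beta=8.0):
    """Iterate a bump under a connectivity kernel peaked at angular offset `asym`."""
    W = np.exp(np.cos(diff - asym))             # symmetric when asym == 0
    r = np.exp(5 * np.cos(theta))               # initial bump centred at angle 0
    r /= r.sum()
    peaks = []
    for _ in range(steps):
        h = W @ r                               # recurrent input
        r = np.exp(beta * h / h.max())          # sharpening nonlinearity keeps a bump
        r /= r.sum()
        peaks.append(theta[np.argmax(r)])       # track the bump's angular position
    return np.array(peaks)

print(run(asym=0.0)[[0, -1]])   # symmetric connectivity: the bump stays put
print(run(asym=0.2)[[0, -1]])   # asymmetric component: the bump drifts around the ring
```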

    1. Computational and Systems Biology
    2. Neuroscience
    Dhruva Raman, Timothy O'Leary
    Research Article

    Synaptic connections in many brain circuits fluctuate, exhibiting substantial turnover and remodelling over hours to days. Surprisingly, experiments show that most of this flux in connectivity persists in the absence of learning or known plasticity signals. How can neural circuits retain learned information despite a large proportion of ongoing and potentially disruptive synaptic changes? We address this question from first principles by analysing how much compensatory plasticity would be required to optimally counteract ongoing fluctuations, regardless of whether fluctuations are random or systematic. Remarkably, we find that the answer is largely independent of plasticity mechanisms and circuit architectures: compensatory plasticity should be at most equal in magnitude to fluctuations, and often less, in direct agreement with previously unexplained experimental observations. Moreover, our analysis shows that a high proportion of learning-independent synaptic change is consistent with plasticity mechanisms that accurately compute error gradients.