Unified pre- and postsynaptic long-term plasticity enables reliable and flexible learning
Abstract
Although it is well known that long-term synaptic plasticity can be expressed both pre- and postsynaptically, the functional consequences of this arrangement have remained elusive. We show that spike-timing-dependent plasticity with both pre- and postsynaptic expression develops receptive fields with reduced variability and improved discriminability compared to postsynaptic plasticity alone. These long-term modifications in receptive field statistics match recent sensory perception experiments. Moreover, learning with this form of plasticity leaves a hidden postsynaptic memory trace that enables fast relearning of previously stored information, providing a cellular substrate for memory savings. Our results reveal essential roles for presynaptic plasticity that are missed when only postsynaptic expression of long-term plasticity is considered, and suggest an experience-dependent distribution of pre- and postsynaptic strength changes.
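The abstract summarizes a plasticity rule in which long-term changes are expressed at both the pre- and postsynaptic side of the synapse. As a rough illustration of that idea (not the authors' actual model), the sketch below assumes the synaptic efficacy factorizes into a presynaptic release probability p and a postsynaptic quantal amplitude q, and lets a pair-based STDP rule assign potentiation to the postsynaptic component and depression to the presynaptic one. All parameter names, values, and the particular split between loci are illustrative assumptions.

```python
import numpy as np

# Minimal pair-based STDP sketch with split pre-/postsynaptic expression.
# Assumption (not the authors' exact model): synaptic efficacy factorizes as
# w = p * q, where p is a presynaptic release probability (bounded in [0, 1])
# and q is a postsynaptic quantal amplitude. Parameters are illustrative.

rng = np.random.default_rng(0)

tau_plus, tau_minus = 20.0, 20.0   # STDP time constants (ms)
A_plus, A_minus = 0.01, 0.012      # potentiation / depression learning rates

p = 0.5                            # presynaptic release probability
q = 1.0                            # postsynaptic quantal amplitude (a.u.)

def stdp_update(p, q, dt):
    """Update the pre- (p) and postsynaptic (q) components for one spike pair.

    dt = t_post - t_pre in ms. Here potentiation (dt > 0) is expressed
    postsynaptically and depression (dt < 0) presynaptically -- one possible
    split, used only to show how both loci can carry the same weight change.
    """
    if dt > 0:                     # pre before post: potentiate q
        q += A_plus * np.exp(-dt / tau_plus)
    else:                          # post before pre: depress p
        p -= A_minus * np.exp(dt / tau_minus) * p
    return float(np.clip(p, 0.0, 1.0)), q

# Drive the synapse with random pre/post spike-time differences.
for _ in range(1000):
    dt = rng.uniform(-50.0, 50.0)
    p, q = stdp_update(p, q, dt)

print(f"release probability p = {p:.3f}, quantal amplitude q = {q:.3f}, "
      f"effective weight w = p*q = {p * q:.3f}")
```

In such a factorized scheme the postsynaptic component q can retain a trace of past learning even after the presynaptic component p is depressed, which is one way to picture the "hidden postsynaptic memory trace" and fast relearning described in the abstract.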
Copyright
© 2015, Ponte Costa et al.
This article is distributed under the terms of the Creative Commons Attribution License permitting unrestricted use and redistribution provided that the original author and source are credited.