Peer review process
Not revised: This Reviewed Preprint includes the authors’ original preprint (without revision), an eLife assessment, and public reviews.
Editors
- Reviewing Editor: Peter Latham, University College London, London, United Kingdom
- Senior Editor: Floris de Lange, Donders Institute for Brain, Cognition and Behaviour, Nijmegen, Netherlands
Joint Public Review:
Summary:
Given the cost of producing action potentials and transmitting them along axons, it has always seemed a bit strange that there are synaptic failures: when a spike arrives at a synapse, about half the time nothing happens. One explanation comes from a Bayesian inference perspective: because of noise and limited information, the best a synapse can do is compute a probability distribution over its true weight; to communicate the resulting uncertainty it samples from that distribution. In this view, failures are a means of sampling from a synapse's probability distribution. Here the authors offer another explanation: energy efficiency. In this view, synaptic parameters (mean and variance of the synaptic weights) are adapted to perform some task while penalising small variances, which, the authors show, are energetically expensive.
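The sampling view described above can be made concrete with a toy simulation (an illustration of the general idea, not the paper's model): if transmission on each presynaptic spike is all-or-nothing with release probability p and quantal size q, then the stream of successes and failures realises a distribution over synaptic efficacy with mean pq and variance p(1-p)q², so failures themselves act as the sampling mechanism.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative parameters (not taken from the paper): release
# probability p and quantal size q for a single synaptic contact.
p, q = 0.5, 1.0
n_spikes = 100_000

# On each presynaptic spike, transmission either fails (0) or succeeds (q).
released = rng.random(n_spikes) < p
psp = np.where(released, q, 0.0)

# The empirical moments match the Bernoulli-scaled distribution:
# mean = p*q, variance = p*(1-p)*q**2.
print(psp.mean())  # close to 0.5
print(psp.var())   # close to 0.25
```

With p = 0.5, failures occur on about half of the spikes, which is exactly the regime the review describes.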
The authors show both numerically and analytically the strong link between those two frameworks. In particular, both frameworks predict that (a) synaptic variance should decrease when the input firing rate increases and (b) the learning rate should increase when the weight variances increase. Both predictions have some experimental support.
Finally, the authors relate the cost of small variance to the cost used in variational Bayesian inference. Intriguingly, the biophysical cost provides a lower bound on the variational inference cost. This is intellectually satisfying, as it answers a "why" question: why would evolution produce the kind of costs seen in the brain?
Strengths:
1. The paper is very well written and the arguments are clearly presented. The tight link between the Bayesian inference and energy efficiency perspectives is elegant and well supported, with both numerical simulations and analytical arguments.
2. A key component of the paper is the derivation of the reliability cost as a function of different biophysical mechanisms (calcium efflux, vesicle membrane, actin, and trafficking). Independent of the proposed mapping between the Bayesian inference perspective and the energy efficiency perspective, those reliability costs (expressed as power-law relationships) will be important for further studies on synaptic energetics.
3. The extended appendices, which are generally easy to read, provide additional mathematical insight.
Weaknesses:
1. The authors face a technical challenge (which they acknowledge): they use two numbers (mean and variance) to characterize synaptic variability, whereas in the brain there are three numbers (number of vesicles, release probability, and quantal size). Turning biological constraints into constraints on the variance, as is done in the paper, seems somewhat arbitrary. This by no means invalidates the results, but it means that future experimental tests of their model will be somewhat nuanced.
2. The prediction that the learning rate should increase with variability relies on an optimization scheme in which the learning rate is scaled by the inverse of the magnitude of the gradients (Eq. 7). This seems like an extra assumption; the energy efficiency framework by itself does not predict that the learning rate should increase with variability. Further work will be needed to disentangle the assumption about the optimization scheme from the energy efficiency framework.
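The kind of optimization scheme referred to above can be sketched as follows (an assumed normalised-gradient form for illustration, not the authors' Eq. 7 itself): each parameter's update is the raw gradient divided by a running estimate of that parameter's gradient magnitude, so the effective learning rate is larger where gradients are small.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical constants for the sketch.
eta, eps, beta = 0.01, 1e-8, 0.9

def normalised_step(grad, mag_est):
    """One update with the step scaled by the inverse of a running
    estimate of the gradient magnitude; returns (step, new estimate)."""
    mag_est = beta * mag_est + (1 - beta) * np.abs(grad)
    return eta * grad / (mag_est + eps), mag_est

# Two toy parameters whose raw gradients differ in scale by 100x.
scales = np.array([1.0, 0.01])
mag_est = scales.copy()          # initialise the magnitude estimates
steps = []
for _ in range(1000):
    grads = scales * rng.normal(1.0, 0.1, size=2)
    step, mag_est = normalised_step(grads, mag_est)
    steps.append(np.abs(step))
mean_steps = np.mean(steps, axis=0)

# Despite the 100x difference in raw gradient scale, the normalised
# steps are of comparable size: the effective learning rate is larger
# for the parameter with smaller gradients.
print(mean_steps)
```

This makes the reviewers' point concrete: the "learning rate grows with variability" prediction comes from this extra normalisation assumption, not from the energy-efficiency objective alone.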