Reconstruction of natural images from responses of primate retinal ganglion cells
Abstract
The visual message conveyed by a retinal ganglion cell (RGC) is often summarized by its spatial receptive field, but in principle also depends on the responses of other RGCs and natural image statistics. This possibility was explored by linear reconstruction of natural images from responses of the four numerically-dominant macaque RGC types. Reconstructions were highly consistent across retinas. The optimal reconstruction filter for each RGC – its visual message – reflected natural image statistics, and resembled the receptive field only when nearby, same-type cells were included. ON and OFF cells conveyed largely independent, complementary representations, and parasol and midget cells conveyed distinct features. Correlated activity and nonlinearities had statistically significant but minor effects on reconstruction. Simulated reconstructions, using linear-nonlinear cascade models of RGC light responses that incorporated measured spatial properties and nonlinearities, produced similar results. Spatiotemporal reconstructions exhibited similar spatial properties, suggesting that the results are relevant for natural vision.
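The linear reconstruction described in the abstract can be illustrated with a minimal least-squares sketch. Everything below is a hypothetical stand-in, not the paper's pipeline: the synthetic "images", "receptive fields", cell counts, and noise level are invented for illustration, and the reconstruction filters are simply the least-squares mapping from responses back to pixels.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sizes: 1000 image patches of 64 pixels, encoded by 20 model
# cells with random linear receptive fields plus additive noise.
n_samples, n_pixels, n_cells = 1000, 64, 20
images = rng.standard_normal((n_samples, n_pixels))
rf = rng.standard_normal((n_pixels, n_cells)) / np.sqrt(n_pixels)
responses = images @ rf + 0.1 * rng.standard_normal((n_samples, n_cells))

# Optimal linear reconstruction filters: each row of W is one cell's
# "visual message", i.e. its contribution to the reconstructed image.
W, *_ = np.linalg.lstsq(responses, images, rcond=None)

reconstructed = responses @ W
mse = np.mean((reconstructed - images) ** 2)
baseline = np.mean(images ** 2)  # error of reconstructing zero everywhere
print(W.shape, mse < baseline)
```

Because the filters are fit jointly across all cells, each cell's filter depends on the responses of the other cells included in the fit, which is the sense in which a cell's "visual message" can differ from its receptive field.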
Data availability
Code and data to generate all of the summary plots are included in the supporting files. We are not able to release the raw voltage recordings, which total more than 5 TB and require a complex processing pipeline. This paper presents only the first analysis of these large data sets, which were collected over many years in a lab-wide collaboration and are still in use by students in our lab for other projects and papers funded by the grants used to acquire them. We will be happy to work directly with specific researchers to release additional data to them for the purposes of replication only, but not for further use, until we have had an opportunity to complete our analysis of the data and the PhD students doing this work have been able to publish their findings.
Article and author information
Author details
Funding
National Science Foundation (NSF IGERT 0801700)
- Nora Brackbill
National Science Foundation (CRCNS Grant IIS-1430348)
- E J Chichilnisky
National Science Foundation (GRFP DGE-114747)
- Nora Brackbill
- Colleen Rhoades
National Eye Institute (F31EY027166)
- Colleen Rhoades
Pew Charitable Trusts (Fellowship in Biomedical Sciences)
- Alexander Sher
John Chen (donation)
- Alan M Litke
National Institutes of Health (R01EY017992)
- E J Chichilnisky
National Institutes of Health (R01-EY029247)
- E J Chichilnisky
National Eye Institute (R01-EY029247)
- E J Chichilnisky
National Institutes of Health (CRCNS Grant IIS-1430348)
- E J Chichilnisky
The funders had no role in study design, data collection and interpretation, or the decision to submit the work for publication.
Reviewing Editor
- Markus Meister, California Institute of Technology, United States
Ethics
Animal experimentation: Eyes were removed from terminally anesthetized macaque monkeys (Macaca mulatta, Macaca fascicularis) used by other laboratories in the course of their experiments, in accordance with Institutional Animal Care and Use Committee guidelines. All of the animals were handled according to approved institutional animal care and use committee (IACUC) protocols (#28860) of Stanford University. The protocol was approved by the Administrative Panel on Laboratory Animal Care of Stanford University (Assurance Number: A3213-01).
Version history
- Received: May 2, 2020
- Accepted: November 2, 2020
- Accepted Manuscript published: November 4, 2020 (version 1)
- Version of Record published: December 21, 2020 (version 2)
Copyright
© 2020, Brackbill et al.
This article is distributed under the terms of the Creative Commons Attribution License, permitting unrestricted use and redistribution provided that the original author and source are credited.
Metrics
- 2,607 views
- 424 downloads
- 31 citations
Views, downloads and citations are aggregated across all versions of this paper published by eLife.
Download links
Downloads (link to download the article as PDF)
Open citations (links to open the citations from this article in various online reference manager services)
Cite this article (links to download the citations from this article in formats compatible with various reference manager tools)
Further reading
- Neuroscience: Probing memory of a complex visual image within a few hundred milliseconds after its disappearance reveals significantly greater fidelity of recall than if the probe is delayed by as little as a second. Classically interpreted, the former taps into a detailed but rapidly decaying visual sensory or ‘iconic’ memory (IM), while the latter relies on capacity-limited but comparatively stable visual working memory (VWM). While iconic decay and VWM capacity have been extensively studied independently, currently no single framework quantitatively accounts for the dynamics of memory fidelity over these time scales. Here, we extend a stationary neural population model of VWM with a temporal dimension, incorporating rapid sensory-driven accumulation of activity encoding each visual feature in memory, and a slower accumulation of internal error that causes memorized features to randomly drift over time. Instead of facilitating read-out from an independent sensory store, an early cue benefits recall by lifting the effective limit on VWM signal strength imposed when multiple items compete for representation, allowing memory for the cued item to be supplemented with information from the decaying sensory trace. Empirical measurements of human recall dynamics validate these predictions while excluding alternative model architectures. A key conclusion is that differences in capacity classically thought to distinguish IM and VWM are in fact contingent upon a single resource-limited WM store.
- Neuroscience: Our ability to recall details from a remembered image depends on a single mechanism that is engaged from the very moment the image disappears from view.