Visual and motor signatures of locomotion dynamically shape a population code for feature detection in Drosophila
Abstract
Natural vision is dynamic: as an animal moves, its visual input changes dramatically. How can the visual system reliably extract local features from an input dominated by self-generated signals? In Drosophila, diverse local visual features are represented by a group of projection neurons with distinct tuning properties. Here we describe a connectome-based volumetric imaging strategy to measure visually evoked neural activity across this population. We show that local visual features are jointly represented across the population, and that a shared gain factor improves trial-to-trial coding fidelity. A subset of these neurons, tuned to small objects, is modulated by two independent signals associated with self-movement, a motor-related signal and a visual motion signal associated with rotation of the animal. These two inputs adjust the sensitivity of these feature detectors across the locomotor cycle, selectively reducing their gain during saccades and restoring it during intersaccadic intervals. This work reveals a strategy for reliable feature detection during locomotion.
Data availability
All software and code are available on GitHub. Main analysis, modeling, and figure generation code can be found here: https://github.com/mhturner/glom_pop. Visual stimulus code can be found here: https://github.com/ClandininLab/visanalysis and here: https://github.com/ClandininLab/flystim. Extracted ROI responses and associated stimulus metadata, along with raw imaging data, can be found in a Dryad repository here: https://doi.org/10.5061/dryad.h44j0zpp8.
- Data from: Visual and motor signatures of locomotion dynamically shape a population code for feature detection in Drosophila. Dryad Digital Repository, doi:10.5061/dryad.h44j0zpp8.
Article and author information
Funding
National Institutes of Health (F32-MH118707)
- Maxwell H Turner
National Institutes of Health (K99-EY032549)
- Maxwell H Turner
National Institutes of Health (R01-EY022638)
- Thomas R Clandinin
National Institutes of Health (R01NS110060)
- Thomas R Clandinin
National Science Foundation (GRFP)
- Avery Krieger
National Defense Science and Engineering Graduate Fellowship
- Michelle M Pang
The funders had no role in study design, data collection and interpretation, or the decision to submit the work for publication.
Copyright
© 2022, Turner et al.
This article is distributed under the terms of the Creative Commons Attribution License, permitting unrestricted use and redistribution provided that the original author and source are credited.
Metrics
- 2,036 views
- 252 downloads
- 20 citations
Views, downloads and citations are aggregated across all versions of this paper published by eLife.
Further reading
- Neuroscience
Identical stimuli can be perceived or go unnoticed across successive presentations, producing divergent behavioral outcomes despite similarities in sensory input. We sought to understand how fluctuations in behavioral state and cortical layer- and cell class-specific neural activity underlie this perceptual variability. We analyzed physiological measurements of state and laminar electrophysiological activity in visual area V4 while monkeys were rewarded for correctly reporting a stimulus change at perceptual threshold. Hit trials were characterized by a behavioral state with heightened arousal, greater eye position stability, and enhanced decoding performance of stimulus identity from neural activity. Target stimuli evoked stronger responses in V4 in hit trials, and excitatory neurons in the superficial layers, the primary feed-forward output of the cortical column, exhibited lower variability. Feed-forward interlaminar population correlations were stronger on hits. Hit trials were further characterized by greater synchrony between the output layers of the cortex during spontaneous activity, while the stimulus-evoked period showed elevated synchrony in the feed-forward pathway. Taken together, these results suggest that a state of elevated arousal and stable retinal images allows enhanced processing of sensory stimuli, which contributes to hits at perceptual threshold.
- Neuroscience
Significant technical challenges exist when measuring synaptic connections between neurons in living brain tissue. The patch clamping technique, when used to probe for synaptic connections, is manually laborious and time-consuming. To improve its efficiency, we pursued another approach: instead of retracting all patch clamping electrodes after each recording attempt, we cleaned just one of them and reused it to obtain another recording while maintaining the others. With one new patch clamp recording attempt, many new connections can be probed. By placing one pipette in front of the others in this way, one can ‘walk’ across the mouse brain slice, termed ‘patch-walking.’ We performed 136 patch clamp attempts for two pipettes, achieving 71 successful whole-cell recordings (52.2%). Of these, we probed 29 pairs (i.e. 58 bidirectionally probed connections) averaging 91 μm intersomatic distance, finding three connections. Patch-walking yields 80–92% more probed connections for experiments with 10–100 cells than the traditional synaptic connection searching method.