Visual and motor signatures of locomotion dynamically shape a population code for feature detection in Drosophila

Abstract

Natural vision is dynamic: as an animal moves, its visual input changes dramatically. How can the visual system reliably extract local features from an input dominated by self-generated signals? In Drosophila, diverse local visual features are represented by a group of projection neurons with distinct tuning properties. Here we describe a connectome-based volumetric imaging strategy to measure visually evoked neural activity across this population. We show that local visual features are jointly represented across the population, and that a shared gain factor improves trial-to-trial coding fidelity. A subset of these neurons, tuned to small objects, is modulated by two independent signals associated with self-movement, a motor-related signal and a visual motion signal associated with rotation of the animal. These two inputs adjust the sensitivity of these feature detectors across the locomotor cycle, selectively reducing their gain during saccades and restoring it during intersaccadic intervals. This work reveals a strategy for reliable feature detection during locomotion.
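
To make the central idea concrete, the toy simulation below (a minimal illustrative sketch in Python, not the authors' model; the population size and all parameters are made up) shows how a shared gain that drops during saccades and recovers between them would attenuate and then restore a population's response to a small object.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy population of feature-tuned units sharing a common gain signal.
n_units = 5             # hypothetical number of feature-detecting units
n_timepoints = 200

# Stimulus drive: a small object appears briefly in the receptive field.
stimulus = np.zeros(n_timepoints)
stimulus[80:120] = 1.0

# Fixed tuning weights for each unit (arbitrary values for illustration).
tuning = rng.uniform(0.5, 1.5, size=n_units)

# Shared gain: 1.0 during intersaccadic intervals, suppressed during a saccade.
gain = np.ones(n_timepoints)
saccade = slice(95, 115)          # saccade overlapping the stimulus
gain[saccade] = 0.3

# Each unit's response = tuning x shared gain x stimulus + private noise.
noise = 0.05 * rng.standard_normal((n_units, n_timepoints))
responses = tuning[:, None] * gain[None, :] * stimulus[None, :] + noise

# The population response to the object is attenuated during the saccade
# and restored in the surrounding intersaccadic interval.
print("mean response during saccade:  ", responses[:, 95:115].mean().round(3))
print("mean response outside saccade: ", responses[:, 80:95].mean().round(3))
```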

Data availability

All software and code are available on GitHub. Main analysis, modeling, and figure generation code is available at https://github.com/mhturner/glom_pop. Visual stimulus code is available at https://github.com/ClandininLab/visanalysis and https://github.com/ClandininLab/flystim. Extracted ROI responses and associated stimulus metadata, along with the raw imaging data, are available in a Dryad repository: https://doi.org/10.5061/dryad.h44j0zpp8.
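
As a starting point for working with the shared data, the snippet below sketches one way to load extracted ROI responses into Python. The file name, HDF5 layout, and key names are placeholders rather than a documented interface; consult the glom_pop and visanalysis repositories for the actual data format.

```python
import h5py
import numpy as np

# Placeholder path: substitute a file obtained from the Dryad repository
# (https://doi.org/10.5061/dryad.h44j0zpp8); the name here is hypothetical.
data_path = "example_roi_responses.h5"

with h5py.File(data_path, "r") as f:
    # Hypothetical layout: one dataset of ROI responses (ROIs x timepoints)
    # plus stimulus metadata stored as file attributes.
    roi_responses = np.array(f["roi_responses"])              # placeholder key
    stimulus_name = f.attrs.get("stimulus_name", "unknown")   # placeholder attribute

print(f"Loaded {roi_responses.shape[0]} ROIs x {roi_responses.shape[1]} timepoints "
      f"for stimulus: {stimulus_name}")
```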


Article and author information

Author details

  1. Maxwell H Turner

    Department of Neurobiology, Stanford University, Stanford, United States
    Competing interests
    The authors declare that no competing interests exist.
    ORCID icon "This ORCID iD identifies the author of this article:" 0000-0002-4164-9995
  2. Avery Krieger

    Department of Neurobiology, Stanford University, Stanford, United States
    Competing interests
    The authors declare that no competing interests exist.
  3. Michelle M Pang

    Department of Neurobiology, Stanford University, Stanford, United States
    Competing interests
    The authors declare that no competing interests exist.
  4. Thomas R Clandinin

    Department of Neurobiology, Stanford University, Stanford, United States
    For correspondence
    trc@stanford.edu
    Competing interests
    The authors declare that no competing interests exist.
    ORCID icon "This ORCID iD identifies the author of this article:" 0000-0001-6277-6849

Funding

National Institutes of Health (F32-MH118707)

  • Maxwell H Turner

National Institutes of Health (K99-EY032549)

  • Maxwell H Turner

National Institutes of Health (R01-EY022638)

  • Thomas R Clandinin

National Institutes of Health (R01NS110060)

  • Thomas R Clandinin

National Science Foundation (GRFP)

  • Avery Krieger

National Defense Science and Engineering Graduate (Fellowship)

  • Michelle M Pang

The funders had no role in study design, data collection and interpretation, or the decision to submit the work for publication.

Reviewing Editor

  1. Matthieu Louis, University of California, Santa Barbara, United States

Version history

  1. Preprint posted: July 14, 2022
  2. Received: August 10, 2022
  3. Accepted: October 25, 2022
  4. Accepted Manuscript published: October 27, 2022 (version 1)
  5. Accepted Manuscript updated: October 28, 2022 (version 2)
  6. Version of Record published: November 11, 2022 (version 3)

Copyright

© 2022, Turner et al.

This article is distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use and redistribution provided that the original author and source are credited.

Cite this article

  1. Maxwell H Turner
  2. Avery Krieger
  3. Michelle M Pang
  4. Thomas R Clandinin
(2022)
Visual and motor signatures of locomotion dynamically shape a population code for feature detection in Drosophila
eLife 11:e82587.
https://doi.org/10.7554/eLife.82587

