Sensorimotor pathway controlling stopping behavior during chemotaxis in the Drosophila melanogaster larva
Abstract
Sensory navigation results from coordinated transitions between distinct behavioral programs. During chemotaxis in the Drosophila melanogaster larva, the detection of positive odor gradients extends runs while negative gradients promote stops and turns. This algorithm represents a foundation for the control of sensory navigation across phyla. In the present work, we identified an olfactory descending neuron, PDM-DN, which plays a pivotal role in the organization of stops and turns in response to the detection of graded changes in odor concentrations. Artificial activation of this descending neuron induces deterministic stops followed by the initiation of turning maneuvers through head casts. Using electron microscopy, we reconstructed the main pathway that connects the PDM-DN neuron to the peripheral olfactory system and to the pre-motor circuit responsible for the actuation of forward peristalsis. Our results set the stage for a detailed mechanistic analysis of the sensorimotor conversion of graded olfactory inputs into action selection to perform goal-oriented navigation.
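The run/stop/turn rule summarized in the abstract can be illustrated with a short simulation. The following is a minimal, hypothetical sketch, not the authors' code: the odor landscape, parameter values, and function names are illustrative assumptions. It implements the described logic that positive concentration changes along a run suppress stopping (extending runs) while negative changes promote stops followed by a reorienting turn.

```python
import numpy as np

# Minimal sketch of the run/stop/turn chemotaxis rule described in the
# abstract. All parameters, the Gaussian odor landscape, and the turn
# distribution are illustrative assumptions, not taken from the paper.

rng = np.random.default_rng(0)

def odor(pos, source=np.array([0.0, 0.0])):
    # Hypothetical Gaussian odor landscape centered on a source.
    return np.exp(-np.sum((pos - source) ** 2) / 50.0)

def simulate(steps=500, speed=0.1, base_stop_p=0.05, gain=5.0):
    pos = np.array([5.0, 5.0])
    heading = rng.uniform(0.0, 2.0 * np.pi)
    prev_c = odor(pos)
    track = [pos.copy()]
    for _ in range(steps):
        pos = pos + speed * np.array([np.cos(heading), np.sin(heading)])
        c = odor(pos)
        dc = c - prev_c  # graded change in odor concentration along the run
        prev_c = c
        # Negative gradients raise the stop probability; positive gradients
        # suppress it, which effectively extends the run.
        stop_p = np.clip(base_stop_p - gain * dc, 0.0, 1.0)
        if rng.random() < stop_p:
            # Stop, then reorient with a head-cast-like turn.
            heading += rng.uniform(-np.pi / 2, np.pi / 2)
        track.append(pos.copy())
    return np.array(track)

track = simulate()
print("final distance to source:", np.linalg.norm(track[-1]))
```

Even this crude gradient-gated stopping rule tends to bring the simulated agent closer to the source than an unbiased random walk, which is the sense in which the algorithm supports goal-oriented navigation.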
Data availability
Scripts for data analysis and source data files for the behavioral and imaging experiments are available on the GitHub account of the Louis lab (https://github.com/LabLouis/eLife2018_PDM-DN).
Article and author information
Funding
Spanish Ministry of Economy and Competitiveness (BFU2011-26208)
- Ibrahim Tastekin
- Avinash Khandelwal
- David Tadres
- Nico D Fessner
- Matthieu Louis
EU Marie Curie FP7 Programme (ITN-FLiACT)
- Ibrahim Tastekin
- Matthieu Louis
Howard Hughes Medical Institute
- James W Truman
- Marta Zlatic
- Albert Cardona
University of California, Santa Barbara
- David Tadres
- Matthieu Louis
La Caixa International PhD program
- Avinash Khandelwal
The funders had no role in study design, data collection and interpretation, or the decision to submit the work for publication.
Copyright
© 2018, Tastekin et al.
This article is distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use and redistribution provided that the original author and source are credited.
Metrics
- 5,243 views
- 558 downloads
- 63 citations
Views, downloads and citations are aggregated across all versions of this paper published by eLife.