Direct neural pathways convey distinct visual information to Drosophila mushroom bodies

  1. Katrin Vogt
  2. Yoshinori Aso
  3. Toshihide Hige
  4. Stephan Knapek
  5. Toshiharu Ichinose
  6. Anja B Friedrich
  7. Glenn C Turner
  8. Gerald M Rubin
  9. Hiromu Tanimoto  Is a corresponding author
  1. Harvard University, United States
  2. Janelia Research Campus, Howard Hughes Medical Institute, United States
  3. Max-Planck Institut für Neurobiologie, Germany

Abstract

Previously, we identified that visual and olfactory associative memories of Drosophila share the mushroom body (MB) circuits (Vogt et al., 2014). Despite well-characterized odor representations in the Drosophila MB, the MB circuit for visual information has remained unknown. Here we show that a small subset of MB Kenyon cells (KCs) selectively responds to visual but not olfactory stimulation. The dendrites of these atypical KCs form a ventral accessory calyx (vAC), distinct from the main calyx that receives olfactory input. We identified two types of visual projection neurons (VPNs) directly connecting the optic lobes and the vAC. Strikingly, these VPNs are differentially required for visual memories of color and brightness. The segregation of visual and olfactory domains in the MB allows independent processing of distinct sensory memories and may be a conserved form of sensory representation among insects.

Article and author information

Author details

  1. Katrin Vogt

    Center for Brain Science, Harvard University, Cambridge, United States
    Competing interests
    The authors declare that no competing interests exist.
  2. Yoshinori Aso

    Janelia Research Campus, Howard Hughes Medical Institute, Ashburn, United States
    Competing interests
    The authors declare that no competing interests exist.
  3. Toshihide Hige

    Janelia Research Campus, Howard Hughes Medical Institute, Ashburn, United States
    Competing interests
    The authors declare that no competing interests exist.
  4. Stephan Knapek

    Max-Planck Institut für Neurobiologie, Martinsried, Germany
    Competing interests
    The authors declare that no competing interests exist.
  5. Toshiharu Ichinose

    Max-Planck Institut für Neurobiologie, Martinsried, Germany
    Competing interests
    The authors declare that no competing interests exist.
  6. Anja B Friedrich

    Max-Planck Institut für Neurobiologie, Martinsried, Germany
    Competing interests
    The authors declare that no competing interests exist.
  7. Glenn C Turner

    Janelia Research Campus, Howard Hughes Medical Institute, Ashburn, United States
    Competing interests
    The authors declare that no competing interests exist.
  8. Gerald M Rubin

    Janelia Research Campus, Howard Hughes Medical Institute, Ashburn, United States
    Competing interests
    The authors declare that no competing interests exist.
  9. Hiromu Tanimoto

    Max-Planck Institut für Neurobiologie, Martinsried, Germany
    For correspondence
    hiromut@m.tohoku.ac.jp
    Competing interests
    The authors declare that no competing interests exist.

Copyright

© 2016, Vogt et al.

This article is distributed under the terms of the Creative Commons Attribution License, permitting unrestricted use and redistribution provided that the original author and source are credited.


Cite this article

  1. Katrin Vogt
  2. Yoshinori Aso
  3. Toshihide Hige
  4. Stephan Knapek
  5. Toshiharu Ichinose
  6. Anja B Friedrich
  7. Glenn C Turner
  8. Gerald M Rubin
  9. Hiromu Tanimoto
(2016)
Direct neural pathways convey distinct visual information to Drosophila mushroom bodies
eLife 5:e14009.
https://doi.org/10.7554/eLife.14009

