New genetic tools for mushroom body output neurons in Drosophila
eLife assessment
This work builds on two Aso et al., 2014 eLife papers to describe further resources that are valuable for the field. The paper identifies and contributes additional MBON split-GAL4 lines, convincingly describing their anatomy, connectivity, and function.
Valuable: Findings that have theoretical or practical implications for a subfield
Convincing: Appropriate and validated methodology in line with current state-of-the-art
Abstract
How memories of past events influence behavior is a key question in neuroscience. The major associative learning center in Drosophila, the mushroom body (MB), communicates with the rest of the brain through mushroom body output neurons (MBONs). While 21 MBON cell types have their dendrites confined to small compartments of the MB lobes, analysis of EM connectomes revealed the presence of an additional 14 MBON cell types that are atypical in having dendritic input both within the MB lobes and in adjacent brain regions. Genetic reagents for manipulating atypical MBONs and experimental data on their functions have been lacking. In this report, we describe new cell-type-specific GAL4 drivers for many MBONs, including the majority of atypical MBONs, that extend the collection of MBON driver lines we have previously generated (Aso et al., 2014a; Aso et al., 2016; Aso et al., 2019). Using these genetic reagents, we conducted optogenetic activation screening to examine their ability to drive behaviors and learning. These reagents provide important new tools for the study of complex behaviors in Drosophila.
Introduction
The mushroom body (MB) is the major site of associative learning in insects (reviewed in Heisenberg, 2003; Modi et al., 2020). In the MB of each Drosophila brain hemisphere, multiple modalities of sensory stimuli are represented by the sparse activity of ~2000 Kenyon cells (KCs) whose parallel axonal fibers form the MB lobes. The lobes are further divided into compartments by the innervation patterns of dopaminergic neuron (DAN) axons and mushroom body output neuron (MBON) dendrites. MBONs provide the convergence element of the MB’s three-layer divergent-convergent circuit architecture and the outputs of the MBONs drive learned behaviors.
Whereas the dendrites of typical MBONs are strictly confined to the MB lobes, analysis of the Drosophila connectome in larvae (Eichler et al., 2017) and adults (Li et al., 2020; Scheffer et al., 2020) revealed a new class of ‘atypical’ MBONs, consisting of 14 cell types in adults, that have part of their dendritic arbors outside the MB lobes, allowing them to integrate input from KCs with other information (Li et al., 2020). Some atypical MBONs receive dendritic input from other MBONs. Several provide output onto DANs that innervate the MB to participate in recurrent networks. At least five make strong, direct synaptic contact onto descending neurons that drive motor action. Three provide strong direct connections to tangential neurons of the fan-shaped body of the central complex (CX). However, analysis of the behaviors mediated by atypical MBONs has been limited by the lack of genetic drivers needed to manipulate their activity.
Here, we report the generation and characterization of cell-type-specific split-GAL4 driver lines for the majority of the atypical MBONs. We also provide driver lines for two typical MBON types for which cell-type-specific split-GAL4 drivers were not previously available, and improved drivers for several other MBONs. We demonstrate the use of these new split-GAL4 lines in two behavioral assays. Using a four-armed olfactory arena equipped with optogenetic LED panels (Aso and Rubin, 2016; Pettersson, 1970), we assessed the ability of the labeled neurons to act as the unconditioned stimulus in an olfactory learning assay, an indication of their regulation of the activity of DANs. We also measured the effects of their optogenetic activation on kinematic parameters relevant for olfactory navigation. These reagents provide important new tools for the study of complex behaviors in Drosophila.
Results and discussion
Generation and characterization of split-GAL4 lines for MBONs
We generated split-GAL4 genetic driver lines corresponding to individual MBON cell types using well-established methods (Dionne et al., 2018; Luan et al., 2006; Pfeiffer et al., 2010; Tirian and Dickson, 2017). The morphologies of the MBONs, produced by electron microscopic reconstruction, were used to search databases of light microscopic images to identify enhancers whose expression patterns might yield clean driver lines for that MBON when intersected (Otsuna et al., 2018; Meissner et al., 2023). We took advantage of an expanded set of starting reagents that were not available when Aso et al., 2014a, generated the original set of split-GAL4 drivers for the MB cell types: in addition to the ~7000 GAL4 expression patterns described in Jenett et al., 2012, we had access to an additional ~9000 GAL4 expression patterns (Tirian and Dickson, 2017). A total of approximately 600 intersections were experimentally tested to generate the split-GAL4 lines reported here.
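The image-database search described above is, conceptually, a template-matching problem: an EM-derived neuron mask is compared against light-microscopy maximum-intensity projections in which depth is encoded as color (the approach of tools such as the one described in Otsuna et al., 2018). The sketch below is a minimal, simplified illustration of that idea only, not the pipeline used in this study; the file names, line names, and thresholds are hypothetical.

```python
# Minimal sketch of color-depth MIP matching for finding candidate enhancer
# lines that cover an EM-reconstructed MBON. Illustration only; not the
# authors' pipeline. File names, line names, and thresholds are hypothetical.
import numpy as np
from PIL import Image
import colorsys

def color_depth_score(mask_png, candidate_png, hue_tol=0.1):
    """Count pixels where the candidate MIP overlaps the EM mask at a similar
    depth (depth is encoded as hue in a color-depth MIP)."""
    mask = np.asarray(Image.open(mask_png).convert("RGB"), dtype=float) / 255.0
    cand = np.asarray(Image.open(candidate_png).convert("RGB"), dtype=float) / 255.0
    # Foreground = pixels bright enough to be signal in BOTH images.
    fg = (mask.max(axis=2) > 0.1) & (cand.max(axis=2) > 0.1)

    def hue(img):
        # Convert RGB to hue so that "similar color" means "similar depth".
        h = np.zeros(img.shape[:2])
        for y, x in zip(*np.nonzero(fg)):
            h[y, x] = colorsys.rgb_to_hsv(*img[y, x])[0]
        return h

    dh = np.abs(hue(mask) - hue(cand))
    dh = np.minimum(dh, 1.0 - dh)            # hue wraps around the color circle
    return int(np.sum(fg & (dh < hue_tol)))  # higher score = better 3D overlap

# Rank hypothetical candidate lines by overlap with an MBON's EM-derived mask:
# scores = {line: color_depth_score("MBON_EM_mask.png", f"{line}_MIP.png")
#           for line in ["lineA", "lineB"]}
```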
Figure 1 shows examples of expression patterns of some of the highest quality split-GAL4 lines. For many cell types we were able to identify multiple different split-GAL4 lines. The brain and ventral nerve cord expression patterns of all split-GAL4 lines are shown in Figure 1—figure supplement 1 for atypical MBONs and Figure 1—figure supplement 2 for typical MBONs. The original confocal stacks from which these figures were generated, as well as additional image data, are available for download at https://splitgal4.janelia.org. Videos 1 and 2 provide examples of comparisons between light microscopic images from these lines and neuronal skeletons from the hemibrain dataset (Scheffer et al., 2020) that were used to confirm the assignment of cell-type identity. Figure 1—figure supplement 3 summarizes what we consider to be the best available split-GAL4 lines for each of the MBON types identified by connectomics, based on a combination of the lines presented here and in previous studies. The expression patterns shown in this paper were obtained using an antibody against GFP, which visualizes expression from 20xUAS-CsChrimson-mVenus in attP18. Directly visualizing the optogenetic effector is important since expression intensity, the number of labeled MBONs, and off-target expression can differ when other UAS-reporter/effectors are used (e.g. see Figure 2—figure supplement 1 of Aso et al., 2014a).
For typical MBONs, we provide split-GAL4 lines for two cell types for which drivers were not described in Aso et al., 2014a, MBON21 and MBON23. We also provide improved lines for MBON04 and MBON19; previous drivers for MBON04 also had expression in other MBON cell types and our new line for MBON19 has less off-target expression (see Aso et al., 2014a, for previous lines).
For atypical MBONs we were able to generate multiple, independent driver lines for MBON20, MBON29, MBON30, and MBON33 that provide strong and specific expression. Several lines for MBON31 were generated, but they are also stochastically expressed in MBON32. Lines for MBON26 and MBON35 have some off-target expression that might compromise their use in behavioral assays, but they should permit cell-type-specific imaging studies. We failed to generate lines for MBON24, MBON25, MBON27, MBON32, and MBON34. We identified two candidate lines for MBON28 but because the morphology of MBON28 is very similar to those of MBON16 and MBON17 we were not able to make a definitive cell-type assignment (see Figure 1—figure supplement 3).
Activation phenotypes of MBON lines
MBONs are the first layer of neurons that transform memories stored inside the MB into actions. MBONs can also provide input to the dendrites of the DANs that innervate the MB lobes, forming a recurrent network. To investigate these two aspects of MBON function, we used a four-armed olfactory arena equipped with LED panels for optogenetics (Figure 2A). A group of ~20 starved flies that express CsChrimson in a particular MBON cell type was subjected to a series of optogenetic olfactory conditioning and memory tests, and then to six trials of activation in the absence of odors but with airflow (Figure 2B). Using the same setup and similar protocols, we have previously characterized the dynamics of memories formed by optogenetically activating DAN cell types innervating different MB compartments (Aso and Rubin, 2016) and analyzed circuits downstream of the MBONs that promote upwind locomotion (Aso et al., 2023). Figure 2C displays the results of these behavioral experiments sorted by the mean memory score of each cell type in the final memory test. When possible, we ran multiple independent split-GAL4 lines for the same MBON cell type. The concurrence of phenotypes in these cases provides further evidence that the observed effects are due to the activation of that cell type rather than off-target expression or genetic background effects.
Training flies by pairing odor presentation with activation of MBON21, using either of the two lines tested (SS81853 and SS81521), resulted in robust aversive memory (Figure 2C), as previously observed with another driver for this cell type that displayed weaker expression (see SS46348 in Figure 1—figure supplement 2; Otto et al., 2020). Both driver lines for the atypical MBON29 similarly induced aversive memory. These MBONs are both cholinergic, have dendrites in the γ4 and γ5 compartments, and synapse onto the dendrites of DANs, such as PAM-γ3 and PPL1-γ1pedc, that respond to punitive stimuli (Figure 2C, Figure 2—figure supplement 1; Li et al., 2020; Otto et al., 2020). In contrast, training flies with activation of MBON33 induced appetitive memory. MBON33 is also cholinergic but preferentially connects with octopaminergic neurons and reward-representing DANs. We noticed that confocal microscopy images of MBON33 visualized by split-GAL4 contain additional branches around the esophagus (Figure 1H), an area that lies outside the hemibrain EM volume. Since octopaminergic neurons arborize in this area, the connections between MBON33 and octopaminergic neurons might be more extensive than previously described based on the hemibrain data (Busch et al., 2009; Li et al., 2020).
To explore the kinematic parameters controlled by MBONs, we tracked the trajectories of individual flies during and after a 10 s optogenetic stimulus (Figure 2C). Although we used the same flies that went through the optogenetic olfactory conditioning, the activation phenotypes of positive control lines could still be observed and were not compromised by their prior activation (Figure 2—figure supplement 2). In these assays we observed some variability between lines for the same cell type, presumably due to differences in expression level or off-target expression. Nevertheless, the two lines for MBON21 showed similar patterns of kinematic variables: a low walking speed in the presence of the red activation light, a stimulus that caused elevated locomotion in genetic control flies, followed by orientation upwind when the optogenetic stimulus concluded (Figures 2C and 3A–D). Similar phenotypes were observed with a driver expressed in a combination of three glutamatergic MBONs, MBON01, MBON03, and MBON04 (MB011B; Figure 2C). Despite their common anatomical features and memory phenotypes, MBON21 and MBON29 modulated distinct motor parameters. Neither of the two lines for MBON29 changed walking speed or upwind orientation when activated, but both increased angular motion at the onset of activation, similar to the three lines for MBON26 (Figure 3E–G).
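As a rough illustration of how such kinematic variables can be derived from tracked trajectories, the sketch below computes walking speed, angular speed, and upwind alignment from per-fly centroid and heading time series. The input format, frame rate, and the convention that upwind is the +y direction are assumptions for illustration, not a description of the analysis code used in the study.

```python
# Sketch of kinematic measures from tracked fly trajectories (assumed input:
# x, y in mm and heading in radians, sampled at 30 Hz; upwind taken as +y).
# Illustrative only; not the analysis code used in the paper.
import numpy as np

FPS = 30.0  # assumed video frame rate

def kinematics(x, y, heading):
    """Return per-frame walking speed (mm/s), angular speed (rad/s), and
    upwind alignment (cosine of the angle between heading and upwind)."""
    speed = np.hypot(np.diff(x), np.diff(y)) * FPS
    dtheta = np.angle(np.exp(1j * np.diff(heading)))   # wrap turns to (-pi, pi]
    angular_speed = np.abs(dtheta) * FPS
    upwind = np.pi / 2                                  # +y direction, by assumption
    upwind_alignment = np.cos(heading - upwind)         # 1 = facing upwind
    return speed, angular_speed, upwind_alignment

def stimulus_delta(trace, stim_on, stim_off):
    """Mean of a kinematic trace during the stimulus window minus the
    pre-stimulus baseline (frame indices; e.g. a 10 s window at 30 Hz = 300 frames)."""
    return np.nanmean(trace[stim_on:stim_off]) - np.nanmean(trace[:stim_on])
```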
Finally, we asked if the MBON21, MBON29, and MBON33 lines that were able to serve as the unconditioned stimulus in memory formation also drove corresponding avoidance or attraction. Previous studies and the results shown in Figure 2C indicated that these are not always shared phenotypes; for example, the set of glutamatergic MBONs in MB011B whose dendrites lie in the γ5 and β′2 compartments can drive downwind locomotion and avoidance behaviors (Aso et al., 2023; Aso et al., 2014b; Matheson et al., 2022) but do not induce aversive memory. We tested if flies expressing CsChrimson in each of these three MBON cell types prefer quadrants of the arena with red activating light during the first and second 30 s test periods (Figure 3H1). When CsChrimson was expressed in MBON21 or MBON29, flies avoided the illuminated quadrants of the arena. Conversely, CsChrimson activation in a line for MBON33 promoted attraction to the illuminated quadrants, although this effect was observed only during the first test period. Thus, in the case of these three MBON cell types, memory formation and avoidance/attraction behaviors are correlated.
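The attraction/avoidance measurement in the quadrant assay reduces to a simple occupancy index. Below is a minimal sketch of such a preference index; the sign convention (positive = attraction to the illuminated quadrants) and the input format (per-frame fly counts) are assumptions for illustration.

```python
# Sketch of a quadrant preference index: positive values indicate attraction to
# the two illuminated quadrants, negative values avoidance. The input format
# (per-frame counts of flies in lit vs. dark quadrants) is an assumption.
import numpy as np

def preference_index(n_in_lit, n_in_dark):
    """(flies in lit quadrants - flies in dark quadrants) / total flies,
    averaged over the frames of a 30 s test period."""
    lit = np.asarray(n_in_lit, dtype=float)
    dark = np.asarray(n_in_dark, dtype=float)
    return float(np.nanmean((lit - dark) / (lit + dark)))

# Example: if ~18 of 20 flies sit in the dark quadrants for most frames, the
# index is strongly negative, i.e. avoidance of the activating light.
```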
Concluding remarks
We generated and anatomically characterized an improved set of genetic driver lines for MBONs and provide the first driver lines for atypical MBONs. We expect these lines to be useful in a wide range of studies of fly behavior. We demonstrate the suitability of these lines for behavioral analyses by showing that multiple independent lines for the same cell type gave consistent results when assayed for their ability to serve as the unconditioned stimulus in memory formation and to drive attraction or avoidance. MBON21, MBON29, and MBON33, characterized in this study, have distinct features compared to well-studied MBONs that innervate the same compartments. MBON21 and MBON29 form cholinergic connections to the dendrites of DANs known to respond to punishment, whereas other MBONs from the same compartments form glutamatergic connections with reward-representing DANs (Figure 4A).
While most of the sensory input to the MB is olfactory, the connectome revealed that two specific classes of KCs receive predominantly visual input. MBON19 provides the major output from one of these classes, KC α/βp, and about half of MBON19’s sensory input is estimated to be visual. MBON33 provides a major output from the other class of visual KCs, KC γd, with more than half of the sensory input to its dendrites in the MB estimated to be visual. The cell-type-specific driver lines we provide for MBON19 and MBON33 should facilitate studies of the behavioral roles of the two streams of visual information that pass through the MB.
The MB and the CX are thought to play key roles in the most complex cognitive processes the fly can perform, including learning, memory, and spatial navigation. Two of the MBONs for which we generated cell-type-specific driver lines, MBON21 and MBON30, provide the strongest direct inputs to the CX from the MB (Figure 4B); MBON30 in turn receives over 3% of its input (450 synapses) from the CX cell type FR1 (also known as FB2-5RUB). The genetic reagents we report here should advance studies of reinforcement signals in parallel memory systems, the role of visual inputs to the MB, and information flow from the MB to the CX.
Materials and methods
Flies
Split-GAL4 lines were created as previously described (Dionne et al., 2018). Flies were reared on standard cornmeal-molasses food at 21–22°C and 50% humidity. For optogenetic activation experiments, flies were reared in the dark on standard food supplemented with all-trans-retinal (Sigma-Aldrich, St. Louis, MO, USA), 0.2 mM prior to eclosion and 0.4 mM after eclosion, unless otherwise specified. Female flies were sorted on cold plates and kept in retinal food vials for at least 1 day before being transferred to agar vials for 48–72 hr of starvation. Flies were 4–10 days old at the time of behavioral experiments. Images of the original confocal stacks for each of the lines are available at https://splitgal4.janelia.org.
The correspondence between the morphologies of EM skeletons and light microscopic images of GAL4 driver line expression patterns was used to assign GAL4 lines to particular cell types. This can be done with confidence when there are not multiple cell types with very similar morphology. However, in the case of MBON28 we were not able to make a definitive assignment because of the similarity in the morphologies of MBON16, MBON17, and MBON28.
Immunohistochemistry and imaging
Dissection and immunohistochemistry of fly brains were carried out as previously described (Aso et al., 2014a). Each split-GAL4 line was crossed to the same CsChrimson effector used for behavioral analysis. Full step-by-step protocols can be found at https://www.janelia.org/project-team/flylight/protocols. For single-cell labeling of neurons from selected split-GAL4 lines, we used the MultiColor FlpOut (MCFO) technique (Nern et al., 2015). Videos 1 and 2 were produced using VVD Viewer (https://github.com/JaneliaSciComp/VVDViewer; copy archived at JaneliaSciComp, 2024) to generate a video comparing light and EM data (hemibrain v1.2) for each cell type. Individual videos were then concatenated, and text was added using Adobe Premiere Pro.
Optogenetics and olfactory learning assays
Groups of approximately 20 female flies were trained and tested at 25°C and 50% relative humidity in the fully automated olfactory arena for optogenetics experiments, as previously described (Aso and Rubin, 2016; Pettersson, 1970; Vet et al., 1983). The 627 nm peak LED light was used at 22.7 µW/mm². The odors were diluted in paraffin oil (Sigma-Aldrich): pentyl acetate (1:10,000) and ethyl lactate (1:10,000). Videography was performed at 30 frames per second with an 850 nm LED backlight and an 820 nm longpass filter, and the videos were analyzed using FlyTracker (Wilson et al., 2013) and Fiji (Schindelin et al., 2012).
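For the memory tests, the readout is a quadrant-based performance index, analogous to the light-preference index sketched earlier but computed over odor quadrants. The sketch below is illustrative only; the sign convention (positive = avoidance of the odor paired with MBON activation) and input format are assumptions.

```python
# Sketch of a memory performance index for the quadrant-based olfactory test.
# Positive values indicate avoidance of the odor previously paired with MBON
# activation (aversive memory); negative values indicate appetitive memory.
# The input format (per-frame fly counts) is an assumption.
import numpy as np

def memory_performance_index(n_in_cs_plus, n_in_cs_minus):
    """(flies in CS- quadrants - flies in CS+ quadrants) / total flies,
    averaged over the frames of the test period."""
    plus = np.asarray(n_in_cs_plus, dtype=float)
    minus = np.asarray(n_in_cs_minus, dtype=float)
    return float(np.nanmean((minus - plus) / (minus + plus)))
```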
Statistics
Statistical comparisons were performed using the Kruskal-Wallis test followed by Dunn’s post-test for multiple comparisons (Prism; GraphPad Inc, La Jolla, CA, USA). The appropriate sample size for the olfactory learning experiments was estimated based on the standard deviation of the performance index in a previous study using the same assay (Aso and Rubin, 2016).
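The same nonparametric comparison can be run outside Prism; the sketch below uses SciPy and the scikit-posthocs package as an alternative (an assumption about tooling; the study used Prism), with hypothetical group labels and data.

```python
# Kruskal-Wallis test followed by Dunn's post-test, as an open-source
# alternative to the Prism workflow described above. Illustrative sketch;
# genotype names and performance-index values are hypothetical.
import numpy as np
import pandas as pd
from scipy.stats import kruskal
import scikit_posthocs as sp

rng = np.random.default_rng(0)
df = pd.DataFrame({
    "genotype": ["control"] * 8 + ["MBON_line_A"] * 8 + ["MBON_line_B"] * 8,
    "pi": np.concatenate([rng.normal(0.0, 0.1, 8),
                          rng.normal(-0.3, 0.1, 8),
                          rng.normal(-0.25, 0.1, 8)]),
})

groups = [g["pi"].values for _, g in df.groupby("genotype")]
H, p = kruskal(*groups)                                   # omnibus test
dunn = sp.posthoc_dunn(df, val_col="pi", group_col="genotype",
                       p_adjust="bonferroni")             # pairwise post-test
print(f"Kruskal-Wallis H = {H:.2f}, p = {p:.3g}")
print(dunn)
```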
Connectomics
Information on connection strengths is taken from neuprint.janelia.org (hemibrain v1.2.1).
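Readers who want to retrieve such connection strengths programmatically can use the neuprint-python client; the sketch below is one possible query, with a placeholder access token, and the exact column names and function signatures should be checked against the library documentation.

```python
# Sketch: querying hemibrain v1.2.1 connection strengths with neuprint-python
# (https://github.com/connectome-neuprint/neuprint-python). The token is a
# placeholder; verify function and column names against the library docs.
from neuprint import Client, NeuronCriteria as NC, fetch_simple_connections

client = Client("neuprint.janelia.org", dataset="hemibrain:v1.2.1",
                token="YOUR_NEUPRINT_TOKEN")  # placeholder token

# Synapse counts from the fan-shaped body cell type FR1 onto MBON30, the
# CX-to-MB connection discussed in the text (the 'weight' column holds the
# synapse count for each connected pair).
conns = fetch_simple_connections(NC(type="FR1"), NC(type="MBON30"))
print(conns.head())

# Strongest downstream partners of MBON21 and MBON30 (connections with at
# least 10 synapses), sorted by synapse count.
outputs = fetch_simple_connections(NC(type=["MBON21", "MBON30"]), None,
                                   min_weight=10)
print(outputs.sort_values("weight", ascending=False).head(20))
```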
Data availability
The confocal images of expression patterns are available online (http://www.janelia.org/split-gal4). Figure 2—source data 1 and Figure 3—source data 1 contain the numerical data used to generate the figures.
References
- A map of octopaminergic neurons in the Drosophila brain. The Journal of Comparative Neurology 513:643–667. https://doi.org/10.1002/cne.21966
- Mushroom body memoir: from maps to models. Nature Reviews Neuroscience 4:266–275. https://doi.org/10.1038/nrn1074
- Independent optical excitation of distinct neural populations. Nature Methods 11:338–346. https://doi.org/10.1038/nmeth.2836
- Neural correlates of water reward in thirsty Drosophila. Nature Neuroscience 17:1536–1542. https://doi.org/10.1038/nn.3827
- A neural circuit for wind-guided olfactory navigation. Nature Communications 13:4613. https://doi.org/10.1038/s41467-022-32247-7
- The mushroom body: from architecture to algorithm in a learning circuit. Annual Review of Neuroscience 43:465–484.
- An aphid sex attractant. Insect Systematics & Evolution 1:63–73. https://doi.org/10.1163/187631270X00357
- Fiji: an open-source platform for biological-image analysis. Nature Methods 9:676–682. https://doi.org/10.1038/nmeth.2019
- FluoRender: an application of 2D image space methods for 3D and 4D confocal microscopy data visualization in neurobiology research. IEEE Pacific Visualization Symposium, pp. 201–208. https://doi.org/10.1109/pacificvis.2012.6183592
- Tracking for quantifying social network of Drosophila melanogaster. In: Wilson R, Hancock E (Eds.), Computer Analysis of Images and Patterns, Lecture Notes in Computer Science. pp. 539–545. https://doi.org/10.1007/978-3-642-40246-3
- Suppression of dopamine neurons mediates reward. PLOS Biology 14:e1002586. https://doi.org/10.1371/journal.pbio.1002586
- NeuTu: software for collaborative, large-scale, segmentation-based connectome reconstruction. Frontiers in Neural Circuits 12:101. https://doi.org/10.3389/fncir.2018.00101
Article and author information
Funding
Howard Hughes Medical Institute
- Gerald M Rubin
- Yoshinori Aso
The funders had no role in study design, data collection and interpretation, or the decision to submit the work for publication.
Acknowledgements
We thank the Janelia Fly Facility for help with fly husbandry and the FlyLight Project Team for dissection, histological preparation, and imaging of nervous systems. Marisa Dreher (Dreher Design Studios, Inc) assembled the videos and helped with figure design. Claire Managan segmented the neuron morphologies shown in the videos. Masayoshi Ito helped identify lines to use in intersections. Daisuke Hattori, Glenn Turner, Vivek Jayaraman, Toshihide Hige, and Yichun Shuai provided comments and suggestions on the early version of the manuscript.
Copyright
© 2023, Rubin and Aso
This article is distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use and redistribution provided that the original author and source are credited.
Further reading
-
- Neuroscience
Relatively little is known about the way vision is used to guide locomotion in the natural world. What visual features are used to choose paths in natural complex terrain? To answer this question, we measured eye and body movements while participants walked in natural outdoor environments. We incorporated measurements of the three-dimensional (3D) terrain structure into our analyses and reconstructed the terrain along the walker’s path, applying photogrammetry techniques to the eye tracker’s scene camera videos. Combining these reconstructions with the walker’s body movements, we demonstrate that walkers take terrain structure into account when selecting paths through an environment. We find that they change direction to avoid taking steeper steps that involve large height changes, instead of choosing more circuitous, relatively flat paths. Our data suggest walkers plan the location of individual footholds and plan ahead to select flatter paths. These results provide evidence that locomotor behavior in natural environments is controlled by decision mechanisms that account for multiple factors, including sensory and motor information, costs, and path planning.
-
- Computational and Systems Biology
- Neuroscience
Perception can be highly dependent on stimulus context, but whether and how sensory areas encode the context remains uncertain. We used an ambiguous auditory stimulus – a tritone pair – to investigate the neural activity associated with a preceding contextual stimulus that strongly influenced the tritone pair’s perception: either as an ascending or a descending step in pitch. We recorded single-unit responses from a population of auditory cortical cells in awake ferrets listening to the tritone pairs preceded by the contextual stimulus. We find that the responses adapt locally to the contextual stimulus, consistent with human MEG recordings from the auditory cortex under the same conditions. Decoding the population responses demonstrates that cells responding to pitch-changes are able to predict well the context-sensitive percept of the tritone pairs. Conversely, decoding the individual pitch representations and taking their distance in the circular Shepard tone space predicts the opposite of the percept. The various percepts can be readily captured and explained by a neural model of cortical activity based on populations of adapting, pitch and pitch-direction cells, aligned with the neurophysiological responses. Together, these decoding and model results suggest that contextual influences on perception may well be already encoded at the level of the primary sensory cortices, reflecting basic neural response properties commonly found in these areas.