Circuits for integrating learned and innate valences in the insect brain
Abstract
Animal behavior is shaped both by evolution and by individual experience. Parallel brain pathways encode the innate and learned valences of cues, but how these valences are integrated during action selection is not well understood. We used electron microscopy to comprehensively map, with synaptic resolution, all neurons downstream of all Mushroom Body (MB) output neurons (encoding learned valences) and characterized their patterns of interaction with Lateral Horn (LH) neurons (encoding innate valences) in the Drosophila larva. The connectome revealed multiple convergence neuron types that receive convergent MB and LH inputs. A subset of these receives excitatory input from positive-valence MB and LH pathways and inhibitory input from negative-valence MB pathways. We confirmed functional connectivity from LH and MB pathways onto two of these neurons and demonstrated their behavioral roles. These neurons encode integrated odor value and bidirectionally regulate turning. Based on this, we speculate that learning could skew the balance of excitation and inhibition onto these neurons and thereby modulate turning. Together, our study provides insights into the circuits that integrate learned and innate valences to modify behavior.
Data availability
All data generated or analysed during this study are included in the manuscript and supporting files. Source data files are provided for Figures 2 to 6.
Article and author information
Author details
Funding
Howard Hughes Medical Institute
- Claire Eschbach
- Akira Fushiki
- Michael Winding
- Bruno Afonso
- Ingrid V Andrade
- Benjamin T Cocanougher
- Javier Valdes-Aleman
- James W Truman
- Albert Cardona
- Marta Zlatic
European Research Council (RG95162)
- Claire Eschbach
- Michael Winding
- Marta Zlatic
Wellcome Trust (RG86459)
- Marta Zlatic
The funders had no role in study design, data collection and interpretation, or the decision to submit the work for publication.
Copyright
© 2021, Eschbach et al.
This article is distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use and redistribution provided that the original author and source are credited.
Metrics
- 3,518 views
- 548 downloads
- 33 citations
Views, downloads and citations are aggregated across all versions of this paper published by eLife.