Identifying neural substrates of behavior requires defining actions in terms that map onto brain activity. Brain and muscle activity naturally correlate via the output of motor neurons, but apart from simple movements it has been difficult to define behavior in terms of muscle contractions. By mapping the musculature of the pupal fruit fly and comprehensively imaging muscle activation at single-cell resolution, here we describe a multiphasic behavioral sequence in Drosophila. Our characterization identifies a previously undescribed behavioral phase and permits extraction of major movements by a convolutional neural network. We deconstruct movements into a syllabary of co-active muscles and identify specific syllables that are sensitive to neuromodulatory manipulations. We find that muscle activity shows considerable variability, with sequential increases in stereotypy dependent upon neuromodulation. Our work provides a platform for studying whole-animal behavior, quantifying its variability across multiple spatiotemporal scales, and analyzing its neuromodulatory regulation at cellular resolution.
The source data for the figures and tables in this study are available at figshare (https://figshare.com/collections/Pupal_behavior_emerges_from_unstructured_muscle_activity_in_response_to_neuromodulation_in_Drosophila/5489637) and computer code is posted to https://github.com/BenjaminHWhite.
Pupal behavior emerges from unstructured muscle activity in response to neuromodulation in Drosophila. doi.org/10.6084/m9.figshare.c.5489637
- Amicia D Elliott
- Benjamin H White
The funders had no role in study design, data collection and interpretation, or the decision to submit the work for publication.
- Chris Q Doe, Howard Hughes Medical Institute, University of Oregon, United States
This is an open-access article, free of all copyright, and may be freely reproduced, distributed, transmitted, modified, built upon, or otherwise used by anyone for any lawful purpose. The work is made available under the Creative Commons CC0 public domain dedication.
Neural activity in the mammalian cortex has been studied extensively during decision tasks, and recent work aims to identify under what conditions cortex is actually necessary for these tasks. We discovered that mice with distinct cognitive experiences, beyond sensory and motor learning, use different cortical areas and neural activity patterns to solve the same navigation decision task, revealing past learning as a critical determinant of whether cortex is necessary for goal-directed navigation. We used optogenetics and calcium imaging to study the necessity and neural activity of multiple cortical areas in mice with different training histories. Posterior parietal cortex and retrosplenial cortex were mostly dispensable for accurate performance of a simple navigation task. In contrast, these areas were essential for the same simple task when mice were previously trained on complex tasks with delay periods or association switches. Multiarea calcium imaging showed that, in mice with complex-task experience, single-neuron activity had higher selectivity and neuron–neuron correlations were weaker, leading to codes with higher task information. Therefore, past experience is a key factor in determining whether cortical areas have a causal role in goal-directed navigation.
During flight maneuvers, insects exhibit compensatory head movements that are essential for stabilizing the visual field on their retina, reducing motion blur, and supporting visual self-motion estimation. In Diptera, such head movements are mediated via visual feedback from their compound eyes, which detect retinal slip, as well as rapid mechanosensory feedback from their halteres – the modified hindwings that sense the angular rates of body rotations. Because non-Dipteran insects lack halteres, it is not known whether mechanosensory feedback about body rotations plays any role in their head stabilization response. Diverse non-Dipteran insects are known to rely on visual and antennal mechanosensory feedback for flight control. In hawkmoths, for instance, reduction of antennal mechanosensory feedback severely compromises their ability to control flight. Similarly, when the head movements of freely flying moths are restricted, their flight ability is also severely impaired. The role of compensatory head movements, as well as multimodal feedback, in insect flight raises an interesting question: in insects that lack halteres, what sensory cues are required for head stabilization? Here, we show that in the nocturnal hawkmoth Daphnis nerii, compensatory head movements are mediated by combined visual and antennal mechanosensory feedback. We subjected tethered moths to open-loop body roll rotations under different lighting conditions and measured their ability to maintain head angle in the presence or absence of antennal mechanosensory feedback. Our study suggests that head stabilization in moths is mediated primarily by visual feedback during roll movements at lower frequencies, whereas antennal mechanosensory feedback is required when roll occurs at higher frequencies. These findings are consistent with the hypothesis that control of head angle results from a multimodal feedback loop that integrates both visual and antennal mechanosensory feedback, albeit at different latencies.
At adequate light levels, visual feedback is sufficient for head stabilization, primarily at low frequencies of body roll. Under dark conditions, however, antennal mechanosensory feedback is essential for the control of head movements at high frequencies of body roll.