Nested mechanosensory feedback actively damps visually guided head movements in Drosophila
Abstract
Executing agile locomotion requires animals to integrate sensory feedback, often from multiple sources. For example, human gaze stabilization is mediated by multiple feedback loops that integrate visual and vestibular information. A central challenge in studying biological feedback loops is that they are nested and dynamically coupled. Here, we develop a framework based on control theory for unraveling nested feedback systems and apply it to study gaze stabilization in the fruit fly (Drosophila). By combining experimental and mathematical methods to manipulate control topologies, we uncovered the role of body-generated mechanosensory feedback, nested within visual feedback, in the control of head movements. We discovered that visual feedback changed the tuning of head movements across visual motion frequencies, whereas mechanosensory feedback damped head movements. Head saccades had slower dynamics when the body was free to move, further pointing to the role of damping via mechanosensory feedback. By comparing head responses between self-generated and externally generated body motion, we revealed a nonlinear gating of mechanosensory feedback that depends on motor context. Altogether, our findings reveal the role of nested feedback loops in flies and uncover mechanisms that reconcile differences in head kinematics between body-free and body-fixed flies. Our framework is generalizable to biological and robotic systems that rely on nested feedback control to guide locomotion.
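To make the nested-loop architecture concrete, the sketch below simulates a toy second-order head plant with an outer visual loop acting on retinal slip (the difference between stimulus and head angle) and an inner mechanosensory loop feeding back head angular velocity. This is a minimal illustration under assumed dynamics and gains (I, C, K_v, and K_m are all hypothetical), not the authors' fitted model; it only demonstrates how an inner velocity-feedback loop damps the head response near resonance.

```python
# Minimal sketch, not the authors' model: an outer visual loop (gain K_v,
# acting on retinal slip r - theta) nested around an inner mechanosensory
# loop (gain K_m, feeding back head angular velocity). All parameters are
# illustrative assumptions chosen to expose the damping effect.
import numpy as np

def simulate_head(K_v, K_m, I=1.0, C=1.0, f=0.7, dt=1e-3, T=10.0):
    """Euler-integrate I*theta'' + C*theta' = K_v*(r - theta) - K_m*theta'."""
    n = int(T / dt)
    t = np.arange(n) * dt
    r = np.sin(2 * np.pi * f * t)                   # visual motion stimulus
    theta = np.zeros(n)                             # head angle
    omega = 0.0                                     # head angular velocity
    for k in range(n - 1):
        u = K_v * (r[k] - theta[k]) - K_m * omega   # nested control signal
        omega += dt * (u - C * omega) / I           # head plant dynamics
        theta[k + 1] = theta[k] + dt * omega
    return t, r, theta

# Removing the inner loop (K_m = 0) leaves a lightly damped, resonant head
# response; enabling it (K_m > 0) damps the response near resonance.
for K_m in (0.0, 5.0):
    t, r, theta = simulate_head(K_v=20.0, K_m=K_m)
    steady = t > 5.0                                # discard the transient
    gain = np.ptp(theta[steady]) / np.ptp(r[steady])
    print(f"K_m = {K_m}: head/stimulus gain = {gain:.2f}")
```

With these assumed parameters, the stimulus sits near the plant's resonant frequency, so the head/stimulus gain drops severalfold once the inner velocity-feedback loop is enabled: the qualitative signature of mechanosensory damping described in the abstract.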
Data availability
All code and data are available on Penn State ScholarSphere at this link: https://scholarsphere.psu.edu/resources/7af9b459-4be2-4347-bcb4-6db34cb9cc7e
Article and author information
Funding
Air Force Office of Scientific Research (FA9550-20-1-0084)
- Jean-Michel Mongeau
Alfred P. Sloan Foundation
- Jean-Michel Mongeau
The funders had no role in study design, data collection and interpretation, or the decision to submit the work for publication.
Copyright
© 2022, Cellini & Mongeau
This article is distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use and redistribution provided that the original author and source are credited.