Nested mechanosensory feedback actively damps visually guided head movements in Drosophila
Abstract
Executing agile locomotion requires animals to integrate sensory feedback, often from multiple sources. For example, human gaze is mediated by multiple feedback loops that integrate visual and vestibular information. A central challenge in studying biological feedback loops is that they are nested and dynamically coupled. Here, we develop a framework based on control theory for unraveling nested feedback systems and apply it to study gaze stabilization in the fruit fly (Drosophila). By combining experimental and mathematical methods to manipulate control topologies, we uncovered the role of body-generated mechanosensory feedback nested within visual feedback in the control of head movements. We discovered that visual feedback changed the tuning of head movements across visual motion frequencies whereas mechanosensory feedback damped head movements. Head saccades had slower dynamics when the body was free to move, further pointing to the role of damping via mechanosensory feedback. By comparing head responses between self-generated and externally generated body motion, we revealed a nonlinear gating of mechanosensory feedback that is motor-context dependent. Altogether, our findings reveal the role of nested feedback loops in flies and uncover mechanisms that reconcile differences in head kinematics between body-free and body-fixed flies. Our framework is generalizable to biological and robotic systems relying on nested feedback control for guiding locomotion.
Editor's evaluation
The manuscript makes an important contribution to feedback control in neural systems. The analysis and modeling together make a compelling case for a nested system, combining visual with mechanosensory feedback, for head and body control in the fruit fly. The experiments that support these results are compelling and well-executed and the strategies for dissecting and modeling feedback are valuable to the field, and broadly applicable to other neural control systems. This paper will reach a wide audience; researchers investigating biological control systems, visual feedback, and gaze stabilization will all be interested in these results.
https://doi.org/10.7554/eLife.80880.sa0
Introduction
Animal locomotion and the associated neural computations are closed-loop (Cowan et al., 2014; Roth et al., 2014; Madhav and Cowan, 2020). During locomotion, sensory systems measure external and internal states. This sensory information is then processed by the brain to guide motor decisions and the resulting movement shapes sensory inputs, thus closing the loop. Visually active animals often integrate visual and mechanosensory information to guide movement through complex environments (Mongeau et al., 2021; Frye, 2010). For instance, hawk moths integrate information from visual and mechanosensory pathways when feeding from flowers moving in the wind (Roth et al., 2016), glass knifefish rely on a combination of visual and electrosensory feedback to regulate their position within a moving refuge (Sutton et al., 2016), and flies stabilize vision via antennal feedback (Fuller et al., 2014b). For many of these behaviors, the sensors—e.g. eyes and proboscis in hawk moths—measure the same information (e.g. flower motion) in parallel. This is fundamentally a parallel sensory fusion problem where the animal must weight information from parallel pathways (Figure 1A). The Kalman filter has been applied to biological and robotic systems to solve similar sensory fusion problems and determine the optimal weight for each sensor (Sun and Deng, 2004; Ernst and Banks, 2002).
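As a concrete, deliberately simplified illustration of this weighting, the sketch below fuses two noisy estimates of the same state by inverse-variance weighting, the static special case of a Kalman update. The sensor readings and noise variances are hypothetical, not values from any of the cited studies.

```python
# A minimal sketch of parallel sensory fusion: inverse-variance (Kalman-style)
# weighting of two noisy sensors measuring the same quantity. Illustrative only;
# all numbers are hypothetical.
import numpy as np

def fuse(z1, var1, z2, var2):
    """Fuse two estimates of the same state, weighting each by its reliability."""
    w1 = (1.0 / var1) / (1.0 / var1 + 1.0 / var2)
    w2 = 1.0 - w1
    estimate = w1 * z1 + w2 * z2
    fused_var = 1.0 / (1.0 / var1 + 1.0 / var2)  # always <= min(var1, var2)
    return estimate, fused_var

# Example: a low-noise visual reading and a noisier mechanosensory reading
print(fuse(z1=1.8, var1=0.5, z2=2.4, var2=2.0))  # weighted toward the visual reading
```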

Parallel and nested sensory fusion in biological systems.
(A) Control model of parallel sensory fusion. Multiple sensory systems, S1 and S2, measure an external reference state with respect to the system’s motion. The information measured by S1 and S2 is fused together in parallel by a neural controller to maintain equilibrium. The neural controller drives locomotion through the system’s biomechanics, which feeds back to shape future sensory inputs, thus closing the loop. (B) Control model of nested sensory fusion. Same as (A), but one of the sensory systems (S2) does not directly measure the external reference state. Instead, the system’s own state is fed directly to the neural controller (purple). Thus S2 is not involved in measuring external sensory states.
In contrast to parallel sensory fusion—where sensory information operates at the same level in the control hierarchy—sensory feedback is often nested within higher levels of control (Figure 1B, Hardcastle and Krapp, 2016; Mongeau et al., 2021). Consider the goal-directed task of visually navigating through a complex environment. Vision provides slower and higher level information for guidance whereas mechanosensory inputs due to self-motion—measured by the vestibular and somatosensory systems—influence rapid, lower level postural reflexes (Bent et al., 2004; Goldberg et al., 2012; Nakahira et al., 2021). In this context, mechanosensory feedback is nested within visual feedback because it is activated by visually guided locomotion, and thus it does not directly relate to the task goal (Mongeau et al., 2021). This sensorimotor organization is analogous to cascade control in engineering controller design, where there are inner feedback loops nested within outer loops (Krishnaswamy et al., 1990). While many prior studies have investigated how animals integrate sensory information from multiple pathways, the case where one sensory pathway is nested within another has received significantly less attention. How does the brain integrate nested sensory feedback for effective locomotion?
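For intuition, with a linear plant and linear controllers the two topologies in Figure 1 yield different closed-loop responses. The expressions below are a generic sketch; the symbols are chosen for illustration and are not the notation used later in this paper:

$$y_{\text{parallel}} = \frac{P\,(C_1 + C_2)}{1 + P\,(C_1 + C_2)}\,r, \qquad y_{\text{nested}} = \frac{P\,C_{o}}{1 + P\,C_{i} + P\,C_{o}}\,r,$$

where in the parallel case both controllers act on the task error $r - y$, whereas in the nested case only the outer controller $C_o$ acts on the task error and the inner controller $C_i$ acts on the system's own state $y$.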
One exemplar sensorimotor system that includes nested mechanosensory feedback is the gaze stabilization reflex. Primates move their eyes and head in response to visual motion to stabilize gaze, termed the optokinetic response (OKR) (Land, 2019). Eye and head movements feed back to shape visual inputs, and measurements of head motion from the vestibular system feed back to keep the eyes steady with respect to the head—termed the vestibulo-ocular reflex (VOR) (Goldberg et al., 2012). Although both the OKR and the VOR are reflexive stabilization feedback loops, the VOR is nested within the OKR. Prior work showed that the OKR and VOR are inversely tuned: the OKR responds strongly to low frequencies whereas the VOR is tuned to higher frequencies (Schweigart et al., 1997; Barnes, 1993). However, the specific contributions of visual and nested mechanosensory feedback when the OKR and VOR are active together remain unclear. Intriguingly, the gaze stabilization response of flies shows close parallels to the primate visuomotor response, with similar feedback topology, making it an accessible model system to unravel the mechanisms underlying nested feedback control (Cellini et al., 2022; Elzinga et al., 2012).
Here, we studied nested sensorimotor feedback loops in fruit flies (Drosophila), with a specific focus on teasing apart the contributions of visual and nested mechanosensory feedback during gaze stabilization. The gaze stabilization reflex consists of multiple motor systems—the head and body—that operate in closed-loop with the goal of reducing optic flow across the retina and keeping gaze level (Figure 2A, Cellini et al., 2022; Cellini and Mongeau, 2020a). The halteres—gyroscope-like organs that encode body velocity by sensing gyroscopic forces and structure the timing of motor outputs in flies (Fraenkel, 1939; Nalbach and Hengstenberg, 1994; Dickerson et al., 2019)—also influence the control of head and body movements about all three rotational axes (Hardcastle and Krapp, 2016; Mureli and Fox, 2015; Mureli et al., 2017; Nalbach, 1993; Rauscher and Fox, 2021). When visual inputs activate the gaze stabilization reflex and drive a compensatory response of the head and body, the halteres presumably sense the resulting body velocity and provide mechanosensory information that further influences a fly’s visuomotor behavior. Thus it would follow that mechanosensory feedback is inherently nested within visual feedback. Studying gaze stabilization in flies can therefore provide insights into how nested feedback loops interconnect and shape higher level loops in animal locomotion. Established experimental paradigms for studying fly flight provide a unique opportunity to manipulate control topologies, allowing us to break feedback loops and tease out the role of visual and nested mechanosensory feedback. In contrast to prior work that studied the parallel integration of visual and haltere information in open-loop—where flies had their head and body fixed in place (Dickinson, 1999; Sherman and Dickinson, 2003; Sherman and Dickinson, 2004)—here we employ an experimental paradigm that allowed flies to freely move their head and body in closed-loop (Figure 2B). In conjunction with empirical data, we synthesized a control model for mathematically teasing apart the role of nested sensory feedback. We applied this model to study how body-generated visual and nested mechanosensory feedback are integrated during the control of head movements. Our results provide new insights into how nested sensory feedback may be structured across phyla for gaze stabilization.

Control model of visual and nested mechanosensory feedback during gaze stabilization in fly flight.
(A) The control framework used to model and analyze the gaze stabilization system in body-free flies. Flies respond to an external visual perturbation by attempting to minimize the sensory visual error measured by their visual system. Neural control circuits in the brain for the head and body process the sensory error and send motor control signals to the corresponding head and body biomechanical systems to generate head and body movements. The fly’s gaze is controlled by the sum of head and body movements, which feeds back to shape the sensory error entering the visual system. Flies also measure mechanosensory information associated with body motion via the halteres, which is processed in the brain by analogous controllers for the head and body and also contributes to shaping head and body responses. In this paradigm, mechanosensory feedback is nested within visual feedback. (B) The magnetic tether experimental paradigm for body-free flies corresponding to (A). A fly is tethered to a pin which is placed in a low-friction bearing and suspended in a magnetic field, allowing free rotation about the vertical (yaw) axis. (C) Same as (A) but for a body-fixed fly. Note that contributions of body visual feedback and nested mechanosensory feedback due to body motion are no longer present. The fly’s gaze is now purely determined by head movements. (D) The rigid tether experimental paradigm for body-fixed flies corresponding to (C).
Results
Control model of visual and nested mechanosensory feedback during gaze stabilization in fly flight
During flight, flies are often knocked off course by gusts of wind or other external perturbations to their flight paths. Such perturbations generate optic flow relative to a fly’s own motion, or sensory error, across the retina that is processed by the brain to generate corrective steering maneuvers of the head and/or body, with the goal of minimizing this error (Figure 2A, Cellini et al., 2022; Cellini and Mongeau, 2020a; Dickinson and Muijres, 2016). Mechanosensory information from externally generated and/or self-generated body motion, measured by the halteres, also elicits corrective movements of the head and wings/body (Figure 2A, Hardcastle and Krapp, 2016; Dickinson, 1999; Sherman and Dickinson, 2003; Hengstenberg, 1988; Sandeman, 1980; Beatus et al., 2015). This suite of multisensory reflexes keeps gaze level and appears essential for flight.
We developed a control model of gaze stabilization about the vertical (yaw) axis to describe the flow of visual and mechanosensory information in driving head and body motor responses in flies (Figure 2A). We based our framework on prior models in flies that demonstrated that inputs from the visual system and halteres sum in the nervous system (Sherman and Dickinson, 2004), but went further by including naturalistic closed-loop feedback and mechanics. For the head and body, we modeled the distinct contributions of sensory feedback by separating the neural control of gaze stabilization into two sub-components, one for visual feedback and the other for mechanosensory feedback. Critically, we assumed that visual feedback has a gain of –1—because motion in one direction generates equal and opposite optic flow—and that the mechanosensory neural controllers only receive haltere inputs due to body motion. The other functions of the halteres related to structuring the timing of motor output are assumed to be contained within the dynamics of the visual controllers (Dickerson et al., 2019). The separate neural controller pairs for the head and body in our model ensured that any differences in tuning between the head and body were considered. We assumed approximately linear time-invariant (LTI) dynamics (Aström et al., 2010), which is supported by experimental data from prior work (Cellini and Mongeau, 2020a; Cellini et al., 2022). Finally, our model only considered mechanosensory feedback arising from self-generated body motion, asserting that mechanosensory feedback is nested within visual feedback (Figure 2A).
We first modeled the head response by defining the transforms mapping visual and mechanosensory inputs to head motor responses (Equation 1 and Equation 2).
Equation 1 represents the visual transform from the sensory error to the head response, and Equation 2 represents the mechanosensory transform from body motion to the head response. These transforms consist of the multiplication of the corresponding neural circuits associated with the visual and mechanosensory control centers of the fly brain and the passive biomechanics of the head-neck system. We assume that the dynamics of the sensory systems—visual system and halteres—are contained within the dynamics of the visual and mechanosensory controllers, respectively. All the transforms and signals in our model are designated as non-parametric complex-valued functions (see Materials and methods). Throughout, we omit the complex frequency variable for brevity.
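Written out explicitly, using notation chosen here for illustration (sensory error $e$, head response $h$, body response $b$, visual perturbation $r$, head visual controller $C_{h,\mathrm{vis}}$, head mechanosensory controller $C_{h,\mathrm{mech}}$, and head-neck biomechanics $P_h$; the published symbols may differ), one consistent reading of Equation 1 and Equation 2 is:

$$H_{\mathrm{vis}} = C_{h,\mathrm{vis}}\,P_h \quad \text{(Equation 1)}, \qquad H_{\mathrm{mech}} = C_{h,\mathrm{mech}}\,P_h \quad \text{(Equation 2)},$$

where $H_{\mathrm{vis}}$ maps $e$ to $h$ and $H_{\mathrm{mech}}$ maps $b$ to $h$.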
Using the transforms in Equation 1 and Equation 2, we derived an expression for the closed-loop head motor response as a function of an external visual perturbation and body motion (see Materials and methods for more detailed derivations). We first defined the head response as the sum of visual inputs and mechanosensory inputs due to body motion (Equation 3),
where the sensory error is the visual perturbation minus the fly’s gaze, that is, the sum of head and body movements (Equation 4).
Substituting Equation 4 into Equation 3 and solving for the head response yields the expression for the closed-loop head response (Equation 5).
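Under the illustrative notation introduced above, Equations 3–5 can be reconstructed as follows (a sketch consistent with the derivation described in the text):

$$h = H_{\mathrm{vis}}\,e + H_{\mathrm{mech}}\,b \quad \text{(Equation 3)},$$
$$e = r - (h + b) \quad \text{(Equation 4)},$$
$$h = \frac{H_{\mathrm{vis}}}{1 + H_{\mathrm{vis}}}\,r + \frac{H_{\mathrm{mech}} - H_{\mathrm{vis}}}{1 + H_{\mathrm{vis}}}\,b \quad \text{(Equation 5)}.$$

The first term of Equation 5 is the head response driven by the visual perturbation through head visual feedback; the second term collects the two body-dependent pathways, visual (the $-H_{\mathrm{vis}}$ contribution) and mechanosensory (the $H_{\mathrm{mech}}$ contribution).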
Notably, the closed-loop head response of body-free flies (Equation 5) is mediated by three sources of sensory feedback: (1) visual feedback from head movements themselves, (2) visual feedback from body motion, and (3) nested mechanosensory feedback from body motion. Conversely, the head response of body-fixed flies can be represented by Equation 6,
in which all terms associated with body motion in Equation 5 are set to zero, leaving only visual feedback from head movements (Figure 2C). We recognize that haltere neural inputs are always present—even when there is no body motion—and cannot be completely abolished without removing the halteres. However, these inputs are involved with structuring the timing of motor output, not with encoding body velocity via sensing gyroscopic forces (Dickerson et al., 2019; Fayyazuddin and Dickinson, 1996; Fayyazuddin and Dickinson, 1999). Thus, we lump this tonic function of the halteres into the visual controllers, which are present in both body-free and body-fixed flies. Crucially, our control model mathematically predicts that the head motor responses of body-free (Equation 5) and body-fixed (Equation 6) flies will be distinct due to differences in sensory feedback. Therefore, comparing how the head responses of body-free and body-fixed flies differ provides insights into the distinct sensory modalities that influence head control.
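In the same illustrative notation, setting body motion to zero ($b = 0$) in the Equation 5 sketch gives one consistent reconstruction of Equation 6:

$$h = \frac{H_{\mathrm{vis}}}{1 + H_{\mathrm{vis}}}\,r \quad \text{(Equation 6)}.$$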
Sensory feedback generated from body movements alters the magnitude, timing, and performance of head responses
Our control model predicted that body-free and body-fixed flies should have distinct head motor responses to the same visual perturbation due to the absence of body-generated visual and mechanosensory feedback. To determine whether empirical data supported this prediction, we employed two experimental paradigms: (1) a magnetic tether where flies were tethered to a pin and suspended between two magnets, allowing free, closed-loop body rotation about the vertical (yaw) axis (Figure 2B) and (2) a rigid tether where flies were fixed in place, thus opening body-generated visual and mechanosensory feedback loops (Figure 2D).
We presented flies in both paradigms with visual perturbations consisting of single sinusoids with frequencies spanning 0.7–10.6 Hz to reveal how body-generated feedback influences head motor responses (see Materials and methods). We began by quantifying the body response of body-free flies to determine the frequency range over which body-generated feedback should have the greatest impact on head responses. The body was strongly tuned to low frequencies, similar to a low-pass filter (Figure 3A, red, Figure 3—video 1–Figure 3—video 2, Cellini et al., 2022). Thus, body visual feedback reduced the optic flow—or sensory error—entering the visual system, especially at low frequencies (Figure 3A–B, yellow, Figure 4A, red). This result, combined with our control model, predicted that any differences between head responses in body-free and body-fixed flies should be greatest at low frequencies. Because biological systems often exhibit nonlinear behavior in response to different types of sensory inputs, we also measured fly responses to sum-of-sines visual perturbations as a check for linearity. Although fly responses varied slightly between single-sine and sum-of-sines perturbations, the overall behavior was similar, indicating that our findings are generalizable (Figure 3—figure supplement 1A-B, Figure 3—video 4).

Sensory feedback generated from body movements alters the magnitude, timing, and performance of head responses.
(A) The body response (red) of body-free flies to single-sine visual perturbations (grey) with varying frequency. The x-axis is normalized to show four oscillations at each frequency. Note that the body response is larger relative to the visual perturbation at low frequencies, leading to a smaller sensory error signal (yellow) in the head reference frame. Thick lines: mean. Thin lines: individual fly means. (B) The distribution of compensation errors in the head reference frame corresponding to the sensory error in (A) normalized by the perturbation amplitude. Values below one indicate that body movements reduced the sensory error while values greater than one indicate that body movements increased the sensory error. (C) The head response of body-free (blue) and body-fixed (violet) flies to the same visual perturbation (grey) shown in (A). At low frequencies, the head would often run into the anatomical limits of the neck joint (dashed pink lines). Thick lines: mean response. Thin lines: individual fly means. (D) The total distribution of head angular displacements for body-free and body-fixed flies for each perturbation frequency. For each frequency, the body-free and body-fixed head distributions had a different variance (F-test). Body-free: flies, Body-fixed: flies.

Visual and mechanosensory feedback mediate head control.
The legend indicates whether the data correspond to the head or body, the sources of sensory feedback, and the relevant experiment (or prediction). (A) The closed-loop transform from the visual perturbation to the body response. Note that the body is primarily tuned to low frequencies. (B) The closed-loop transform from the visual perturbation to the head response for different sensory feedback conditions. The head transform measured in body-free flies (blue) contains all three sources of feedback (see Equation 3), while the head transform measured in body-fixed flies (purple) contains only head visual feedback (see Equation 6). (C) The predicted (dashed line) transform for the head response with head and body visual feedback (copper, see Equation 7, corresponding to Figure 4—figure supplement 2A) and the experimentally measured equivalent from a ‘replay’ experiment (grey, corresponding to Figure 4—figure supplement 2C-E). The highest two frequencies were omitted in the replay experiment due to limitations of our flight arena display system (see Materials and methods). (D) The predicted (dashed line) transform for the head response with head visual feedback and body mechanosensory feedback (cyan, see Equation 8, corresponding to Figure 4—figure supplement 2B). Body-free: flies, Body-fixed: flies, Body-fixed replay: flies. For all panels, shaded regions: ±1 STD. Also see Figure 4—figure supplement 3 for all plots overlaid to facilitate comparison across groups.
Next, we quantified the head responses of body-free and body-fixed flies. Consistent with prior work, body-free flies generated head movements that were inversely tuned to the body and resembled a high-pass filter, where the head operated with the largest gains at high frequencies (Figure 3C, blue; Cellini et al., 2022). Consistent with the prediction of our model (Equation 6), the head response of body-fixed flies was appreciably different from that of body-free flies (Figure 3C, Figure 3—figure supplement 1A-B, Figure 3—video 1–Figure 3—video 2, Cellini and Mongeau, 2020a). Specifically, body-fixed flies moved their head with larger magnitude than body-free flies (Figure 3C). Interestingly, head movements in body-fixed flies were often driven to the anatomical limits of the neck joint (approximately ±15°), which was never the case for body-free flies (Figure 3C). This led to head trajectories that were saturated, meaning that the head might have moved with even larger amplitude had it been anatomically possible. The total distributions of head angular displacements were likewise significantly different between body-free and body-fixed flies (F-test at every frequency) (Figure 3D, Figure 3—figure supplement 1C). Body-free flies rarely moved their head more than 5° from the neutral position (0°), whereas body-fixed flies regularly moved their head in excess of 10° (Figure 3D, Figure 3—figure supplement 1C). This difference was especially prominent at lower frequencies, while head responses at higher frequencies were closer in magnitude (Figure 3C, Figure 3—figure supplement 1A, B). This is consistent with the low-frequency tuning of body movements, where the smaller sensory error in body-free flies at low frequencies likely led to the smaller head motor responses. Body visual feedback also altered the phase, or the timing, of the sensory error signal entering the visual system, leading to differences in the timing of head motor responses in body-free and body-fixed flies (Figure 3A, C).
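The variance comparison reported above can be implemented as a standard two-sample F-test on the head-angle distributions. The sketch below is illustrative only; the data, sample sizes, and random seed are hypothetical, not the recorded measurements.

```python
# Illustrative two-sample F-test for equality of variances between head-angle
# distributions; not the authors' analysis code.
import numpy as np
from scipy import stats

def f_test_variances(x, y):
    """Return the F statistic and two-sided p-value for var(x) == var(y)."""
    x, y = np.asarray(x), np.asarray(y)
    f = np.var(x, ddof=1) / np.var(y, ddof=1)
    dfx, dfy = len(x) - 1, len(y) - 1
    p = 2 * min(stats.f.cdf(f, dfx, dfy), stats.f.sf(f, dfx, dfy))
    return f, min(p, 1.0)

# Hypothetical head angles (degrees) for body-free vs body-fixed flies
rng = np.random.default_rng(0)
head_free = rng.normal(0, 2.0, 500)
head_fixed = rng.normal(0, 6.0, 500)
print(f_test_variances(head_free, head_fixed))
```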
To quantify the performance of head responses, we measured the closed-loop transforms from the visual perturbation to the head response in body-free and body-fixed flies. These transforms mirror Equation 5 and Equation 6, respectively, but normalize the head response with respect to the visual perturbation. While these transforms are complex-valued functions, we represented them graphically via gain, phase, and compensation error (Figure 4B). Gain represents the magnitude of the ratio of the head response to the perturbation, phase represents the difference in their timing, and compensation error describes the normalized magnitude of the sensory error signal. A compensation error value of zero indicates ideal performance, a value between zero and one indicates intermediate performance, a value of one indicates that the head response has no effect on performance, and a value greater than one indicates a deleterious response (see Materials and methods). Body-fixed flies operated with overall higher gain than body-free flies (Figure 4B, Figure 4—figure supplement 1A, purple vs blue). Both body-free and body-fixed flies displayed a phase lead (phase >0) at low frequencies that decreased with increasing frequency, although this was less pronounced in body-fixed flies (Figure 4B, Figure 4—figure supplement 1A). The larger gain and smaller phase lead in body-fixed flies led to improved performance at low frequencies, but worse performance at higher frequencies, illustrating that body fixation leads to tradeoffs in head stabilization performance (Figure 4A, Figure 4—figure supplement 1A, compensation error). Altogether, the magnitude, timing, and performance of head motor responses were distinct between body-fixed and body-free flies, demonstrating the critical role of sensory feedback in shaping head movements.
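In terms of the illustrative notation above, these performance metrics can be written as follows (our formulation of the definitions given in the text, evaluated at each stimulus frequency):

$$\text{gain} = \left|\frac{h}{r}\right|, \qquad \text{phase} = \angle\!\left(\frac{h}{r}\right), \qquad \text{compensation error} = \left|\frac{e}{r}\right| = \frac{\lvert r - (h + b)\rvert}{\lvert r\rvert},$$

with $b = 0$ for body-fixed flies.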
Visual feedback changes the tuning of head responses across visual motion frequencies
Body-free and body-fixed flies clearly exhibit distinct head responses (Figure 3), but what are the individual contributions of visual and nested mechanosensory feedback underlying these differences? To address this question, we used our mathematical model of gaze stabilization (Equation 5) combined with behavioral measurements to predict how visual and mechanosensory feedback individually influence head control.
First, we measured the visual transform, that is, the transform between the sensory error and the head motor response in body-fixed flies. We then substituted this measured transform (see Figure 5C for its visualization) into Equation 5 while setting the mechanosensory transform to zero (indicating no contributions of mechanosensory feedback due to body motion) to predict the effects of body visual feedback on the head motor response (Equation 7; Figure 4—figure supplement 2A).
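Setting the mechanosensory transform to zero in the Equation 5 sketch above gives one consistent reconstruction of Equation 7, in which the measured closed-loop body response of body-free flies enters as the ratio $b/r$:

$$\left.\frac{h}{r}\right|_{\mathrm{vis}} = \frac{H_{\mathrm{vis}}}{1 + H_{\mathrm{vis}}}\left(1 - \frac{b}{r}\right) \quad \text{(Equation 7)},$$

where $H_{\mathrm{vis}}$ is measured in body-fixed flies and $b/r$ is the closed-loop body transform of body-free flies (Figure 4A).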

Nested mechanosensory feedback damps head movements.
(A) The control diagram of the sensory error to head transform in body-free flies. Note that this transform includes nested mechanosensory feedback from body motion. (B) The control diagram of the sensory error to head transform in body-fixed flies, which is simply the visual transform. (C) The gain and phase of the sensory error to head transform for body-free (blue) and body-fixed (purple) flies. Shaded regions: ±1 STD. (D) The ratio of the sensory error to head transforms in body-free and body-fixed flies. If nested mechanosensory feedback from body motion had no effect, we would expect this ratio to have a gain of one and a phase of 0° (dashed blue lines). The empirical data have a gain less than one, indicating that head movements are damped by nested mechanosensory feedback. Shaded regions: ±1 STD. Body-free: flies, Body-fixed: flies.
Using Equation 7, we generated a prediction of the closed-loop head transform with body visual feedback. Body visual feedback could partially account for the decrease in magnitude and the increase in phase in body-free flies (Figure 4C, copper). Notably, head gain at low frequencies shifted from higher gain in body-fixed flies (Figure 4B, purple) to lower gain when body visual feedback was introduced (Figure 4C, copper), which more closely matched the body-free head response (Figure 4B, blue). These visually mediated changes in the head response closely followed the inverse of the body compensation error (Figure 4A, red), meaning that the more the body reduced sensory error, the smaller the head magnitude became in body-free flies. These results demonstrate that body visual feedback changes the tuning of head responses from broadband (high gain at all frequencies) to high-pass (high gain only at higher frequencies) (Figure 4C, copper vs. purple). Interestingly, body visual feedback could not account for the overall decrease in head gain in body-free flies (Figure 4A, copper vs blue; see Figure 4—figure supplement 3 for both plots overlaid), suggesting that other sensory modalities must influence head control.
We confirmed our prediction from Equation 7 by performing a ‘replay’ experiment, wherein we designed a new visual perturbation for body-fixed flies that had the mean body response of body-free flies subtracted (Figure 4—figure supplement 2C-E). In this way, we experimentally reintroduced body visual feedback in body-fixed flies. Due to limitations in the spatial resolution of our visual display, we could not properly replay body motion at the two highest frequencies, thus we excluded them. The match between our model prediction (Figure 4C, copper) and experimental data (Figure 4C, grey) strongly supports the notion that body visual feedback accounts for the change in tuning across visual motion frequencies—but not the overall decrease in magnitude—of head responses between body-free and body-fixed flies. Furthermore, the close match between model and experiments provides some assurance that the head control system can be modeled with LTI assumptions, thus supporting our LTI-based control theoretic framework.
Nested mechanosensory feedback damps head movements
If body visual feedback alone cannot fully account for the difference in head motor responses, then it would follow that mechanosensory feedback due to body motion plays a role in shaping head movements. Body visual feedback predicts that body-free flies should have larger head responses at the highest frequency (Figure 4C) because body movements increase the sensory error in this range (Figure 4A, red, compensation error >1), but this was not observed in our experiments, suggesting that there are other sensory modalities at play (Figure 3C–D, Figure 4B). Prior work showed that flying flies mounted on a motor and rotated about the vertical axis perform compensatory head movements in the opposite direction of the body, even when visual feedback is removed, pointing to the role of haltere-generated mechanosensory feedback (due to body motion) in head control (Sandeman, 1980). However, the individual contributions of visual and mechanosensory feedback remain unclear in the control of head movement, particularly due to their nested architecture.
To estimate the contributions of mechanosensory feedback on head control, we compared the transform from the sensory error to the head response in body-free (Figure 5A) and body-fixed (Figure 5B) flies. In body-fixed flies, this transform is purely mediated by visual inputs and is equal to the visual transform, but in body-free flies there is nested mechanosensory feedback that could shape the head response. We discovered that the gain of the body-free transform was substantially lower, and its phase subtly larger, than that of the body-fixed transform, suggesting that nested mechanosensory feedback due to body motion has a transformative influence on head control (Figure 5C). By computing the ratio of the body-free to the body-fixed transform, we discovered that the overall gain from the sensory error to the head response decreased by a factor of ~0.4 and the phase increased by ~20° (Figure 5D, Figure 4—figure supplement 1B). This ratio describes the effective weighting of nested mechanosensory feedback with respect to visual feedback. These results strongly suggest that the neural signals generated from mechanosensory pathways serve to actively damp head movements. Although this analysis does not isolate the precise sensory mechanism (halteres, antennae, wing proprioceptors, etc.), our findings strongly suggest that the observed change in head control is driven by a mechanosensory modality that measures body motion, thus strongly implicating the halteres (Dickinson, 1999; Sherman and Dickinson, 2003).
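For reference, a non-parametric transform (complex gain) at a single stimulus frequency, and the ratio of two such transforms, can be estimated from time series as sketched below. The variable names, the single-frequency estimator, and the synthetic signals are our own simplification, not the authors' analysis pipeline.

```python
# Illustrative estimation of sensory error-to-head transforms and their ratio.
import numpy as np

def transform_at(f_stim, input_sig, output_sig, fs):
    """Complex ratio output/input at the stimulus frequency f_stim (Hz)."""
    freqs = np.fft.rfftfreq(len(input_sig), d=1.0 / fs)
    k = np.argmin(np.abs(freqs - f_stim))           # FFT bin nearest the stimulus
    return np.fft.rfft(output_sig)[k] / np.fft.rfft(input_sig)[k]

# Hypothetical 10 s trials sampled at 100 Hz with a 2 Hz stimulus
fs, f_stim = 100.0, 2.0
t = np.arange(0, 10, 1 / fs)
error = np.sin(2 * np.pi * f_stim * t)                   # sensory error (stand-in)
head_free = 0.3 * np.sin(2 * np.pi * f_stim * t - 0.5)   # body-free head response
head_fixed = 0.8 * np.sin(2 * np.pi * f_stim * t - 0.2)  # body-fixed head response

G = transform_at(f_stim, error, head_free, fs)    # error-to-head, body-free
H = transform_at(f_stim, error, head_fixed, fs)   # error-to-head, body-fixed
ratio = G / H
print(abs(ratio), np.degrees(np.angle(ratio)))    # gain ratio and phase difference
```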
Similar to how we could predict the contributions of body visual feedback on the closed-loop head response, we used Equation 5 to generate a mathematical prediction of the head response with body mechanosensory feedback (Equation 8; Figure 4B).
However, it was not possible to measure the mechanosensory transform directly from our experimental data, because visual feedback was always present. Therefore, we derived an expression equivalent to Equation 8 with the measured body-free sensory error to head transform in place of the mechanosensory transform (Equation 9; see Materials and methods).
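One reconstruction consistent with the model sketched above (an assumption on our part, not the published form): removing body visual feedback leaves the error as the perturbation minus head motion only, $e = r - h$, so that

$$\frac{h}{r} = \frac{H_{\mathrm{vis}} + H_{\mathrm{mech}}\,(b/r)}{1 + H_{\mathrm{vis}}} \quad \text{(Equation 8)},$$

and substituting the measurable body-free sensory error to head transform $G$ (defined by $h = G\,e$, which implies $H_{\mathrm{mech}} = (G - H_{\mathrm{vis}})\,e/b$) gives

$$\frac{h}{r} = \frac{H_{\mathrm{vis}} + (G - H_{\mathrm{vis}})\,(e/r)}{1 + H_{\mathrm{vis}}} \quad \text{(Equation 9)},$$

where $b/r$ and $e/r$ are measured in body-free flies and $H_{\mathrm{vis}}$ in body-fixed flies.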
Our prediction of the head response with mechanosensory feedback due to body motion also partially accounted for the increase in head movement magnitude and decrease in phase in body-fixed flies but, similarly to body visual feedback, could not fully account for the difference (Figure 4D, Figure 4—figure supplement 1A, cyan vs blue). Notably, mechanosensory feedback due to body motion led to an overall decrease in head gain across visual motion frequencies in body-free flies (Figure 4D, Figure 4—figure supplement 1A), whereas visual feedback primarily attenuated low-frequency visual motion and changed the tuning of the head response from broadband to high-pass (Figure 4C, Figure 4—figure supplement 1A). While prior experiments in body-fixed flies mounted on a motor showed that mechanosensory information from the halteres primarily mediates high-frequency steering responses (Sherman and Dickinson, 2003), our results strongly suggest that haltere feedback due to body motion has a considerable influence even at lower frequencies. This emergent low-frequency response is likely a property of closed-loop dynamics (due to the body being free to move) that would not be evident in open-loop (body-fixed) conditions. Altogether, our results reveal the precise roles of body-generated visual and mechanosensory feedback in shaping head movement: visual feedback changes the tuning from broadband to high-pass and mechanosensory feedback reduces the overall magnitude.
Head damping is present during self-generated but not externally generated body motion
Our findings strongly suggest that the change in the sensory error to head transform is primarily brought about by a mechanosensory pathway from body motion to the head. In body-free flies, where the body is free to rotate, this pathway shapes head responses via nested sensory feedback (Figure 5). Although it was impossible to measure the mechanosensory transform directly in body-free or body-fixed flies because head visual feedback was always present, our control framework (Figure 2A) allowed us to estimate it and make a prediction of how nested mechanosensory feedback influences head control. We solved for the mechanosensory transform from Equation 5 (Equation 10),
which can equivalently be represented in terms of the measured body-free sensory error to head transform (Equation 11).
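Under the illustrative notation above, one consistent reconstruction is

$$H_{\mathrm{mech}} = \frac{(1 + H_{\mathrm{vis}})\,h - H_{\mathrm{vis}}\,(r - b)}{b} = \frac{h - H_{\mathrm{vis}}\,e}{b} \quad \text{(Equation 10)},$$

and, writing the body-free sensory error to head transform as $G = h/e$,

$$H_{\mathrm{mech}} = \left(G - H_{\mathrm{vis}}\right)\frac{e}{b} \quad \text{(Equation 11)},$$

where $H_{\mathrm{vis}}$ is measured in body-fixed flies and $h$, $e$, and $b$ are measured in body-free flies.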
Our prediction of the mechanosensory transform exhibited low gain at low frequencies, but gain swiftly increased with increasing frequency, consistent with previous work showing that the halteres are most sensitive to high body frequencies/angular velocities in open-loop (Figure 6A, pink) (Dickinson, 1999; Sherman and Dickinson, 2003). The shape of this transform resembled a high-pass filter, suggesting that the halteres may be primarily tuned to angular acceleration in open-loop (Sandeman, 1980). This result likely explains why prior studies in body-fixed (open-loop) flies reported that the halteres encode body velocity like a high-pass filter—they were measuring the mechanosensory pathway without self-generated sensory feedback (Sherman and Dickinson, 2003). The high gain may seem counterintuitive—given that we argue that mechanosensory feedback actually decreases the magnitude of head movements (Figure 4D)—however, the phase response between –70° and –200° (average –119°) means that the head steering response elicited by mechanosensory feedback destructively interferes (opposite direction) with the visually elicited head steering response, leading to an overall decrease in the magnitude of head movements in body-free flies. An alternate interpretation of these data is that mechanosensory information is subtracted from, rather than added to (as in Figure 3A), visual information in the nervous system (i.e., negative feedback).
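As an illustrative calculation (our worked example, not a value reported in the data): if the mechanosensory contribution arrives with relative gain $a$ and relative phase $\varphi$ with respect to the visual contribution, the magnitude of their sum is

$$\left|1 + a\,e^{j\varphi}\right| = \sqrt{1 + 2a\cos\varphi + a^{2}},$$

which is less than 1 whenever $\cos\varphi < -a/2$. With $\varphi \approx -119°$ ($\cos\varphi \approx -0.48$), the sum is attenuated for any relative gain $a \lesssim 0.97$, so even a high-gain mechanosensory pathway reduces the net head response.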

Head damping is present during self-generated but not externally generated body motion.
(A) The predicted transforms from body mechanosensory information to the head response for self-generated body motion (pink) and externally generated body motion (blue). Shaded regions: ±1 STD. (B) The control framework outlining how externally generated body motion influences the head response via mechanosensory feedback. Note that head visual feedback is still present even if there is no external visual perturbation since the head is free to move. (C) Experimental equivalent to (B). Flies were mounted to the shaft of a stepper motor and the body motion measured in body-free flies was replayed on the motor. Note that the visual display was also mounted to the motor shaft, effectively removing body visual feedback, while leaving mechanosensory feedback intact. (D) The head response (blue) of flies during the experiment where body motion (red) was replayed on the motor. Thin blue lines show the response of individual flies. Thick grey line shows the mean passive head response of an anesthetized fly to the same replayed body motion. Also see Figure 6—figure supplement 1. (E) Coherence for the visual transform from the visual perturbation to the head response in body-free (blue) and body-fixed (violet) flies compared to the mechanosensory transform from body motion to the head response measured in the motor experiment. Note that the mechanosensory transform has much lower coherence, indicative of an uncoordinated response. Shaded regions: ±1 STD. (F) The distribution of all active head displacements (blue) compared to the distribution of all passive head displacements from the motor experiment (grey). Motor experiments: flies.
An interesting idea to consider is that the damping due to mechanosensory feedback we uncovered is only present during self-generated (i.e., nested) rather than externally generated body motion (e.g. from a gust of wind). The yaw flight axis is inherently stable—as opposed to pitch—so flies may only require mechanosensory feedback during self-generated yaw turns, where they need to damp out their own motion (Taha et al., 2020; Faruque and Sean Humbert, 2010a; Faruque and Sean Humbert, 2010b). Although this idea mainly applies to the control of body movements, the head may be controlled similarly. To this end, we designed an experiment where we imposed externally generated body motion with no body visual feedback to uncover the mechanosensory transform for externally generated, rather than self-generated, body motion (Figure 6B). We mounted rigidly tethered flies to the shaft of a stepper motor and replayed the recorded body motion of body-free flies while measuring their head responses (Figure 6C). Crucially, the visual display was also fixed to the motor shaft, so as to remove any visual feedback generated from body motion.
Intriguingly, the head response to externally generated body motion was small and generally uncoordinated with body motion (Figure 6D, Figure 3—video 4–Figure 6—video 2). We computed the coherence—a measure of linear correlation in the frequency domain where values near one indicate high correlation and values near zero indicate low correlation—between the externally generated body motion and the head response and found that the head operated with a coherence of ~0.5. Compared to the head responses driven by visual motion—which operated with coherence near 1—our results demonstrate that flies do not have a robust head response to externally generated body motion about the yaw axis (Figure 6C), corroborating previous work that measured wing movements (Sherman and Dickinson, 2003). We computed the mechanosensory transform from these experiments using our control framework (Equation 12).
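In the motor experiment the display rotates with the body, so imposed body motion produces no retinal slip and the only visual input comes from the fly’s own head motion ($e = -h$ in the illustrative notation above); the Equation 3 sketch then gives one consistent reconstruction of Equation 12:

$$H_{\mathrm{mech}} = \frac{(1 + H_{\mathrm{vis}})\,h}{b_{\mathrm{ext}}} \quad \text{(Equation 12)},$$

where $b_{\mathrm{ext}}$ is the externally imposed body motion.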
We compared this transform to the one computed for self-generated body motion (Equation 10–Equation 11). We discovered that the transform for externally generated body motion displayed gains nearly an order of magnitude smaller than the transform for self-generated body motion (Figure 6A). The phase estimates were highly variable due to the low-coherence response (Figure 6A). These findings strongly suggest that mechanosensory information is integrated in a nonlinear fashion that depends on the type of body motion: externally generated vs. self-generated. The precise mechanism that underlies the gating is unclear, although it is likely that self-generated turns evoke mechanosensory-dependent activity of muscles within the neck motor system (Huston and Krapp, 2009).
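For reference, the coherence measure described above can be estimated with Welch-style spectral averaging; a minimal sketch with synthetic signals is shown below (the sampling rate, segment length, and noise levels are assumptions, not experimental values).

```python
# Illustrative magnitude-squared coherence between imposed body motion and the
# measured head response; not the authors' analysis code.
import numpy as np
from scipy.signal import coherence

fs = 100.0                                   # sampling rate (Hz), hypothetical
t = np.arange(0, 20, 1 / fs)
body_ext = np.sin(2 * np.pi * 2.0 * t)       # externally imposed body motion (stand-in)
rng = np.random.default_rng(1)
head = 0.1 * np.sin(2 * np.pi * 2.0 * t - 1.0) + 0.2 * rng.standard_normal(t.size)

f, Cxy = coherence(body_ext, head, fs=fs, nperseg=1024)
print(Cxy[np.argmin(np.abs(f - 2.0))])       # coherence near the 2 Hz drive frequency
```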
Mechanical properties of the neck joint prevent passively generated head motion
To ensure that the head responses we measured in flies mounted on the motor and in the magnetic tether (where the body also moves the same way) were elicited by sensory feedback—not generated mechanically from body motion—we repeated the same experiment illustrated in Figure 6C, but for anesthetized flies. This approach allowed us to isolate any passively generated head movements due to body motion that were not under active neural control. We found that passively generated head movements were much smaller than head movements of actively flying flies (Figure 6D, grey vs blue, Figure 6—videos 4–6). The head rarely moved more than 0.5° in anesthetized flies, compared to 2° in active flies, demonstrating that sensory feedback is the primary driver of head movements (Figure 6F). This was consistent for a sum-of-sines replay experiment (Figure 6—figure supplement 1). Interestingly, the passive mechanics of the neck joint (stiffness, damping, etc.) effectively decoupled the head from the body, which could simplify the neural control of head movements because flies would not have to account for passive head motion (Cellini et al., 2021). Neck passive mechanics could also help keep the head stable during rapid turns or large external perturbations such as turbulent gusts of wind.
Head saccades are actively damped by mechanosensory feedback
Our results thus far strongly implicate mechanosensory feedback due to body motion in damping smooth head responses during self-generated, but not externally generated turns. However, in addition to the smooth head and body movements during gaze stabilization, flies also perform rapid, saccadic turns of the head and body (Cellini and Mongeau, 2020b; Cellini et al., 2021; Mongeau and Frye, 2017; Collett and Land, 1975; Bender and Dickinson, 2006b; Muijres et al., 2015). Prior studies suggest that, once executed, body saccades are visually open-loop, as body saccade duration is on the same order as visuomotor delays and altering visual feedback during body saccades does not change their dynamics (Bender and Dickinson, 2006a; Mongeau and Frye, 2017). However, mechanosensory feedback is thought to play a role in eliciting the wing-braking response to terminate body saccades (Cellini and Mongeau, 2020b; Bender and Dickinson, 2006a). Head saccades are thought to be similarly visually open-loop (Kim et al., 2017; Cellini et al., 2021). However, as prior work has shown that visual and mechanosensory inputs converge at the neck motor center (Milde et al., 1987; Strausfeld and Seyan, 1985), we hypothesized that mechanosensory feedback due to body motion also influences head saccade dynamics. Specifically, due to the damping effects of mechanosensory feedback we uncovered during self-generated body motion, we predicted that head saccades in body-free flies should be of smaller magnitude than in body-fixed flies.
To test this hypothesis, we culled head saccades from body-free and body-fixed flies presented with a static visual stimulus using a previously described method (Figure 7A–B, Figure 7—video 1, Cellini et al., 2021; Mongeau and Frye, 2017; Salem et al., 2020). Consistent with our prediction, head saccades in body-free flies displayed smaller amplitude and peak velocity than head saccades in body-fixed flies, suggesting that mechanosensory feedback damps head saccades (Figure 7C–D), as it does for whole-body saccades (Bender and Dickinson, 2006a). Interestingly, head saccades in body-free flies were also immediately followed by a head movement that returned the head to the neutral position (Figure 7C). However, this return head movement was absent, or much slower, in body-fixed flies, suggesting that mechanosensory feedback plays an important role in terminating, or braking, head saccades (Figure 7C). By fitting a decaying exponential (total damping time constant) to the head trajectory immediately after the head saccade responses, we discovered that body-fixed flies took ~8 times longer to return to baseline than body-free flies (Wilcoxon rank sum, p < 0.001) (Figure 7D). Interestingly, during the return head movement in body-free flies, the body was still in motion (Figure 7C), suggesting that body-generated feedback, or lack thereof, is the mechanism driving this difference in behavior. Because visual sensory feedback has little effect on saccade dynamics (Bender and Dickinson, 2006a), this damping of head saccades is likely driven by nested mechanosensory feedback—although some degree of passive (mechanical) damping is likely present as well (Cellini et al., 2021). We found that head saccades performed in dark visual conditions followed similar trajectories, supporting the notion that mechanosensory, not visual, feedback mediates head saccade damping (Figure 7—figure supplement 1). Intriguingly, the decrease in damping in body-fixed flies could also explain why wing saccades last upwards of 500ms in body-fixed flies, while body saccades in free flight typically last only 50–100ms (Cellini and Mongeau, 2020b). While prior work has demonstrated that local haltere and wing muscle proprioceptive feedback influences visuomotor gain (Kathman and Fox, 2019; Mureli and Fox, 2015; Mureli et al., 2017; Bartussek and Lehmann, 2016; Lehmann and Bartussek, 2017), it is unlikely that this mechanism could explain the attenuated saccade dynamics, due to saccades being visually open-loop (Bender and Dickinson, 2006a). Overall, our findings strongly suggest that nested mechanosensory feedback has a significant influence on the control of both smooth head movements and head saccades in flies.
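The damping time constant described above can be obtained by fitting a single decaying exponential to the post-saccade head trajectory. The sketch below illustrates such a fit; the sampling rate, data, and single-exponential form are assumptions for the example, not the authors' exact fitting procedure.

```python
# Illustrative fit of a decaying exponential to the post-saccade head trajectory.
import numpy as np
from scipy.optimize import curve_fit

def post_saccade(t, amplitude, tau):
    """Head angle relaxing back toward neutral with time constant tau (s)."""
    return amplitude * np.exp(-t / tau)

# Hypothetical head angle (deg) sampled at 200 Hz after saccade offset
fs = 200.0
t = np.arange(0, 1.0, 1 / fs)
rng = np.random.default_rng(2)
head = 8.0 * np.exp(-t / 0.05) + 0.2 * rng.standard_normal(t.size)

(amp_hat, tau_hat), _ = curve_fit(post_saccade, t, head, p0=(head[0], 0.1))
print(f"fitted time constant: {tau_hat * 1000:.1f} ms")
```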

Head saccades are actively damped by mechanosensory feedback.
(A) Example body (red) and head (blue) trajectories for a body-free fly in the magnetic tether presented with a static visual stimulus. Rapid flight turns called saccades are highlighted. Note that head saccades are followed by a head movement that returns the head to the center position. Also see Figure 7—video 1. (B) Same as (A) but for a body-fixed fly. Head movements are shown in purple. Note that head saccades are not followed by a return head movement. (C) Left y-axis: averaged head saccade displacement (top) and velocity (bottom) for body-free and body-fixed flies. Right y-axis: averaged body saccade displacement (top) and velocity (bottom). Note that saccades typically last less than 200ms (bold portion of head and body trajectories indicate saccades), but an extra second of data is shown to illustrate the difference between the body-free and body-fixed head movements after a saccade. Inset shows the first 200ms of head and body trajectories. Shaded regions: ±1 STD. (D) Distributions of head saccade amplitude, peak velocity, duration, and damping time constant. The damping time constant was computed by fitting a decaying exponential to the head response directly after a saccade. ***Wilcoxon rank sum and t-test, p < 0.001. Body-free: flies, saccades, Body-fixed: flies, saccades. (E) Proposed neural architecture for haltere-related damping of head movements for self-generated vs. externally generated body motion. When body motion is self-generated, head and body motor commands are sent in parallel with an efferent signal, effectively closing a gate that allows mechanosensory feedback due to body motion to damp head movements. When body motion is externally generated, this gate is open and body motion has little effect on head movements (Figure 6A and D).
Discussion
We developed a mathematical model of gaze stabilization that accounted for the role of visual feedback and nested mechanosensory feedback in mediating head responses in flies (Figure 2). Our model predicted differences in head responses between body-free and body-fixed flies based on changes in sensory feedback, which we confirmed with experimental data (Figure 3). We revealed that visual feedback influenced the frequency tuning of head movements, whereas nested mechanosensory feedback due to body motion reduced the overall magnitude of head responses during smooth movements and saccades via active damping (Figures 4—7). By comparing head responses during self-generated and externally generated body motion, we uncovered a nonlinear gating of body-generated mechanosensory feedback on head movements influenced by self-motion. Overall, our findings unravel multisensory integration within nested sensory feedback loops in insect flight. We provide a framework amenable to study nested biological feedback loops across phyla.
Change in head movements between body-free and body-fixed flies is an emergent property of reflexive feedback
We discovered that body-fixed flies exhibited exaggerated head movements compared to body-free flies, which mirrors their exaggerated wing movements (Fry et al., 2005). At face value, it might appear that body-fixed flies are adapting to the lack of stabilizing body movements by moving their head with larger magnitude. However, such a mechanism implies that flies must learn and change their neural controllers (Figure 2A–B) to compensate for body fixation. Instead, our results support the notion that visual feedback underlies these changes and enables flies to partially compensate for body fixation. In other words, the larger head movements observed in body-fixed flies are an emergent property of reflexive feedback. In essence, the increase in sensory error due to body fixation (Figure 3) elicits a larger head motor response immediately, without requiring flies to learn a new neural controller. In this way, flies have some degree of built-in redundancy in their gaze stabilization system. An emergent property of this type of system is robustness to changes in the dynamics of one of the ‘motors’ (head or body). For example, wing damage is a common injury experienced by insects which could impair their ability to stabilize gaze via body movements during flight (Rajabi et al., 2020). The change in visual feedback following wing damage could enable insects to rapidly compensate with their head, rather than learn how to stabilize gaze with a damaged wing (Muijres et al., 2017). Indeed, we show that the head’s performance improves at low frequencies when the body is fixed (Figure 5B–C). This idea is further supported by behavioral evidence in primates, where eye movements in monkeys increase in magnitude to compensate for sudden head fixation with no detectable change in gaze (head + eye) velocity (Bizzi, 1981; Lanman et al., 1978). Tuned sensory feedback can be preferable to learning because sensory feedback acts on the order of neural latency and does not require animals to learn new strategies. However, sensory feedback and learning are not mutually exclusive and flies likely exhibit both (Wolf et al., 1992).
Nested mechanosensory feedback actively damps head movements
Mechanosensory feedback plays an important role in flight stability. Flies with bilateral haltere ablations or immobilized halteres cannot achieve stable flight (even in magnetically tethered assays), directly implicating mechanosensory feedback from the halteres in flight stabilization (Ristroph et al., 2013). For the body, it appears that the low (~5ms) sensory delays associated with the halteres act synergistically with the slower (~30ms) visual system (Sherman and Dickinson, 2003; Nakahira et al., 2021), thereby permitting larger visual gains (Elzinga et al., 2012). In hawk moths, mechanosensory feedback from the antennae is nested within visual feedback during flower tracking, which is critical for high-frequency performance (Dahake et al., 2018). This suggests that the structure of visuo-mechanosensory integration may be a preserved feature of insect flight. Visuo-mechanosensory integration also likely explains why wing responses are exaggerated in body-fixed flight, because mechanosensory feedback is not present to actively damp out steering responses (Elzinga et al., 2012; Taylor et al., 2008). However, the role of mechanosensory feedback and stability is less clear when considering the control of head movements. Indeed, the biomechanics of the head-neck system are inherently stable (Cellini et al., 2022; Cellini et al., 2021), so what is the role of mechanosensory feedback in the head control system?
About the roll and pitch axes, head movements serve to maintain level gaze by offsetting variations in body roll and pitch (Hardcastle and Krapp, 2016; Hengstenberg, 1984). However, we show that this is largely not the case about the yaw axis during externally generated body movements (Figure 6), suggesting that there is another mechanism at play. We discovered that mechanosensory feedback actively damped head movements in body-free flies (Figure 5, Figure 6), similar to the proposed active damping of wing movements (Elzinga et al., 2012). The active damping of head movements decreased head excursions, and occurrences of head saturation were reduced to near zero (Figure 3C and D). An interesting possibility is that mechanosensory feedback from body movements may act as a centering reflex to keep the head aligned relative to the thorax and thus prevent the head from reaching large angular excursions. Indeed, gaze stabilization quickly degrades as the head reaches its anatomical limits (Cellini et al., 2021). Another potential explanation is that the effects of mechanosensory feedback on head control are simply a result of coupled neural pathways between body control and head control. The descending sensorimotor pathways associated with the head and body have some overlap, suggesting that similar information—such as damping commands from the halteres—could be shared between the head and body (Namiki et al., 2018). Revealing the precise role of mechanosensory feedback will require analyses of the neural pathways associated with head and body control.
Nested proprioception across phyla
Our work in flies shows that sensing body motion via mechanosensory feedback has a transformative influence on a task that is driven by visual inputs (optomotor response). But similar proprioceptive mechanisms exist across phyla, and nested proprioception is likely a prevalent feature of animal locomotion. Many locomotor tasks in which an animal’s whole body moves in response to some external stimulus—such as flower tracking in moths (Sponberg et al., 2015; Roth et al., 2016), refuge tracking in fish (Roth et al., 2011; Uyanik et al., 2020), and wall following in cockroaches (Cowan et al., 2006; Mongeau et al., 2015)—likely involve proprioceptive feedback. Our framework could be applied to tease apart the role of proprioceptive mechanisms—such as the role of antennae, the vestibular system, or other mechanosensors—in task-level control. For instance, flies appear to use their antennae to damp out their visually guided groundspeed controller in a nested fashion (Fuller et al., 2014a). A comparable experiment in mice or fish, where vestibular feedback from the inner ear is abolished (via chemical labyrinthectomy), could provide insights into how proprioception shapes locomotion in vertebrates (Ito et al., 2019). Altogether, our framework is generalizable for teasing out the role of nested proprioception in a range of animal behaviors.
Distinguishing between self-generated and externally generated body motion
Mechanosensory feedback due to body motion had little influence on head movements when body motion was externally generated as opposed to self-generated (Figure 6). For an LTI system, one would expect the same sensory inputs to lead to the same outputs. However, flies did not follow this expectation, suggesting that motor-related signals or visual feedback nonlinearly gate mechanosensory feedback. We propose a model in which self-generated head/wing steering commands are sent in parallel with a signal that opens a gate to allow mechanosensory information to flow to the neck motor center (Figure 7E). One possible mechanism at the neural level is that flies actively modulate gyroscopic sensing via haltere steering muscles.
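One way to write the proposed gating, as a sketch of the model rather than a fitted description, is to scale the nested mechanosensory term of the Equation 3 sketch by a motor-context gate $\gamma$:

$$h = H_{\mathrm{vis}}\,e + \gamma\,H_{\mathrm{mech}}\,b, \qquad \gamma \approx \begin{cases} 1, & \text{self-generated body motion}\\ 0, & \text{externally generated body motion,} \end{cases}$$

with the gate plausibly set by an efference copy of the steering command (Figure 7E).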
Recent work confirmed that the haltere muscles are actively modulated by visual inputs during flight (Dickerson, 2020). The ‘control-loop’ hypothesis—originally proposed by Chan et al., 1998—suggests that visual inputs modulate haltere muscle activity, which in turn regulates mechanosensory feedback by recruiting haltere campaniform sensilla (Chan et al., 1998; Dickerson et al., 2019). One possibility is that visual inputs could modulate haltere muscle activity and increase the magnitude of gyroscopic inputs, thus leading to damped head dynamics. Due to the body-fixed assays required for electrophysiology, it has not been possible to determine whether gyroscopic inputs are modulated by visual inputs, but our results suggest that this could be the case. Altogether, our findings provide an extension of the control-loop hypothesis for the more specific case of gyroscopic sensing, one that distinguishes between motor contexts (self-generated vs. externally generated body motion).
We cannot discount other mechanisms—such as haltere afferents gating subpopulations of neck motor neurons' responses to visual stimuli—as the integration of visual and mechanosensory information is often nonlinear in insect flight (Sherman and Dickinson, 2004; Huston and Krapp, 2009; Haag et al., 2010; Kathman and Fox, 2019). Alternatively, gating may be modulated by an efference copy during self-generated turning maneuvers. These two hypotheses could not be teased apart here because flies mounted to a motor almost immediately stop flying if visual inputs conflict with the prescribed motor rotation; that is, it was necessary to mount the visual display to the motor shaft for flies to sustain flight, thereby eliminating retinal slip due to externally generated motion, as in prior work (Sherman and Dickinson, 2003).
Neurophysiological evidence for gating of visual and mechanosensory information
The nonlinear gating we described here corroborates neurophysiological data on the influence of haltere tonic inputs on neck motor neurons. Recordings from a subpopulation of neck motor neurons demonstrated that information from the eyes and halteres is combined nonlinearly (Huston and Krapp, 2009). Specifically, some neck motor neurons do not generate action potentials in response to visual motion alone, but will generate action potentials when the halteres are beating simultaneously and providing tonic inputs. Furthermore, the ventral cervical nerve motoneuron (VCNM)—which mediates head control—receives input from visual, haltere, and antennal sensory neurons (Haag et al., 2010). Visual motion alone generates subthreshold activity, but when combined with mechanosensory inputs (antennae or halteres), it causes the VCNM to spike. Notably, the VCNM also integrates a central input reflecting the behavioral state of the fly (flight and non-flight). While the influence of haltere tonic activity on downstream circuits has been characterized, it is at present unclear how gyroscopic inputs from the halteres influence neck motor neurons, primarily due to technical limitations of fixed neurophysiological preparations. An interesting possibility is that some neck motor neurons, in the presence of gyroscopic feedback, could actively brake head movements, thus providing a mechanism for active damping.
Materials and methods
Animal preparation
We prepared flies according to a previously described protocol (Cellini et al., 2022; Cellini and Mongeau, 2020a). Briefly, we cold-anesthetized 3- to 5-day-old female flies (wild-type Drosophila melanogaster) by cooling them on a Peltier stage maintained at ~4°C. Following cold anesthesia, we fixed stainless steel minutien pins (100 µm diameter, Fine Science Tools, Foster City, CA) to the thorax of each fly using UV-activated glue (XUVG-1, Newall). We fixed the pin at an angle of ~30°, consistent with the body's angle of attack in freely flying flies. We allowed ~1 hr for recovery. For the body-free condition, we suspended each fly between two magnets, allowing free rotation about the yaw (vertical) axis (Figure 2C). The pin was fit into a sapphire bearing with a coefficient of friction of ~0.1 (Vee jewel bearing, Bird Precision). The inertia of the pin was less than 1% of the fly's inertia. Further, using an electromagnetic simulation, we previously showed that frictional forces due to the pin-bearing interface are about two orders of magnitude smaller than forces generated in flight (Cellini et al., 2022). Thus flies can readily overcome this friction, as previously shown (Mongeau and Frye, 2017). For rigidly tethered (body-fixed) flies, all preparations were the same except that we fixed flies to tungsten pins (A-M Systems), which were rigidly held in place (Figure 2D). This is in contrast to previous work that instead rigidly tethered flies at a 90° angle with respect to the pin and angled the pin itself to 30°, although this difference has little effect on the head response (Cellini and Mongeau, 2020a; Cellini et al., 2021).
Flight simulator
The virtual reality flight simulator illustrated in Figure 2C–D has been described elsewhere (Mongeau and Frye, 2017; Reiser and Dickinson, 2008). The display consists of an array of light emitting diodes (LEDs, each subtending 3.75° on the eye) that wraps around the fly, subtending 360° horizontally and 60° vertically. We recorded the voltage signal output from the flight arena's visual display with a data acquisition system (Measurement Computing, USB-1208FS-PLUS), which measures the displacement of our prescribed visual perturbation. We used this signal as the input to our model to ensure we accurately quantified what flies were actually seeing during experiments. We placed flies in the center of the arena and provided illumination with an array of twelve 940 nm LEDs below and two 940 nm LEDs above. Body-free and body-fixed flies were examined using the same flight simulator. We recorded video at 100 frames s−1 with an infrared-sensitive camera placed directly below the fly (Basler acA640-750um). We used our custom computer vision software suite, CrazyFly (https://github.com/boc5244/CrazyFly, Cellini, 2021), to analyze body and head kinematics. The tracking algorithms have been described elsewhere (Cellini et al., 2022). In each frame of the recorded videos, we measured body angular position with respect to a global (flight arena) coordinate frame and head angular position relative to the body.
Visual perturbations
We primarily employed seven previously constructed single-sine visual perturbations at frequencies [0.7, 1, 1.5, 2.1, 3.5, 5.3, 10.6] Hz, designed to elicit robust head and body responses across a broad frequency range (Cellini et al., 2022). The amplitude at each frequency was chosen such that the peak velocity was normalized to 250° s−1 (mean speed of 159° s−1). The perturbations can be represented as:
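Consistent with the description above, each perturbation is presumably a single sinusoid whose amplitude follows from the velocity normalization (the symbols $A_i$ and $f_i$ are used here only for illustration):
$$r(t) = A_i \sin\left(2\pi f_i t\right), \qquad A_i = \frac{250^{\circ}\,\mathrm{s}^{-1}}{2\pi f_i},$$
where $f_i$ is one of the seven frequencies listed above. Note that a sinusoid with peak velocity $250^{\circ}\,\mathrm{s}^{-1}$ has a mean speed of $\tfrac{2}{\pi}\times 250 \approx 159^{\circ}\,\mathrm{s}^{-1}$, consistent with the value quoted above.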
As in prior work, we also employed three previously constructed sum-of-sines visual perturbations (Cellini et al., 2022). Briefly, each visual perturbation consisted of a 20-s sum-of-sines signal with nine logarithmically spaced frequency components fi in increments of 0.05 Hz, where no frequency was a prime harmonic of another. The amplitude of each frequency component was chosen such that the velocity of each component was normalized to three values of [42, 70, 95]° s−1, and the phase of each component was randomized. The perturbations can be represented as:
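A plausible form consistent with the description above (with amplitudes $A_i$, frequencies $f_i$, and random phases $\phi_i$; symbols assumed for illustration) is:
$$r(t) = \sum_{i=1}^{9} A_i \sin\left(2\pi f_i t + \phi_i\right), \qquad A_i = \frac{V}{2\pi f_i},$$
where $V \in \{42, 70, 95\}^{\circ}\,\mathrm{s}^{-1}$ is the velocity amplitude of each component for the three perturbations.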
These single-sine and sum-of-sines visual perturbations cover the frequency range of natural scene dynamics that a fly would normally experience in free flight (Kern et al., 2005). All visual perturbations were displayed on our flight simulator as a grating with 30° spatial wavelength. This ensured that our perturbations had a mean temporal frequency of ~5 Hz, near the optimum of the motion vision pathway in Drosophila (Figure 2C–D, Jung et al., 2011).
Non-parametric system identification
Using a previously described method (Cellini et al., 2022), we applied frequency-domain system identification to determine non-parametric frequency-response functions from behavioral data. For a given input (e.g., the visual perturbation or sensory error) and output (e.g., head or body motion) signal, we aimed to determine the relative magnitude (gain) and timing (phase) in the frequency domain. We first detrended and low-pass filtered (cutoff: 40 Hz) each signal in the time domain to remove low-frequency drift and high-frequency noise. We then transformed the input and output signals into the frequency domain using a Chirp-Z transform (Remple and Tischler, 2006; Windsor et al., 2014) at frequency points between 0 and 50 Hz in increments of 0.05 Hz. We divided the resulting complex response of the output signal by the complex response of the input signal, resulting in the frequency-response function describing the transformation between input and output. We made no explicit assumption of linearity, as flies' visuomotor responses tend to be marginally nonlinear (Cellini et al., 2022; Cellini and Mongeau, 2020a). However, the high coherence of our transforms (Figure 4—figure supplement 3), the consistent responses between single-sine and sum-of-sines perturbations (Figure 4—figure supplement 3 vs Figure 4—figure supplement 1), and the close match between our replay experiment and theoretical prediction (Figure 4C, Figure 4—figure supplement 3) suggest that linear techniques can still be used with a high degree of confidence.
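The published analysis code lives in the repositories cited above; the following is only a minimal MATLAB sketch of the steps described in this paragraph, with the sampling rate, filter order, and variable names assumed:

```matlab
% Minimal sketch (not the authors' code) of the frequency-response estimation
% described above. 'stim' and 'head' are column vectors of angular position.
fs = 100;                           % assumed sampling rate (Hz)
f  = 0:0.05:50;                     % frequency grid from the text (Hz)
m  = numel(f);

% 1) Detrend and low-pass filter (40 Hz cutoff) to remove drift and noise
[bf, af] = butter(2, 40/(fs/2));    % filter order assumed
stim_lp  = filtfilt(bf, af, detrend(stim));
head_lp  = filtfilt(bf, af, detrend(head));

% 2) Chirp-Z transform evaluated along the 0-50 Hz grid
w  = exp(-1i*2*pi*0.05/fs);         % ratio between successive contour points
a0 = exp( 1i*2*pi*f(1)/fs);         % starting point of the contour (0 Hz)
STIM = czt(stim_lp, m, w, a0);
HEAD = czt(head_lp, m, w, a0);

% 3) Frequency-response function: complex ratio of output to input
FRF   = HEAD ./ STIM;
gain  = abs(FRF);                   % relative magnitude
phase = rad2deg(angle(FRF));        % relative timing (deg)
```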
We extracted the gain and phase by taking the magnitude and angle of the complex response, respectively. We calculated the compensation error for each closed-loop frequency-response function by computing the distance between the measured response and the perfect compensation condition (gain = 1, phase = 0°) in the complex plane (Cellini et al., 2022; Sponberg et al., 2015). Compensation error can be expressed as:
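Given the description above (the distance in the complex plane from the point gain = 1, phase = 0°), the compensation error presumably takes the form
$$\varepsilon(f) = \left|1 - \hat{H}(f)\right| = \sqrt{\left(1 - g(f)\cos\phi(f)\right)^{2} + \left(g(f)\sin\phi(f)\right)^{2}},$$
where $\hat{H}(f)$ is the closed-loop frequency-response function with gain $g(f)$ and phase $\phi(f)$. An error of 0 indicates perfect compensation, while an error of 1 corresponds to no response.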
We also calculated the coherence of each closed-loop transform using the MATLAB routine mscohere to ensure that head and body movements were sufficiently related to the visual perturbations (Figure 4—figure supplement 3, Figure 4—figure supplement 1).
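Continuing the sketch above (again an assumption, not the authors' code), the coherence on the same frequency grid can be computed as:

```matlab
% Magnitude-squared coherence between stimulus and head response on the grid f
% (5 s Hann window assumed). Values near 1 indicate a strong linear
% relationship between the visual perturbation and the behavior.
cxy = mscohere(stim_lp, head_lp, hann(5*fs), [], f, fs);
```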
Wherever there was saturation in the head response of body-fixed flies (as in Figure 3C, top), we applied saturation-corrected least-squares spectral analysis (LSSA) (Figure 3—figure supplement 2). Briefly, we removed the saturated portion of the data (where velocity was near zero) and fit a sine wave to the remaining un-saturated data (Figure 3—figure supplement 2). We then corrected the gain of any transforms affected by the saturation (the head transforms for body-fixed flies) (Figure 3—figure supplement 2A). To confirm that LSSA itself was not changing our results, we compared all transforms computed with the Chirp-Z transform and with LSSA (without saturation correction). Both methods yielded virtually identical results (Figure 3—figure supplement 2B). Only the lowest two frequencies showed any difference in gain after the saturation-correction routine, and phase was unaffected across all frequencies and all methods (Figure 3—figure supplement 2B).
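A minimal sketch of the saturation-corrected sine fit, under the assumption that the stimulus frequency f0 is known and that a logical mask 'keep' marks the un-saturated samples:

```matlab
% Least-squares fit of a sine at a known stimulus frequency f0 (Hz),
% using only the un-saturated samples (variable names assumed).
t   = (0:numel(head)-1)'/fs;                  % time vector (fs as above)
X   = [sin(2*pi*f0*t(keep)), cos(2*pi*f0*t(keep)), ones(nnz(keep), 1)];
c   = X \ head(keep);                         % least-squares coefficients
amp = hypot(c(1), c(2));                      % saturation-corrected amplitude
phs = atan2(c(2), c(1));                      % phase (rad) of the fitted sine
```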
Uncertainty propagation in frequency response functions
Experimentally measured frequency-response functions—such as the closed-loop transforms for body-free and body-fixed flies shown in Figure 4 and the body-to-head transforms in Figure 5—were measured for each fly, and the mean and standard deviation of the gain, phase, and compensation error were calculated across flies (using circular statistics for phase; Berens, 2009). However, we were not able to apply the same statistical framework to estimate confidence intervals when computing mathematical predictions, such as in Equation 7, because our derived equations combined data sets from different groups of flies. For example, some transforms were measured in body-fixed flies, whereas others were measured in body-free flies. Therefore, we estimated confidence intervals using a propagation of uncertainty analysis, as described in prior work (Roth et al., 2016).
Stepper motor experiments
Flies were rigidly tethered to the shaft of a stepper motor (NEMA 17) (Figure 6C). The stepper motor was controlled by a motor driver (TB6600) with a resolution of 0.225° per step, thus providing smooth motion of the body. We controlled the motor by sending step and direction signals to the driver from a DAQ. We printed a black and white grating with 30° spatial wavelength (matching the grating displayed on our flight simulator) on standard A4 paper. We fixed the grating in a circular pattern to the motor shaft using a custom 3D-printed part (Figure 6C). This ensured that any rotations of the motor—and thus the fly's body—did not induce any visual feedback. We replayed the mean body motion measured from actively flying flies in the magnetic tether (Figure 3A, red) on the motor and measured the corresponding head response for actively flying flies (Figure 6D, blue) and anesthetized flies (Figure 6D, grey). We used triethylamine (commercially available as FlyNap, Carolina Biological Supply) to anesthetize flies. Also see Figure 6—figure supplement 1 for the passive head response to a sum-of-sines perturbation.
Control framework and derivation of closed-loop head responses
We synthesized our control framework based on previous work on the control of head and body movements in flies, where head and body velocity are the state variables (Cellini et al., 2022). However, we made all computations based on head and body displacements, as differentiating both signals (i.e., computing velocities) does not change the mathematical relationship between them (i.e., the relative gain and phase stay the same). Furthermore, numerical differentiation typically amplifies noise (van Breugel et al., 2020). When computing complex-valued transforms, we calculated the gain and phase at each frequency of the visual perturbation and constructed a non-parametric curve consisting of the collection of these gain and phase values. All algebra with these complex-valued transforms was done by converting the gain and phase into a single complex number and substituting into the expressions we derive below.
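For example, a gain and phase (in degrees) measured at the stimulus frequencies can be combined into a single complex number per frequency before substitution into the expressions below (a sketch with assumed variable names):

```matlab
% Convert a measured gain and phase (deg) at each frequency into complex
% frequency-response values so the expressions below can be evaluated directly.
G = gain .* exp(1i*deg2rad(phase));
```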
To derive the expressions for the head response under different sensory feedback conditions, we started by considering the body-free case where all sources of feedback are present. Based on Figure 2A, head motion can be written as the sum of the sensory error multiplied by the visual transform and body motion multiplied by the mechanosensory transform:
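In a form consistent with this description (writing head motion as $h$, body motion as $b$, sensory error as $e$, and the head's visual and mechanosensory transforms as $G_{\mathrm{head,V}}$ and $G_{\mathrm{head,M}}$; symbols assumed for illustration), Equation 16 presumably reads:
$$h = G_{\mathrm{head,V}}\, e + G_{\mathrm{head,M}}\, b. \qquad \text{(Equation 16)}$$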
where the sensory error is equal to the visual perturbation minus the fly's gaze (the sum of head and body motion):
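Continuing with the assumed notation (visual perturbation $r$), Equation 17 presumably reads:
$$e = r - (h + b). \qquad \text{(Equation 17)}$$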
This framework omits the role of neck proprioceptive feedback, as the neck sensory system is intact in both body-free and body-fixed flies. Substituting Equation 17 into Equation 16 yields:
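In the assumed notation, the substitution gives:
$$h = G_{\mathrm{head,V}}\,\bigl[r - (h + b)\bigr] + G_{\mathrm{head,M}}\, b. \qquad \text{(Equation 18)}$$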
where we can solve for head motion to obtain Equation 5. Normalizing by the visual perturbation yields the closed-loop transform from the visual perturbation to the head that we show in Figure 4:
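Solving for $h$ and dividing by $r$ yields, in the assumed notation:
$$\frac{h}{r} = \frac{G_{\mathrm{head,V}} + \left(G_{\mathrm{head,M}} - G_{\mathrm{head,V}}\right)\dfrac{b}{r}}{1 + G_{\mathrm{head,V}}}. \qquad \text{(Equation 19)}$$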
Because the body also has its own associated visual and mechanosensory transforms (Figure 2A), we could remove the body motion term from Equation 5 and Equation 19 by substituting the corresponding body expressions. However, the resulting expression is lengthy and does not provide intuitive insights into the different sources of feedback as Equation 5 and Equation 19 do, so we chose not to include it.
To obtain the body-fixed closed-loop transform corresponding to Equation 6, we set body motion to zero in Equation 19:
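Setting $b = 0$ in the reconstructed Equation 19 gives the familiar single-loop form:
$$\left.\frac{h}{r}\right|_{\text{body-fixed}} = \frac{G_{\mathrm{head,V}}}{1 + G_{\mathrm{head,V}}}.$$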
To obtain the closed-loop transform without body mechanosensory feedback (body and head visual feedback only), we set the head's mechanosensory transform to zero in Equation 19:
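Setting $G_{\mathrm{head,M}} = 0$ in the reconstructed Equation 19 gives:
$$\frac{h}{r} = \frac{G_{\mathrm{head,V}}\left(1 - \dfrac{b}{r}\right)}{1 + G_{\mathrm{head,V}}}.$$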
To obtain the closed-loop transform without body visual feedback (body mechanosensory feedback and head visual feedback only), we modified Equation 17 such that the gaze no longer includes body motion and re-derived Equation 19, effectively removing the second term:
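Under this modification (the error reduced to the visual perturbation minus head motion, $e = r - h$), the same algebra presumably yields, in the assumed notation:
$$\frac{h}{r} = \frac{G_{\mathrm{head,V}} + G_{\mathrm{head,M}}\dfrac{b}{r}}{1 + G_{\mathrm{head,V}}}.$$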
However, there is an interesting trick: we can use the transform from body motion to head motion (see Figure 5A) to simplify this expression and remove the explicit body motion term. This transform can be written, equivalently to Equation 16, as:
By substituting Equation 17 (with the modification above) into Equation 23 and solving for head motion, we obtain Equation 9, which we can normalize by the visual perturbation to obtain:
When making predictions using Equation 7, Equation 9, and Equation 12, we used data-driven methods rather than fitting closed-form transfer function models to the data. Thus, our transforms in Figures 4—6 do not explicitly assume a model order.
Data availability
All code and data are available on Penn State ScholarSphere at this link: https://doi.org/10.26207/qpxv-5v60.
-
Scholarsphere: Unraveling nested feedback loops in insect gaze stabilization (data for manuscript). https://doi.org/10.26207/qpxv-5v60
References
-
BookFeedback Systems: An Introduction for Scientists and EngineersPrinceton University Press.
-
Proprioceptive feedback determines visuomotor gain in DrosophilaRoyal Society Open Science 3:150562.https://doi.org/10.1098/rsos.150562
-
Controlling roll perturbations in fruit fliesJournal of the Royal Society, Interface 12:105.https://doi.org/10.1098/rsif.2015.0075
-
A comparison of visual and haltere-mediated feedback in the control of body saccades in Drosophila melanogasterThe Journal of Experimental Biology 209:4597–4606.https://doi.org/10.1242/jeb.02583
-
Visual stimulation of saccades in magnetically tethered DrosophilaThe Journal of Experimental Biology 209:3170–3182.https://doi.org/10.1242/jeb.02369
-
When is vestibular information important during walking?Journal of Neurophysiology 92:1269–1275.https://doi.org/10.1152/jn.01260.2003
-
CircStat: A MATLAB toolbox for circular statisticsJournal of Statistical Software 31:10.https://doi.org/10.18637/jss.v031.i10
-
BookEye‐Head coordinationIn: Greger R, editors. Comprehensive Physiology. Wiley. pp. 1321–1336.https://doi.org/10.1002/cphy
-
Hybrid visual control in fly flight: insights into gaze shift via saccadesCurrent Opinion in Insect Science 42:23–31.https://doi.org/10.1016/j.cois.2020.08.009
-
Mechanisms of punctuated vision in fly flightCurrent Biology 31:4009–4024.https://doi.org/10.1016/j.cub.2021.06.080
-
Visual control of flight behaviour in the hoverfly Syritta pipiens L. Journal of Comparative Physiology A 99:1–66. https://doi.org/10.1007/BF01464710
-
Task-level control of rapid wall following in the american cockroachThe Journal of Experimental Biology 209:1617–1629.https://doi.org/10.1242/jeb.02166
-
Feedback control as a framework for understanding tradeoffs in biologyIntegrative and Comparative Biology 54:223–237.https://doi.org/10.1093/icb/icu050
-
Timing precision in fly flight control: integrating mechanosensory input with muscle physiologyProceedings. Biological Sciences 287:20201774.https://doi.org/10.1098/rspb.2020.1774
-
Haltere-mediated equilibrium reflexes of the fruit fly, Drosophila melanogasterPhilosophical Transactions of the Royal Society of London. Series B, Biological Sciences 354:903–916.https://doi.org/10.1098/rstb.1999.0442
-
The aerodynamics and control of free flight manoeuvres in Drosophila. Philosophical Transactions of the Royal Society B 371:20150388. https://doi.org/10.1098/rstb.2015.0388
-
The influence of sensory delay on the yaw dynamics of a flapping insectJournal of the Royal Society, Interface 9:1685–1696.https://doi.org/10.1098/rsif.2011.0699
-
Dipteran insect flight dynamics. part 1 longitudinal motion about hoverJournal of Theoretical Biology 264:538–552.https://doi.org/10.1016/j.jtbi.2010.02.018
-
Dipteran insect flight dynamics. part 2: lateral-directional motion about hoverJournal of Theoretical Biology 265:306–313.https://doi.org/10.1016/j.jtbi.2010.05.003
-
Haltere afferents provide direct, electrotonic input to a steering motor neuron in the blowfly, CalliphoraThe Journal of Neuroscience 16:5225–5232.
-
Convergent mechanosensory input structures the firing phase of a steering motor neuron in the blowfly, CalliphoraJournal of Neurophysiology 82:1916–1926.https://doi.org/10.1152/jn.1999.82.4.1916
-
The function of the halteres of flies (diptera)Proceedings of the Zoological Society of London A109:69–78.https://doi.org/10.1111/j.1096-3642.1939.tb00049.x
-
The aerodynamics of hovering flight in DrosophilaThe Journal of Experimental Biology 208:2303–2318.https://doi.org/10.1242/jeb.01612
-
Multisensory systems integration for high-performance motor control in fliesCurrent Opinion in Neurobiology 20:347–352.https://doi.org/10.1016/j.conb.2010.02.002
-
Controlling free flight of a robotic fly using an onboard vision sensor inspired by insect ocelliJournal of the Royal Society, Interface 11:20140281.https://doi.org/10.1098/rsif.2014.0281
-
BookThe Vestibular SystemOxford University Press.https://doi.org/10.1093/acprof:oso/9780195167085.001.0001
-
Central gating of fly optomotor responsePNAS 107:20104–20109.https://doi.org/10.1073/pnas.1009381107
-
Evolution of biological image stabilizationCurrent Biology 26:R1010–R1021.https://doi.org/10.1016/j.cub.2016.08.059
-
BookRoll-stabilization during flight of the blowfly’s head and body by mechanical and visual cuesIn: Varjú D, Schnitzler HU, editors. Localization and Orientation in Biology and Engineering. Berlin, Heidelberg: Springer. pp. 121–134.
-
Mechanosensory control of compensatory head roll during flight in the blowfly Calliphora erythrocephala Meig. Journal of Comparative Physiology A 163:151–165. https://doi.org/10.1007/BF00612425
-
Nonlinear integration of visual and haltere inputs in fly neck motor neuronsThe Journal of Neuroscience 29:13097–13105.https://doi.org/10.1523/JNEUROSCI.2915-09.2009
-
Flight activity alters velocity tuning of fly motion-sensitive neuronsThe Journal of Neuroscience 31:9231–9237.https://doi.org/10.1523/JNEUROSCI.1138-11.2011
-
Representation of haltere oscillations and integration with visual inputs in the fly central complexThe Journal of Neuroscience 39:4100–4112.https://doi.org/10.1523/JNEUROSCI.1707-18.2019
-
When to use cascade controlIndustrial & Engineering Chemistry Research 29:2163–2166.https://doi.org/10.1021/ie00106a033
-
Eye movements in man and other animalsVision Research 162:1–7.https://doi.org/10.1016/j.visres.2019.06.004
-
Neural control and precision of flight muscle activation in DrosophilaJournal of Comparative Physiology. A, Neuroethology, Sensory, Neural, and Behavioral Physiology 203:1–14.https://doi.org/10.1007/s00359-016-1133-9
-
The synergy between neuroscience and control theory: the nervous system as inspiration for hard control challengesAnnual Review of Control, Robotics, and Autonomous Systems 3:243–267.https://doi.org/10.1146/annurev-control-060117-104856
-
The neck motor system of the fly Calliphora erythrocephala. Journal of Comparative Physiology A 160:225–238. https://doi.org/10.1007/BF00609728
-
Sensory processing within cockroach antenna enables rapid implementation of feedback control for high-speed running maneuversThe Journal of Experimental Biology 218:2344–2354.https://doi.org/10.1242/jeb.118604
-
Multimodal integration across spatiotemporal scales to guide invertebrate locomotionIntegrative and Comparative Biology 61:842–853.https://doi.org/10.1093/icb/icab041
-
Body saccades of Drosophila consist of stereotyped banked turnsThe Journal of Experimental Biology 218:864–875.https://doi.org/10.1242/jeb.114280
-
Haltere mechanosensory influence on tethered flight behavior in DrosophilaThe Journal of Experimental Biology 218:2528–2537.https://doi.org/10.1242/jeb.121863
-
Cross-Modal influence of mechanosensory input on gaze responses to visual motion in DrosophilaThe Journal of Experimental Biology 220:2218–2227.https://doi.org/10.1242/jeb.146282
-
The halteres of the blowfly CalliphoraJournal of Comparative Physiology A 173:293–300.https://doi.org/10.1007/BF00212693
-
The halteres of the blowfly Calliphora. II. Three-dimensional organization of compensatory reactions to real and simulated rotations. Journal of Comparative Physiology A 175:695–708. https://doi.org/10.1007/BF00191842
-
Insect wing damage: causes, consequences and compensatory mechanismsThe Journal of Experimental Biology 223:jeb215194.https://doi.org/10.1242/jeb.215194
-
Haltere and visual inputs sum linearly to predict wing (but not gaze) motor output in tethered flying Drosophila. Proceedings. Biological Sciences 288:20202374. https://doi.org/10.1098/rspb.2020.2374
-
A modular display system for insect behavioral neuroscienceJournal of Neuroscience Methods 167:127–139.https://doi.org/10.1016/j.jneumeth.2007.07.019
-
BookAircraft and Rotorcraft System IdentificationReston ,VA: American Institute of Aeronautics and Astronautics.https://doi.org/10.2514/4.861352
-
Active and passive stabilization of body pitch in insect flightJournal of the Royal Society, Interface 10:20130237.https://doi.org/10.1098/rsif.2013.0237
-
Stimulus predictability mediates a switch in locomotor smooth pursuit performance for Eigenmannia virescens. The Journal of Experimental Biology 214:1170–1180. https://doi.org/10.1242/jeb.048124
-
A comparative approach to closed-loop computationCurrent Opinion in Neurobiology 25:54–62.https://doi.org/10.1016/j.conb.2013.11.005
-
Fly eyes are not still: a motion illusion in Drosophila flight supports parallel visual processingThe Journal of Experimental Biology 223:jeb212316.https://doi.org/10.1242/jeb.212316
-
Angular acceleration, compensatory head movements and the halteres of flies (Lucilia serricata). Journal of Comparative Physiology A 136:361–367. https://doi.org/10.1007/BF00657358
-
A comparison of visual and haltere-mediated equilibrium reflexes in the fruit fly Drosophila melanogasterThe Journal of Experimental Biology 206:295–302.https://doi.org/10.1242/jeb.00075
-
Summation of visual and mechanosensory feedback in Drosophila flight controlThe Journal of Experimental Biology 207:133–142.https://doi.org/10.1242/jeb.00731
-
Convergence of visual, haltere, and prosternal inputs at neck motor neurons of Calliphora erythrocephala. Cell and Tissue Research 240:601–615. https://doi.org/10.1007/BF00216350
-
Dynamic modulation of visual and electrosensory gains for locomotor controlJournal of the Royal Society, Interface 13:118.https://doi.org/10.1098/rsif.2016.0057
-
New experimental approaches to the biology of flight control systemsThe Journal of Experimental Biology 211:258–266.https://doi.org/10.1242/jeb.012625
-
Vision-based flight control in the hawkmoth Hyles lineata. Journal of the Royal Society, Interface 11:20130921. https://doi.org/10.1098/rsif.2013.0921
-
Can a fly ride a bicycle?Philosophical Transactions of the Royal Society of London. Series B 337:261–269.https://doi.org/10.1098/rstb.1992.0104
Decision letter
-
Stephanie E Palmer, Reviewing Editor; University of Chicago, United States
-
Ronald L Calabrese, Senior Editor; Emory University, United States
In the interests of transparency, eLife publishes the most substantive revision requests and the accompanying author responses.
Decision letter after peer review:
Thank you for submitting your article "Unraveling nested feedback loops in insect gaze stabilization: Mechanosensory feedback actively damps visually guided head movements in fly flight" for consideration by eLife. Your article has been reviewed by 3 peer reviewers, one of whom is a member of our Board of Reviewing Editors, and the evaluation has been overseen by Ronald Calabrese as the Senior Editor. The reviewers have opted to remain anonymous.
The reviewers have discussed their reviews with one another, and the Reviewing Editor has drafted this to help you prepare a revised submission.
All of the reviewers enjoyed this paper but thought that some revisions to the presentation of results and the discussion would improve the manuscript. New experiments are discussed in the individual reviews but are not required for this revision. However, the reviewers all felt that including alternate paradigms, hypotheses, and the experiments that do or could distinguish them are crucial adds to the text. Those and other essential elements are summarized here, with individual reviews included at the end:
Essential revisions:
1) It is important to distinguish this model from prior ones in flies, from ones in vertebrates, and from other potential models that could account for the data. This kind of hypothesis testing of model architectures seems like it would add a lot to the paper, especially if you could rule out classes of models and suggest multiple alternative models consistent with your data (and other data in the field). Please see R3's comments along these lines, especially.
2) Issues with the presentation of the results:
(2a) Presentation issues should be addressed to clarify experiments and what each is doing/testing. Reviewers found some of the figures hard to follow, which was surprising given what seemed like relatively straightforward modeling. Please see R2's comments along these lines, in particular.
(2b) All reviewers found the presentation of the nested versus parallel feedback architecture confusing, on different levels. Definitely clarify if dissecting this is an assertion from the outset (and if so, please modify that claim according to the detailed feedback from R3 and R3), or a hypothesis that is being tested. If the latter, please make it easier to read out the weight of the evidence supporting the nested feedback hypothesis, along the lines of R1's comments.
Reviewer #1 (Recommendations for the authors):
This paper aims to dissect the structure of feedback control in the stabilization of gaze when both the external world and the body and head are in motion. To achieve this, sensory-motor systems must integrate visual and self-motion cues, but the precise structure of that integration is not generally known in invertebrate systems. The authors focus on the fly as a model system, where previous work establishes a firm grounding for the results but gaps in knowledge of how canonical experimental manipulations, e.g. anchoring the body, affect motor responses still abound. Using an elegant experimental design where the same visual inputs are delivered during body-fixed and body-free tethered flight, the authors are able to quantify how gaze stabilization is impacted by the two forms of feedback. The work reveals that visual feedback shifts the scale of head movements when the external world moves at different frequencies, but that the self-motion cues from body rotations serve to dampen head movements and are nested within the visual control feedback loop. The nonlinearity in this nested control system is quantified convincingly in the paper.
Main strengths:
– The experimental design and analyses are well-motivated and executed.
– There are clear differences between the head movements and frequency responses to external visual perturbations in the head-fixed and head-free conditions.
– The proposed model accounts for the empirical data in the two scenarios nicely.
Main weaknesses:
– The strength of the evidence for the differentiation between the two feedback schemes was not clear, and Figures 4 and 5 were hard to follow without more information.
– It was not clear if the model proposed is unique as opposed to simply sufficient for explaining the empirical data.
The work will be of interest to motor and systems neuroscientists who study feedback control, across a broad range of species. Biomechanics researchers will benefit from the framework laid out here and this will inspire future work to uncover the possible mechanisms of this control. Beyond biology, engineers and robotics researchers will take interest in this kind of nested feedback control, for the design of bio-inspired robotic systems.
There is a strong assumption about the analytical form of the feedback gain control (G/(1+G)), and this needs a sentence at least of justification and background in the Results.
Figures 4 and 5 highlight the main results of the work, but it was hard to figure out the strength of the evidence for the nested control topology from the figures. It would greatly enhance the broader impact of the work if these figures were made more intuitive for the reader. Perhaps the figures could start by showing a cartoon of what the results should look like in the extreme case of each feedback scenario and weighting, to set expectations.
Are there other options for the control system that would produce different results in the body-fixed versus body-free flies? It seems like this isn't the only feedback control scheme possible, so a more careful discussion of why the one proposed might be the unique solution to the problem and match the data is crucial.
Something needs to be said in the Discussion about how this adds to what we already knew from the primate literature about nested VOR feedback within OKR feedback. Does this new work point to new mechanisms? In the OKR, there's been good work showing that similar feedback is achieved in primates and zebrafish, but with very different circuitry. Can similarly crisp claims be highlighted here?
Are there new experiments suggested by these results in other species that could broaden the impact of work in the future?
Reviewer #2 (Recommendations for the authors):
In this work, the authors present a model for mechanosensory feedback nested inside a visual feedback loop, both controlling body and head yaw rotations. Using a variety of experiments, they fit this model to behavioral data in the fruit fly, where head and body yaw rotations can be easily measured, and in some cases, feedback can be manipulated. They use this data to fit their model and draw conclusions about how different feedback loops interact to stabilize the gaze in the fly.
The strength of this paper is in its rigorous approach to modeling the feedback in the fly's interactions with the visual world. It manages to fit its model non-parametrically at several different ethologically relevant frequencies of feedback. The comparisons of behavior with and without mechanosensory feedback are illuminating, as is the comparison of voluntary with involuntary mechanical feedback. One weakness of the paper is in its presentation, which can be a little opaque for non-specialists in control theory.
This paper provides a methodology for dissecting how different feedback systems interact and combine to jointly control behavior. While the specific manipulations available in the fly are not universally available, the approach seems likely to be useful for investigating many systems.
Overall, this work looks well done and contributes valuably to understanding how head and body feedback systems work in tandem to stabilize gaze in flies. Most of my major comments relate to the presentation.
Major comments
1) In the introduction, it would help if the authors laid out a little more about what's known and not known, and what precisely this paper is adding to the literature. For instance, the authors state that it's already known that mechanosensory feedback represents nested feedback inside the visual feedback loop. So what's left is merely fitting the model to data? Or are there alternative models that could be tested and ruled out with this data? (If there are, I think the framework of testing alternatives could be powerfully convincing about how predictive this particular model is.) At the end of the introduction, I was left puzzled about what the authors were adding.
2) The stimulus pattern should be defined. Pictures show a square wave grating; is this accurate? Does it matter? What was the wavelength? It looks like a 30 d period or so from the illustrations, which would put maximum temporal frequencies of the moving pattern at ~250 d/s / (30 d) = 8 Hz, which is about right for maximally driving optomotor responses.
Questions:
a. The perturbation signal R is a displacement but is measured presumably as a velocity by the eyes, and the direction-selective signal from the eye is a nonlinear function of velocity. If the tuning of the velocity signal is different for guiding body vs. head movements, does that matter or does that fit easily into this theory? In the presented model, there's only one single visual feedback signal to both body and head.
b. In the fastest oscillating stimuli, the pattern only moves back and forth by 2 pixels or so, and I believe these LEDs have something like 8 brightness levels. Is the intended stimulus really accurately captured by this display?
3) The model section of the methods should be clearer about what the different signals and coefficients are. As I understand it, everything is complex, so the products represent both gain and phase shifts of sinusoids, represented as complex numbers. It would be helpful to define why R should be thought of as displacement rather than velocity, and whether H, B, and G represent angles or angular velocities. Head angle is relative to the body, so angle seems reasonable, but I'd expect body orientation signals to be angular velocities or even accelerations. This might all not matter since it's all in a linear framework, but I think this could nonetheless be made clearer to non-specialists by defining the variables and terminology more explicitly. In the text, there's a reference to a complex s, which I assume is part of the integrand for a Laplace transform, but this could be spelled out more clearly or not mentioned at all since Laplace transforms are otherwise avoided. Then these gain and phase shifts are computed for each frequency of the stimulus, and non-parametric curves are found for each complex coefficient.
4) There's at least one alternative way to break the feedback here, and I'm curious about why it wasn't used to test or fit models. Instead of breaking the mechanosensory feedback loop, one could leave it in place, and instead, place flies in a virtual open loop, so that there is no visual feedback from the behaviors. It might be hard to track the head in real time to do this, but I'm interested to know if there are tests of the theory that could result from this sort of perturbation to the system. Along the same lines, gluing the head to the thorax would remove one source of gaze feedback and could be used to test the model for body movements. Are these interesting tests to do? (I'm not necessarily asking for these experiments.)
Reviewer #3 (Recommendations for the authors):
The goal of this paper is to use the fruit fly Drosophila melanogaster to assess the relative contributions of vision and mechanosensory feedback in controlling head motion about the vertical, or yaw, axis. The authors perform a set of behavioral experiments comparing flies that are free to rotate in the yaw plane with rigidly tethered flies, using a control theoretic framework to make quantitative predictions about the importance of each sensory modality. They propose a model where mechanosensory feedback is nonlinearly integrated with visual feedback to control head steering, but only in the presence of whole-body rotations.
Overall, I find the paper well-written and the data very nicely presented. I appreciate the authors' formal use of control theory to make algebraic predictions about how the flies should respond to each perturbation and think this work adds a great deal to understanding the differences between free and tethered flight. I also like the conceptual approach of comparing parallel and nested sensory fusion problems in locomotion. That being said, I do have some major concerns about the approach that need to be seriously addressed.
Control model and "eliminating" haltere feedback
This paper compares gaze stabilization in flies that can freely rotate about the yaw axis with those that are rigidly tethered. Crucially, in figure 2A, haltere feedback is presented as being a nested feedback loop that is only the result of the animal's body mechanics. In addition, the legend for 2C states, "Note that contributions of body visual and mechanosensory feedback are no longer present and all nested feedback is gone." In light of recent work, specifically Dickerson et al. 2019, I do not think the authors' view on either matter is correct. As that paper shows, the haltere is providing constant input to the wing steering system-even in the absence of body rotations (It is also worth noting that Fayazzuddin and Dickinson 1999 proposed a model of wing steering muscle function where the wing and haltere provide constant, rhythmic input). Those experiments relied on imaging from the haltere axon terminals in the brain that likely synapse onto neck motor neurons that help control gaze (Strausfeld and Seyan 1989). Moreover, that feedback is partially under visual control; the haltere steering muscles change the trajectory of the haltere in the presence of visual input alone, modulating the feedback it provides to the wing steering system. I am not sure if that makes the haltere system parallel or nested with the visual system, but it certainly means that haltere feedback is not solely due to body mechanics. More importantly, this knowledge of physiology means that in a rigidly tethered fly, the authors cannot fully eliminate haltere input. This has tremendous implications for their modeling efforts, as they can never fully bring Ghead,M to zero. This may explain why, in Figure 4, body visual feedback alone cannot account for changes in head gain. It also means that a diagram like Figure 5B is essentially not possible in an intact fly, as the haltere signal is ever-present.
Proposed neural architecture
The authors propose a model of head stabilization in which the visual system sends motor commands to the neck in parallel with a gating command to the haltere that is only present during body motion. To me, this is essentially the "control-loop" hypothesis, proposed by Chan et al. 1998 and confirmed by Dickerson et al. 2019. In that model, the halteres provide continuous, wingbeat-synchronous feedback during flight. As the fly takes visual input, the haltere steering muscle motor neurons receive commands relayed by the visual system, altering the haltere's motion. This, in turn, recruits more campaniform sensilla for each wing stroke, which fire at different preferred phases from those providing the initial rhythm signal. Then, due to the haltere's direct, excitatory connection with the wing steering muscles, this changes the timing or recruitment of the wing steering system, changing aerodynamic forces and the fly's trajectory. This suggests that the haltere's gyroscopic sensing is an epiphenomenon that coopts its likely ancestral role in regulating the timing of the wing steering system, rather than the other way around. Again, whether this means that the visual → haltere connection is parallel or nested within the visual loop proposed by the authors, I am not certain, though I lean toward the former. Additionally, it is crucial to note that the haltere has collateral projections to the neck motor centers. Thus, as the visual system manipulates haltere kinematics and mechanosensory feedback, the haltere is controlling head motion in a reciprocal fashion, even when there are no imposed body motions. Even the nonlinear gating of neck motor neurons the authors note here is not entirely in keeping with the model proposed by Huston and Krapp 2009. There, the presence of haltere beating or visual stimulus alone was not enough to cause the neck MNs to fire. However, simultaneous haltere beating and visual stimulus did, implying that the fly need only be in flight (or walking, in the case of Calliphora) for the halteres to help control head motion; Coriolis forces due to body rotations imposed or otherwise, need not be present. The only difference I can see between what the authors propose and the control-loop hypothesis is that they focus on the head (which, again, is covered by the revised model of Dickerson et al.) and that the nonlinear damping gate requires body motion (which is inconsistent with the findings of Huston and Krapp).
I think the most critical change is rethinking the control model of visual and mechanosensory feedback in light of our understanding of the haltere motor system. As noted earlier, the experiments with rigidly tethered flies do not fully eliminate haltere feedback, which greatly impacts the math used to make predictions about how the animals respond to various perturbations. I recognize this requires a severe overhaul of the manuscript, but my concern is that by considering the haltere as merely a passive gyroscopic sensor leaves out a number of potential explanations for the data in Figures 4 and 5. Additionally, the authors need to think hard about whether the haltere is controlled in parallel or nested with the visual system, given that they have a reciprocal relationship even in the case of a rigidly tethered fly.
I was rather surprised in the section about active damping of head saccades that there was almost no mention of the recent work by Kim et al. 2017 showing that head motion during saccades seems to follow a feedforward motor program (or Strausfeld and Seyan's 1988 (?) work detailing how vision and haltere info combine to help control head motion). Furthermore, the head velocities for body-free and rigidly tethered flies seem similar, which points to it being a feedforward motor program, a la Kim et al. If you subtract body displacement from the free-rotating head motion, do you get a similar result? That would hint that head isn't overcompensating during body-fixed experiments and is driven more reflexively, as proposed in the discussion. I would also recommend looking at Bartussek and Lehmann 2017 for the impact of haltere mechanosensory input on 'visuomotor' gain, or the work from the Fox lab.
Finally, the authors either need to detail how their model is distinct from the control-loop hypothesis or back off their claim of novelty and show that their work lends further evidence to that model. I would also prefer if the figure panel for the model is either more anatomically accurate or stuck with the block diagram framing of information flow.
https://doi.org/10.7554/eLife.80880.sa1
Author response
Reviewer #1 (Recommendations for the authors):
[…]
There is a strong assumption about the analytical form of the feedback gain control (G/(1+G)), and this needs a sentence at least of justification and background in the Results.
Because the system we consider in this manuscript has nested feedback, the G/(1+G) form does not fully describe our model structure (as this only applies to the traditional one-sensor control system block diagram). If one were to consider G as the transform from sensory error to head motion (which contains the nested feedback loop in our framework), then we can use this expression to describe our model. In this case, the assumption is about the visual feedback, which we assume has a gain of -1. We believe this is a valid assumption because optic flow is inversely proportional to the motion of the eyes relative to the world. We describe in more detail why we believe this model structure is appropriate below. We have also added a line clarifying this idea when we introduce our model.
Figures 4 and 5 highlight the main results of the work, but it was hard to figure out the strength of the evidence for the nested control topology from the figures. It would greatly enhance the broader impact of the work if these figures were made more intuitive for the reader. Perhaps the figures could start by showing a cartoon of what the results should look like in the extreme case of each feedback scenario and weighting, to set expectations.
We agree that Figure 4–5 could benefit from being made more intuitive and have made multiple changes to make the presentation clearer.
For Figure 4, the baseline feedback case where there is only head visual feedback (purple curve) is what we would expect the data to look like in every experiment/prediction if body visual and mechanosensory feedback had no effect. Thus, we now show all the head data in Figure 4 with respect to this curve, which allows for more explicit comparisons. We have also updated the legend and now use cartoons to illustrate the effect of the different types of feedback.
For Figure 5D, we have added the baseline prediction for the ratio of the head responses if body mechanosensory feedback had no effect (gain = 1, phase = 0), and explicitly indicated that values <1 mean that head motion is damped by mechanosensory feedback.
Are there other options for the control system that would produce different results in the body-fixed versus body-free flies? It seems like this isn't the only feedback control scheme possible, so a more careful discussion of why the one proposed might be the unique solution to the problem and match the data is crucial.
In short, yes. A number of models with increasingly complex structures could be fit to our empirical data. However, we assert a parsimonious model structure that is well suited for mathematically teasing apart the roles of the various sensory modalities during fly gaze stabilization. We believe that there is strong evidence supporting our proposed model structure.
The match between our prediction of the effects of body visual feedback on the head response and the experimental replay data (see Figure 4C, tan vs gray) is the most compelling evidence supporting our proposed model of visual feedback, as well as implying linearity. In a way, this is expected because the gain of the visual feedback loop (-1) is based on physics, i.e., moving one way elicits optic flow in an equal and opposite direction. The same visual feedback structure has been applied to model visuomotor tasks in other animals such as fish and moths, and has been shown to be similarly linear (Roth et al., 2011; Sponberg et al., 2015). To our knowledge, no other models have been proposed for visual feedback in flies or other animals that would provide a more parsimonious explanation for the data.
In comparison to the structure of visual feedback, how flies integrate mechanosensory feedback during self-motion is slightly less clear. However, the nested structure of mechanosensory feedback is an inherent property of visually elicited behaviors—meaning that a visual stimulus elicits movement (due to the gaze stabilization reflex), which only then activates mechanosensory feedback; thus mechanosensory feedback is nested in this context. For clarity, the nested feedback architecture is not a hypothesis, but an assertion, when locomotion is reflexively driven by visual inputs. Our model is broadly consistent with previous work which suggested an inner-loop (nested) structure for haltere feedback (Elzinga et al., 2012). The model structure is also consistent with the role of antennal feedback in damping groundspeed control in flies (Fuller et al., 2014).
How mechanosensory information is combined with visual information in the brain is not fully resolved. Seminal work showed that there is feedback from the halteres to the flight control muscles, even in the absence of visual inputs (Dickinson, 1999; Fayyazuddin and Dickinson, 1996). Furthermore, visual and haltere inputs due to body motion sum when presented together in rigidly tethered flies, although wing responses (like the head responses we measured) were somewhat uncoordinated about the yaw axis (Sherman and Dickinson, 2004, 2003). A limitation of these prior studies is that they considered haltere inputs that were externally generated and in open-loop (flies were rigidly tethered to a motor), so it is difficult to say whether the same topology applies to gyroscopic haltere inputs due to self-motion, i.e., nested feedback. However, these data from prior studies do support the summing of visual and mechanosensory inputs at the neural level, which we maintain in our model structure (as outlined in Figure 2A). A major contribution of our work is the synthesis of a parsimonious linear model that can capture the empirical data.
For transparency, the parallel control topology we show in Figure 1A is not a permissible model structure for our study because it explicitly assumes that both sensory systems (i.e., visual and mechanosensory) receive the same external reference input. In the general form of this model (based on Figure 1A), even if one of the sensory systems is abolished (for example, vision in flies), there will be a nonzero response to the reference input. This parallel model is excellent for examining a fly's response to coupled visual and haltere inputs (e.g., due to a gust of wind), but it is not appropriate for our analysis because the fly's motion is reflexively driven by visual inputs. We present the parallel model to distinguish the nested model structure from prior models, e.g., (Roth et al., 2016).
In the experimental paradigm we employed, there is no external reference for the halteres, thus any feedback must be nested (meaning that there will not be any haltere feedback due to body motion unless visual motion first elicits body motion). In this nested model (based on Figure 2B), abolishing the visual transform leads to no response, no matter what the input is. We now explicitly state from the outset that the nested feedback architecture we employ is an assertion, not a hypothesis.
Something needs to be said in the Discussion about how this adds to what we already knew from the primate literature about nested VOR feedback within OKR feedback. Does this new work point to new mechanisms? In the OKR, there's been good work showing that similar feedback is achieved in primates and zebrafish, but with very different circuitry. Can similarly crisp claims be highlighted here?
To our knowledge, the primate literature on the VOR/OKR by and large considers these visual and mechanosensory feedback loops as summing in a parallel topology (due to the experimental design)—as opposed to mechanosensory signals being nested within visual feedback (reviewed in Goldberg et al., 2012). Further, to our knowledge, only a select few studies in primates have considered nested mechanosensory feedback (Schweigart et al., 1997), but they have not attempted to unravel the contributions of the different feedback modalities in shaping the control of the head/eyes in the way that we have here. We now briefly discuss these ideas in the introduction and have added a few sentences in the discussion outlining the new contributions of our work.
Are there new experiments suggested by these results in other species that could broaden the impact of work in the future?
Yes, there are absolutely some interesting experiments focused on uncovering the role of nested mechanosensory/proprioceptive feedback that could be carried out in other species. Indeed, most animals have proprioceptive sensory systems that likely support higher level behaviors. In insects without halteres, such as moths, the antennae are thought to fulfill this role (Sane et al., 2007). We have added a Discussion section titled “Nested proprioception across phyla” considering these ideas.
Reviewer #2 (Recommendations for the authors):
[…]
Overall, this work looks well done and contributes valuably to understanding how head and body feedback systems work in tandem to stabilize gaze in flies. Most of my major comments relate to the presentation.
Major comments
1) In the introduction, it would help if the authors laid out a little more about what's known and not known, and what precisely this paper is adding to the literature. For instance, the authors state that it's already known that mechanosensory feedback represents nested feedback inside the visual feedback loop. So what's left is merely fitting the model to data? Or are there alternative models that could be tested and ruled out with this data? (If there are, I think the framework of testing alternatives could be powerfully convincing about how predictive this particular model is.) At the end of the introduction, I was left puzzled about what the authors were adding.
While it is generally accepted that mechanosensory feedback due to self-motion (i.e., nested feedback) is present during visually driven tasks, the structure of this feedback and how it interacts with visual feedback during gaze stabilization in flight is presently unclear. Our work is the first (to our knowledge) to propose a neuromechanical model for the integration of visual and nested mechanosensory feedback based on empirical data and to quantify the effects of nested feedback on gaze stabilization. We agree that the introduction could benefit from clarification and have added a few sentences in the text discussing the novelty of our work.
2) The stimulus pattern should be defined. Pictures show a square wave grating; is this accurate? Does it matter? What was the wavelength? It looks like a 30 d period or so from the illustrations, which would put maximum temporal frequencies of the moving pattern at ~250 d/s / (30 d) = 8 Hz, which is about right for maximally driving optomotor responses.
Yes, we displayed a square wave pattern with a spatial wavelength of 30°, which is accurately illustrated in Figure 2. This is specified in the methods section titled “Visual perturbations”. We now clarify in this section that this spatial wavelength with our prescribed visual motion yields mean temporal frequencies of ~5 Hz (with a max of ~8 Hz), which is right around the optimum of the motion vision pathway in Drosophila (Duistermars et al., 2007a).
Questions:
a. The perturbation signal R is a displacement but is measured presumably as a velocity by the eyes, and the direction-selective signal from the eye is a nonlinear function of velocity. If the tuning of the velocity signal is different for guiding body vs. head movements, does that matter or does that fit easily into this theory? In the presented model, there's only one single visual feedback signal to both body and head.
The tuning of the velocity signal is in fact different for guiding the head and body, but this fits quite nicely into our theory. The tuning of the head and body with respect to visual motion is accounted for in their corresponding neural controllers shown in Figure 2A. Effectively, these controllers represent how the brain processes visual motion differently for the head and body. We have shown in our previous work (Cellini et al., 2022) that one controller is tuned closely to velocity while the other is tuned more strongly to acceleration. This difference in tuning explains why the body has a low-pass filter response and the head is more like a high-pass filter, as shown in Figure 2A,C and Figure 3A-B, corroborating our recent work (Cellini et al., 2022).
A nice property of our model and non-parametric approach is that it does not matter what the baseline tuning of the head and body is (velocity sensitive, acceleration sensitive, etc.). We isolate and measure the visual-only head and body responses (which contain the corresponding neural controllers) by removing mechanosensory feedback in body-fixed flies, and then use the response of body-free flies to work backwards and compute the effects of mechanosensory feedback. This way, our model can account for the differences in tuning between body-free and body-fixed flies, rather than focusing on the tuning itself. We have added a sentence in the text clarifying that the neural controllers account for differences in tuning between the head and body.
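As a minimal sketch of what we mean by a non-parametric estimate (illustrative only; the sampling rate, stimulus frequencies, and toy first-order dynamics below are placeholders rather than our actual parameters or analysis code), the complex ratio of output to input can be computed independently at each stimulus frequency, with no assumption about the underlying tuning:

```python
import numpy as np

def empirical_frf(stimulus, response, fs, freqs):
    """Gain and phase of response relative to stimulus at the given frequencies (Hz).

    Non-parametric: the complex ratio of Fourier coefficients is computed
    independently at each frequency, without assuming any model of the tuning.
    """
    f_axis = np.fft.rfftfreq(len(stimulus), d=1.0 / fs)
    S, Y = np.fft.rfft(stimulus), np.fft.rfft(response)
    ratios = np.array([Y[np.argmin(np.abs(f_axis - f))] /
                       S[np.argmin(np.abs(f_axis - f))] for f in freqs])
    return np.abs(ratios), np.angle(ratios, deg=True)

# Toy example: a first-order lag stands in for the animal (placeholder dynamics)
fs, T = 500.0, 20.0
t = np.arange(0.0, T, 1.0 / fs)
freqs = [0.5, 1.0, 2.0, 4.0, 8.0]                      # placeholder sum-of-sines frequencies
stim = sum(np.sin(2 * np.pi * f * t) for f in freqs)   # sum-of-sines stimulus
tau, resp = 0.05, np.zeros_like(stim)
for i in range(1, len(t)):                             # forward-Euler simulation of dx/dt = (u - x)/tau
    resp[i] = resp[i - 1] + (stim[i - 1] - resp[i - 1]) / tau / fs

gain, phase = empirical_frf(stim, resp, fs, freqs)
print(np.round(gain, 2), np.round(phase, 1))
```

Because the ratio is formed per frequency, the same procedure applies whether the underlying controller is velocity-tuned, acceleration-tuned, or anything in between.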
b. In the fastest oscillating stimuli, the pattern only moves back and forth by 2 pixels or so, and I believe these LEDs have something like 8 brightness levels. Is the intended stimulus really accurately captured by this display?
The fastest oscillating stimulus does indeed only move between three pixels (one at the 0 position, and then 3.75° in each direction). This results in something similar to a square wave being displayed on our flight simulator (see Author response image 1, left). However, when transformed into the frequency domain, the resulting signal is very similar to the idealized one (see Author response image 1, right). We account for any discrepancies in our analysis by using the actual displayed signal as the input to our model. This ensures that we are modeling a fly's response to the frequency content of what it actually sees. We would not expect a fly's response to differ much between the idealized and actual signals anyway, because a fly's visual system low-pass filters high-frequency components of visual motion, which would make the square wave appear smoother than it actually is (e.g., see Duistermars et al., 2007b for a confirmation experiment). We now clarify in the text that we record the voltage signal from our visual display, which measures the displacement of our stimulus, and use that signal as the input to our model.

Author response image 1. Left: the idealized smooth sine wave at the highest frequency designed for our flight simulator (black) vs. the actual displayed signal (red); note that our flight simulator display has an angular resolution of 3.75°. Right: same as the left, but showing the fast Fourier transform (FFT) magnitude of the two signals.
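For readers who wish to reproduce the comparison in Author response image 1, a minimal sketch is given below (our toy reconstruction: the 3.75° display resolution is taken from the text above, while the oscillation frequency, amplitude, sampling rate, and duration are placeholders):

```python
import numpy as np

RES = 3.75        # display angular resolution in degrees (from the text)
amp = 3.75        # placeholder amplitude: one pixel in each direction about the 0 position
f_stim = 10.0     # placeholder frequency of the fastest oscillating component (Hz)
fs = 1000.0       # sampling rate for this illustration (Hz)

t = np.arange(0.0, 2.0, 1.0 / fs)
ideal = amp * np.sin(2 * np.pi * f_stim * t)     # commanded smooth sine wave
displayed = RES * np.round(ideal / RES)          # what a 3.75-degree display can actually show

# Compare the frequency content of the two signals
freqs = np.fft.rfftfreq(len(t), d=1.0 / fs)
mag_ideal = np.abs(np.fft.rfft(ideal)) / len(t)
mag_displayed = np.abs(np.fft.rfft(displayed)) / len(t)

k = np.argmin(np.abs(freqs - f_stim))
print("FFT magnitude at the stimulus frequency: ideal %.3f, displayed %.3f"
      % (mag_ideal[k], mag_displayed[k]))
# The displayed (quantized) signal keeps most of its energy at the stimulus
# frequency; quantization mainly adds higher-frequency (odd-harmonic) content.
```

The pixelated signal preserves the fundamental component, which is why the FFT magnitudes in Author response image 1 are so similar at the stimulus frequency.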
3) The model section of the methods should be clearer about what the different signals and coefficients are. As I understand it, everything is complex, so the products represent both gain and phase shifts of sinusoids, represented as complex numbers. It would be helpful to define why R should be thought of as displacement rather than velocity, and whether H, B, and G represent angles or angular velocities. Head angle is relative to the body, so angle seems reasonable, but I'd expect body orientation signals to be angular velocities or even accelerations. This might all not matter since it's all in a linear framework, but I think this could nonetheless be made clearer to non-specialists by defining the variables and terminology more explicitly. In the text, there's a reference to a complex s, which I assume is part of the integrand for a Laplace transform, but this could be spelled out more clearly or not mentioned at all since Laplace transforms are otherwise avoided. Then these gain and phase shifts are computed for each frequency of the stimulus, and non-parametric curves are found for each complex coefficient.
We fully acknowledge that flies primarily measure the velocity of wide-field visual perturbations with their visual system, as this has been shown extensively in prior work. Therefore, the gain and phase between the body and visual perturbation can be most precisely thought of as a ratio of velocities in the frequency domain. However, a nice mathematical property of linear frequency-domain system identification is that this ratio is equivalent for displacements, velocities, accelerations, etc. For example, consider the case where the body $B(s)$ and perturbation $R(s)$ are defined as displacements. Multiplying each by the complex variable $s$ is equivalent to taking their derivatives in the frequency domain (i.e., converting them to velocities), so the ratio is then $\frac{sB(s)}{sR(s)} = \frac{B(s)}{R(s)}$. The $s$ term cancels out, so our results would not be affected by converting these signals to velocities, or simply keeping them as displacements. However, computing velocity from displacement involves taking a numerical derivative, which amplifies noise (Van Breugel et al., 2020), thus we prefer to make all calculations using displacements. To make these points clearer, we have clarified in the text the definition of the complex variable $s$ and explained our reasoning for making our calculations using displacements instead of velocities.
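A small numerical sanity check of this cancellation (a toy sketch with a placeholder frequency, gain, and phase, not our analysis code) shows that the complex ratio is identical whether computed from displacements or from their derivatives:

```python
import numpy as np

fs = 500.0
t = np.arange(0.0, 10.0, 1.0 / fs)
f0 = 2.0                                          # placeholder test frequency (Hz)

r_disp = np.sin(2 * np.pi * f0 * t)               # perturbation as a displacement
b_disp = 0.6 * np.sin(2 * np.pi * f0 * t - 0.8)   # toy body response: gain 0.6, phase lag 0.8 rad

# The corresponding velocities (analytic derivatives of the same sinusoids)
r_vel = 2 * np.pi * f0 * np.cos(2 * np.pi * f0 * t)
b_vel = 0.6 * 2 * np.pi * f0 * np.cos(2 * np.pi * f0 * t - 0.8)

def ratio_at(f, x, y):
    """Complex ratio Y(f)/X(f) from the FFTs of time series x and y."""
    freqs = np.fft.rfftfreq(len(x), d=1.0 / fs)
    k = np.argmin(np.abs(freqs - f))
    return np.fft.rfft(y)[k] / np.fft.rfft(x)[k]

print(ratio_at(f0, r_disp, b_disp))   # gain and phase from displacements
print(ratio_at(f0, r_vel, b_vel))     # same gain and phase from velocities (the s cancels)
```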
4) There's at least one alternative way to break the feedback here, and I'm curious about why it wasn't used to test or fit models. Instead of breaking the mechanosensory feedback loop, one could leave it in place, and instead, place flies in a virtual open loop, so that there is no visual feedback from the behaviors. It might be hard to track the head in real time to do this, but I'm interested to know if there are tests of the theory that could result from this sort of perturbation to the system. Along the same lines, gluing the head to the thorax would remove one source of gaze feedback and could be used to test the model for body movements. Are these interesting tests to do? (I'm not necessarily asking for these experiments.)
We find this approach and the corresponding experiments tremendously intriguing, as they involve additional manipulations of the control topology that could provide insight into the inner workings of the feedback system. One could potentially use a virtual/augmented reality system to abolish visual feedback from the head and/or body in real time, which would correspond to changing the model from Equation 3 (now Equation 5) in the text to any of the three following forms:
1) Removed body visual feedback:
2) Removed head visual feedback:
3) Removed head and body visual feedback:
These expressions make predictions for the corresponding experiments in a virtual reality system and could provide further validation of our model. Our group has recently developed a system to achieve real-time virtual reality for the body in the magnetic tether, although not yet for the head, as this is substantially more challenging to do in real time than offline because the neck joint must be located in each frame of a rotating body. Thus, only the feedback case described by the equation in (1) is possible for us to achieve experimentally at present. Case (1) corresponds to Equation 6 (now Equation 8) in the text and to the prediction we made about the head response when body visual feedback is removed but mechanosensory feedback is still present (Figure 4D, cyan). Based on this prediction, we would expect the head to operate with lower gain than when the body is fixed, but higher gain than when the body is free with natural body visual feedback. These experiments are part of another manuscript in preparation and are beyond the scope of this study; we now briefly discuss in the Discussion how these feedback manipulations fit into our framework.
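To make the style of loop algebra concrete for readers, the sketch below applies SymPy to a generic nested feedback loop; the blocks C_v (visual controller), C_m (nested mechanosensory feedback), and P (plant) are placeholders, and this is deliberately not the topology or notation of Equations 5-8:

```python
import sympy as sp

# Placeholder blocks: Cv = visual controller, Cm = nested mechanosensory feedback,
# P = musculoskeletal plant. Generic illustration only, not the paper's model.
Cv, Cm, P, R, H = sp.symbols('C_v C_m P R H')

# Loop equation for a generic nested topology: the plant is driven by the
# visual command Cv*(R - H) minus nested mechanosensory feedback Cm*H.
intact = sp.Eq(H, P * (Cv * (R - H) - Cm * H))
print("intact loops:       H/R =", sp.simplify(sp.solve(intact, H)[0] / R))

# "Opening" a loop amounts to deleting its term and re-solving:
no_mech = sp.Eq(H, P * Cv * (R - H))         # nested feedback removed (body-fixed analogue)
print("no nested feedback: H/R =", sp.simplify(sp.solve(no_mech, H)[0] / R))

no_vis = sp.Eq(H, P * (Cv * R - Cm * H))     # visual feedback of H removed (virtual open-loop analogue)
print("no visual feedback: H/R =", sp.simplify(sp.solve(no_vis, H)[0] / R))
```

Each re-solved expression is a testable closed-loop prediction, which is the same logic used to derive cases (1)-(3) above.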
Regarding the second point about fixing the head and how this might affect the control of the body, we have previously published data on these experiments (for the sum-of-sines visual inputs). The data are presented in our previous paper (Cellini et al., 2022), but not integrated with the framework we introduce here. Our previous analysis showed that fixing the head has a modest, but significant, effect on the body gain and phase at high frequencies (where the head is typically the most active), which can be predicted by our control model. While these results support our proposed model and are further evidence for linearity, we believe that including these data/modeling is a bit beyond the scope of the current manuscript and does not directly address the role of nested mechanosensory feedback. Therefore, we prefer to leave this out of the manuscript.
Reviewer #3 (Recommendations for the authors):
The goal of this paper is to use the fruit fly Drosophila melanogaster to assess the relative contributions of vision and mechanosensory feedback in controlling head motion about the vertical, or yaw, axis. The authors perform a set of behavioral experiments comparing flies that are free to rotate in the yaw plane with rigidly tethered flies, using a control theoretic framework to make quantitative predictions about the importance of each sensory modality. They propose a model where mechanosensory feedback is nonlinearly integrated with visual feedback to control head steering, but only in the presence of whole-body rotations.
Overall, I find the paper well-written and the data very nicely presented. I appreciate the authors' formal use of control theory to make algebraic predictions about how the flies should respond to each perturbation and think this work adds a great deal to understanding the differences between free and tethered flight. I also like the conceptual approach of comparing parallel and nested sensory fusion problems in locomotion. That being said, I do have some major concerns about the approach that need to be seriously addressed.
Control model and "eliminating" haltere feedback
This paper compares gaze stabilization in flies that can freely rotate about the yaw axis with those that are rigidly tethered. Crucially, in figure 2A, haltere feedback is presented as being a nested feedback loop that is only the result of the animal's body mechanics. In addition, the legend for 2C states, "Note that contributions of body visual and mechanosensory feedback are no longer present and all nested feedback is gone." In light of recent work, specifically Dickerson et al. 2019, I do not think the authors' view on either matter is correct. As that paper shows, the haltere is providing constant input to the wing steering system, even in the absence of body rotations (it is also worth noting that Fayyazuddin and Dickinson 1999 proposed a model of wing steering muscle function where the wing and haltere provide constant, rhythmic input). Those experiments relied on imaging from the haltere axon terminals in the brain that likely synapse onto neck motor neurons that help control gaze (Strausfeld and Seyan 1989). Moreover, that feedback is partially under visual control; the haltere steering muscles change the trajectory of the haltere in the presence of visual input alone, modulating the feedback it provides to the wing steering system. I am not sure if that makes the haltere system parallel or nested with the visual system, but it certainly means that haltere feedback is not solely due to body mechanics. More importantly, this knowledge of physiology means that in a rigidly tethered fly, the authors cannot fully eliminate haltere input. This has tremendous implications for their modeling efforts, as they can never fully bring Ghead,M to zero. This may explain why, in Figure 4, body visual feedback alone cannot account for changes in head gain. It also means that a diagram like Figure 5B is essentially not possible in an intact fly, as the haltere signal is ever-present.
We thank the reviewer for the detailed insights into the neurobiology of the haltere-wing system. We agree that our phrasing and terminology regarding “eliminating haltere feedback” could be misleading and needs to be revised.
As the reviewer points out, the halteres have been implicated in two primary functions:
1) as a gyroscope to sense body motion and
2) as a ‘metronome’ that regulates and structures the timing of motor outputs via tonic input (wings, and likely the head as well).
In our work, we have focused exclusively on function (1), as the emphasis of this manuscript is on how body motion influences head movements (from both vision and proprioception). Therefore, when we state that we have eliminated haltere (or mechanosensory) feedback, we are referring to eliminating function (1) of the halteres, i.e. inputs due to body motion.
As the reviewer accurately states, we have not eliminated all the functions of the halteres by fixing the body. This could only be achieved by ablating the halteres themselves (or potentially with genetic silencing), which would eliminate both functions (1) and (2). While bilaterally removing the halteres is an option in body-fixed flies and has been shown to have a modest effect on head movements (Mureli et al., 2017), this is unfortunately not possible in body-free flies (free or magnetically tethered) because flies immediately become unstable. This means that, even if we were to ablate the halteres in body-fixed flies, it would be difficult to infer whether any differences in head responses we observed between body-free and body-fixed flies were due to function (1) or (2). Keeping the halteres intact ensures that function (2) is active in both conditions and that any differences we observe between body-free and body-fixed flies are due to function (1).
Fortunately, even though we cannot remove function (2) of the halteres, we believe our model is still appropriate for understanding the role of function (1) without major alterations to the framework. As the reviewer states, visual motion drives steering commands of the head/wings in parallel with commands to the halteres themselves, regardless of whether body motion occurs. These haltere commands structure the timing of motor output in conjunction with visual inputs. Therefore, we think of function (2) more as a sub-component of the visual controller that mediates optomotor steering than as part of the mechanosensory controller that senses body motion (although haltere movements certainly affect how gyroscopic information is measured). Thus, we can lump function (2) in with the visual controller, rather than the mechanosensory controller, in order to investigate the role of function (1) of the halteres, because function (1) is the only thing that changes between conditions. For clarity, when we fix the body to isolate and measure the visual response of the head, function (2) is still present and is thus contained within the dynamics of that response. This same visual response is active in body-free flies; thus, we can be sure that any differences we see between body-free and body-fixed flies are not due to function (2). While we acknowledge that this means we have not truly isolated the pure visual controller (because function (2) is always present), this does not affect any of our conclusions about function (1) of the halteres (i.e., gyroscopic inputs due to body motion).
To reconcile our model with prior work on function (2) of the halteres, we have added a section at the beginning of the results discussing the dual-function role of the halteres and changed our language throughout to make clear that fixing the body only removes function (1). Specifically, we now write "…eliminated haltere/mechanosensory feedback due to body motion…" to distinguish it from haltere feedback due to function (2).
Proposed neural architecture
The authors propose a model of head stabilization in which the visual system sends motor commands to the neck in parallel with a gating command to the haltere that is only present during body motion. To me, this is essentially the "control-loop" hypothesis, proposed by Chan et al. 1998 and confirmed by Dickerson et al. 2019. In that model, the halteres provide continuous, wingbeat-synchronous feedback during flight. As the fly takes visual input, the haltere steering muscle motor neurons receive commands relayed by the visual system, altering the haltere's motion. This, in turn, recruits more campaniform sensilla for each wing stroke, which fire at different preferred phases from those providing the initial rhythm signal. Then, due to the haltere's direct, excitatory connection with the wing steering muscles, this changes the timing or recruitment of the wing steering system, changing aerodynamic forces and the fly's trajectory. This suggests that the haltere's gyroscopic sensing is an epiphenomenon that coopts its likely ancestral role in regulating the timing of the wing steering system, rather than the other way around. Again, whether this means that the visual → haltere connection is parallel or nested within the visual loop proposed by the authors, I am not certain, though I lean toward the former. Additionally, it is crucial to note that the haltere has collateral projections to the neck motor centers. Thus, as the visual system manipulates haltere kinematics and mechanosensory feedback, the haltere is controlling head motion in a reciprocal fashion, even when there are no imposed body motions. Even the nonlinear gating of neck motor neurons the authors note here is not entirely in keeping with the model proposed by Huston and Krapp 2009. There, the presence of haltere beating or visual stimulus alone was not enough to cause the neck MNs to fire. However, simultaneous haltere beating and visual stimulus did, implying that the fly need only be in flight (or walking, in the case of Calliphora) for the halteres to help control head motion; Coriolis forces due to body rotations imposed or otherwise, need not be present. The only difference I can see between what the authors propose and the control-loop hypothesis is that they focus on the head (which, again, is covered by the revised model of Dickerson et al.) and that the nonlinear damping gate requires body motion (which is inconsistent with the findings of Huston and Krapp).
We further thank the reviewer for these insights. It is our understanding that the "control-loop" model proposed by Chan et al., 1998 primarily refers to the halteres' role in structuring motor output. This model is well supported by Dickerson et al., 2019, but does not yet reveal how gyroscopic inputs due to body motion might be modulated by haltere muscles (as all experiments were performed on body-fixed flies and the specific campaniform arrays involved in sensing gyroscopic forces remain ambiguous). However, the discussion in Dickerson et al., 2019 sets up some nice hypotheses for how haltere muscles might act to recruit haltere campaniforms of different types: some that are insensitive to gyroscopic forces, and some that are sensitive to them. We believe our work builds on these hypotheses and provides preliminary evidence that gyroscope-sensitive inputs might be actively modulated by visual inputs, as flies' head responses were strikingly different when they were controlling their own body motion vs. when body motion was externally imposed. While our data do not provide evidence at the level of neural circuits, we believe our work stands nicely alongside Dickerson et al., 2019 and Chan et al., 1998 and provides additional hypotheses to be tested. We now include additional discussion in the text (Discussion section: Distinguishing between self-generated and externally generated body motion) on these ideas and reconcile our data with prior models in the literature.
I think the most critical change is rethinking the control model of visual and mechanosensory feedback in light of our understanding of the haltere motor system. As noted earlier, the experiments with rigidly tethered flies do not fully eliminate haltere feedback, which greatly impacts the math used to make predictions about how the animals respond to various perturbations. I recognize this requires a severe overhaul of the manuscript, but my concern is that considering the haltere as merely a passive gyroscopic sensor leaves out a number of potential explanations for the data in Figures 4 and 5. Additionally, the authors need to think hard about whether the haltere is controlled in parallel or nested with the visual system, given that they have a reciprocal relationship even in the case of a rigidly tethered fly.
See our response above for more detail. From our understanding of the metronome function of the halteres, i.e., providing tonic input, this internal feedback loop should be equally present in both body-free and body-fixed flies, and therefore does not offer a clear justification for why we see damping of head movements in body-free, but not body-fixed, flies. We have clarified in the introduction and at the beginning of the results that, while the halteres serve to structure motor output independently of body motion, any differences we observe in body-free flies strongly suggest that haltere inputs due to body motion (gyroscopic forces) are the underlying cause.
I was rather surprised in the section about active damping of head saccades that there was almost no mention of the recent work by Kim et al. 2017 showing that head motion during saccades seems to follow a feedforward motor program (or Strausfeld and Seyan's 1988 (?) work detailing how vision and haltere info combine to help control head motion). Furthermore, the head velocities for body-free and rigidly tethered flies seem similar, which points to it being a feedforward motor program, a la Kim et al. If you subtract body displacement from the free-rotating head motion, do you get a similar result? That would hint that the head isn't overcompensating during body-fixed experiments and is driven more reflexively, as proposed in the discussion. I would also recommend looking at Bartussek and Lehmann 2017 for the impact of haltere mechanosensory input on 'visuomotor' gain, or the work from the Fox lab.
Kim et al., 2017 argue that head roll during a head saccade follows a feedforward motor program, based on data showing that the head will roll (in addition to yaw) to offset body roll during a saccade, even though there is no body roll in the magnetic tether they used for experiments. We confirmed this result in a recent paper and provided further evidence for the feedforward hypothesis by studying head saccades in a rigid tether where there is no body motion (Cellini et al., 2021). While we agree that these data are consistent with a visually open-loop feedforward motor program, we believe that our data strongly support the idea that mechanosensory feedback is present during saccades, especially during braking, which is consistent with previous work in a similar paradigm (Bender and Dickinson, 2006). While the peak velocities of head saccades in body-free and body-fixed flies are similar in our data (although statistically different), the most prominent difference is how long it takes the head to return to baseline after the initial rapid movement (as shown in Figure 7). Furthermore, the amplitude of head saccades in body-free flies is considerably smaller than in body-fixed flies. These stark differences, even in the absence of visual features (see Figure S7), strongly suggest that mechanosensory feedback from body motion underlies this behavior. This is consistent with Kim et al. 2017, as our data still show that head saccades are likely visually open loop (visual features do not change the response). The rapid (5–10 ms) response time of mechanosensory feedback is also well within the saccade duration (~50 ms), so it is reasonable to assume that mechanosensory information can shape head saccade dynamics, even if they are visually open loop. All head saccade data are presented in the body reference frame (head relative to the body), so subtracting (or adding) body movement would not change this interpretation. We have clarified in the text that head saccades are visually open loop, but that mechanosensory feedback likely mediates braking. We now also cite Kim et al., 2017, Milde et al., 1987, and Strausfeld and Seyan, 1985 in the section discussing saccades. Note that we believe the reviewer was referring to Strausfeld and Seyan's 1985 and/or 1987 work (Milde et al., 1987; Strausfeld and Seyan, 1985), as we could not find a relevant study from 1988.
While (Bartussek and Lehmann, 2016; Lehmann and Bartussek, 2017) and (Kathman and Fox, 2019; Mureli et al., 2017; Mureli and Fox, 2015) present intriguing results describing how local haltere/wing proprioception shapes motor output (likely by modulating visuomotor gain), we feel that, because these studies focus on the tonic function of the haltere (all experiments in body-fixed flies), they do not address the role of the gyroscopic haltere inputs we investigate here. We have added a line in the Results section on saccades clarifying these ideas.
Finally, the authors either need to detail how their model is distinct from the control-loop hypothesis or back off their claim of novelty and show that their work lends further evidence to that model. I would also prefer if the figure panel for the model is either more anatomically accurate or stuck with the block diagram framing of information flow.
See our above comments. We have also modified the panel in Figure 7E to follow our block diagram format.
References
Bartussek J, Lehmann F-O. 2016. Proprioceptive feedback determines visuomotor gain in Drosophila. R Soc Open Sci 3. doi:10.1098/rsos.150562
Bender JA, Dickinson MH. 2006. A comparison of visual and haltere-mediated feedback in the control of body saccades in Drosophila melanogaster. J Exp Biol 209:4597–4606. doi:10.1242/jeb.02583
Cellini B, Mongeau J-M. 2020. Active vision shapes and coordinates flight motor responses in flies. Proc Natl Acad Sci 117:23085–23095. doi:10.1073/pnas.1920846117
Cellini B, Salem W, Mongeau J-M. 2021. Mechanisms of punctuated vision in fly flight. Curr Biol 31:4009-4024.e3. doi:10.1016/j.cub.2021.06.080
Cellini B, Salem W, Mongeau J-M. 2022. Complementary feedback control enables effective gaze stabilization in animals. Proc Natl Acad Sci 119:e2121660119. doi:10.1073/pnas.2121660119
Chan WP, Prete F, Dickinson MH. 1998. Visual Input to the Efferent Control System of a Fly’s “Gyroscope.” Science 280:289–292. doi:10.1126/science.280.5361.289
Dickerson BH, de Souza AM, Huda A, Dickinson MH. 2019. Flies Regulate Wing Motion via Active Control of a Dual-Function Gyroscope. Curr Biol 29:3517-3524.e3. doi:10.1016/j.cub.2019.08.065
Dickinson MH. 1999. Haltere–mediated equilibrium reflexes of the fruit fly, Drosophila melanogaster. Philos Trans R Soc London Ser B Biol Sci 354:903–916. doi:10.1098/rstb.1999.0442
Duistermars BJ, Chow DM, Condro M, Frye MA. 2007a. The spatial, temporal and contrast properties of expansion and rotation flight optomotor responses in Drosophila. J Exp Biol 210:3218–3227. doi:10.1242/jeb.007807
Duistermars BJ, Reiser MB, Zhu Y, Frye MA. 2007b. Dynamic properties of large-field and small-field optomotor flight responses in Drosophila. J Comp Physiol A Neuroethol Sensory, Neural, Behav Physiol 193:787–799. doi:10.1007/s00359-007-0233-y
Elzinga MJ, Dickson WB, Dickinson MH. 2012. The influence of sensory delay on the yaw dynamics of a flapping insect. J R Soc Interface 9:1685–1696. doi:10.1098/rsif.2011.0699
Fayyazuddin A, Dickinson MH. 1996. Haltere Afferents Provide Direct, Electrotonic Input to a Steering Motor Neuron in the Blowfly, Calliphora. J Neurosci 16:5225–5232. doi:10.1523/JNEUROSCI.16-16-05225.1996
Fuller SB, Straw AD, Peek MY, Murray RM, Dickinson MH. 2014. Flying Drosophila stabilize their vision-based velocity controller by sensing wind with their antennae. Proc Natl Acad Sci 111:E1182–E1191. doi:10.1073/pnas.1323529111
Goldberg JM, Wilson VJ, Cullen KE, Angelaki DE, Broussard DM, Buttner-Ennever J, Fukushima K, Minor LB. 2012. The Vestibular System, The Vestibular System: A Sixth Sense. Oxford University Press. doi:10.1093/acprof:oso/9780195167085.001.0001
Heisenberg M, Wolf R. 1986. Vision in Drosophila: Genetics of Microbehavior. Studies of Brain Function, Volume 12. Q Rev Biol 61:141. doi:10.1086/414849
Kathman ND, Fox JL. 2019. Representation of Haltere Oscillations and Integration with Visual Inputs in the Fly Central Complex. J Neurosci 39:4100–4112. doi:10.1523/JNEUROSCI.1707-18.2019
Kim AJ, Fenk LM, Lyu C, Maimon G. 2017. Quantitative Predictions Orchestrate Visual Signaling in Drosophila. Cell 168:280-294.e12. doi:10.1016/j.cell.2016.12.005
Lehmann F-O, Bartussek J. 2017. Neural control and precision of flight muscle activation in Drosophila. J Comp Physiol A 203:1–14. doi:10.1007/s00359-016-1133-9
Milde JJ, Seyan HS, Strausfeld NJ. 1987. The neck motor system of the fly Calliphora erythrocephala – II. Sensory organization. J Comp Physiol A 160:225–238. doi:10.1007/BF00609728
Mureli S, Fox JL. 2015. Haltere mechanosensory influence on tethered flight behavior in Drosophila. J Exp Biol 218:2528–2537. doi:10.1242/jeb.121863
Mureli S, Thanigaivelan I, Schaffer ML, Fox JL. 2017. Cross-modal influence of mechanosensory input on gaze responses to visual motion in Drosophila. J Exp Biol 220:2218–2227. doi:10.1242/jeb.146282
Roth E, Hall RW, Daniel TL, Sponberg S. 2016. Integration of parallel mechanosensory and visual pathways resolved through sensory conflict. Proc Natl Acad Sci 113:12832–12837. doi:10.1073/pnas.1522419113
Roth E, Sponberg S, Cowan N. 2014. A comparative approach to closed-loop computation. Curr Opin Neurobiol 25:54–62. doi:10.1016/j.conb.2013.11.005
Roth E, Zhuang K, Stamper SA, Fortune ES, Cowan NJ. 2011. Stimulus predictability mediates a switch in locomotor smooth pursuit performance for Eigenmannia virescens. J Exp Biol 214:1170–1180. doi:10.1242/jeb.048124
Sane SP, Dieudonné A, Willis MA, Daniel TL. 2007. Antennal Mechanosensors Mediate Flight Control in Moths. Science 315:863–866. doi:10.1126/science.1133598
Schweigart G, Mergner T, Evdokimidis I, Morand S, Becker W. 1997. Gaze Stabilization by Optokinetic Reflex (OKR) and Vestibulo-ocular Reflex (VOR) During Active Head Rotation in Man. Vision Res 37:1643–1652. doi:10.1016/S0042-6989(96)00315-X
Sherman A, Dickinson MH. 2004. Summation of visual and mechanosensory feedback in Drosophila flight control. J Exp Biol 207:133–142. doi:10.1242/jeb.00731
Sherman A, Dickinson MH. 2003. A comparison of visual and haltere-mediated equilibrium reflexes in the fruit fly Drosophila melanogaster. J Exp Biol 206:295–302. doi:10.1242/jeb.00075
Sponberg S, Dyhr JP, Hall RW, Daniel TL. 2015. Luminance-dependent visual processing enables moth flight in low light. Science 348:1245–1248. doi:10.1126/science.aaa3042
Stöckl AL, Kihlström K, Chandler S, Sponberg S. 2017. Comparative system identification of flower tracking performance in three hawkmoth species reveals adaptations for dim light vision. Philos Trans R Soc B Biol Sci 372:20160078. doi:10.1098/rstb.2016.0078
Strausfeld NJ, Seyan HS. 1985. Convergence of visual, haltere, and prosternal inputs at neck motor neurons of Calliphora erythrocephala. Cell Tissue Res 240:601–615. doi:10.1007/BF00216350
van Breugel F, Kutz JN, Brunton BW. 2020. Numerical Differentiation of Noisy Data: A Unifying Multi-Objective Optimization Framework. IEEE Access 8:196865–196877. doi:10.1109/ACCESS.2020.3034077
Windsor SP, Taylor GK. 2017. Head movements quadruple the range of speeds encoded by the insect motion vision system in hawkmoths. Proc R Soc B Biol Sci 284:20171622. doi:10.1098/rspb.2017.1622
Wolf R, Voss A, Hein S, Heisenberg M, Sullivan GD. 1992. Can a fly ride a bicycle? Philos Trans R Soc London Ser B Biol Sci 337:261–269. doi:10.1098/rstb.1992.0104
https://doi.org/10.7554/eLife.80880.sa2
Article and author information
Author details
Funding
Air Force Office of Scientific Research (FA9550-20-1-0084)
- Jean-Michel Mongeau
Alfred P. Sloan Foundation (FG-2021-16388)
- Jean-Michel Mongeau
The funders had no role in study design, data collection and interpretation, or the decision to submit the work for publication.
Acknowledgements
We thank Mark Frye and Martha Rimniceanu for valuable comments. This material is based upon work supported by the Air Force Office of Scientific Research (FA9550-20-1-0084) and an Alfred P Sloan Research Fellowship to JMM.
Senior Editor
- Ronald L Calabrese, Emory University, United States
Reviewing Editor
- Stephanie E Palmer, University of Chicago, United States
Version history
- Received: June 7, 2022
- Preprint posted: June 20, 2022
- Accepted: October 17, 2022
- Accepted Manuscript published: October 19, 2022 (version 1)
- Version of Record published: November 11, 2022 (version 2)
Copyright
© 2022, Cellini and Mongeau
This article is distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use and redistribution provided that the original author and source are credited.
Further reading
-
- Genetics and Genomics
- Neuroscience
A hexanucleotide repeat expansion in C9ORF72 is the most common genetic cause of amyotrophic lateral sclerosis (ALS) and frontotemporal dementia (FTD). A hallmark of ALS/FTD pathology is the presence of dipeptide repeat (DPR) proteins, produced from both sense GGGGCC (poly-GA, poly-GP, poly-GR) and antisense CCCCGG (poly-PR, poly-PG, poly-PA) transcripts. Translation of sense DPRs, such as poly-GA and poly-GR, depends on non-canonical (non-AUG) initiation codons. Here, we provide evidence for canonical AUG-dependent translation of two antisense DPRs, poly-PR and poly-PG. A single AUG is required for synthesis of poly-PR, one of the most toxic DPRs. Unexpectedly, we found redundancy between three AUG codons necessary for poly-PG translation. Further, the eukaryotic translation initiation factor 2D (EIF2D), which was previously implicated in sense DPR synthesis, is not required for AUG-dependent poly-PR or poly-PG translation, suggesting that distinct translation initiation factors control DPR synthesis from sense and antisense transcripts. Our findings on DPR synthesis from the C9ORF72 locus may be broadly applicable to many other nucleotide repeat expansion disorders.
-
- Cell Biology
- Neuroscience
The amyloid beta (Aβ) plaques found in Alzheimer’s disease (AD) patients’ brains contain collagens and are embedded extracellularly. Several collagens have been proposed to influence Aβ aggregate formation, yet their role in clearance is unknown. To investigate the potential role of collagens in forming and clearance of extracellular aggregates in vivo, we created a transgenic Caenorhabditis elegans strain that expresses and secretes human Aβ1-42. This secreted Aβ forms aggregates in two distinct places within the extracellular matrix. In a screen for extracellular human Aβ aggregation regulators, we identified different collagens to ameliorate or potentiate Aβ aggregation. We show that a disintegrin and metalloprotease a disintegrin and metalloprotease 2 (ADM-2), an ortholog of ADAM9, reduces the load of extracellular Aβ aggregates. ADM-2 is required and sufficient to remove the extracellular Aβ aggregates. Thus, we provide in vivo evidence of collagens essential for aggregate formation and metalloprotease participating in extracellular Aβ aggregate removal.