Superior colliculus drives stimulus-evoked directionally biased saccades and attempted head movements in head-fixed mice
Abstract
Animals investigate their environments by directing their gaze towards salient stimuli. In the prevailing view, mouse gaze shifts entail head rotations followed by brainstem-mediated eye movements, including saccades to reset the eyes. These 'recentering' saccades are attributed to head movement-related vestibular cues. However, microstimulating mouse superior colliculus (SC) elicits directed head and eye movements resembling SC-dependent sensory-guided gaze shifts in other species, suggesting that mouse gaze shifts may be more flexible than has been recognized. We investigated this possibility by tracking eye and attempted head movements in a head-fixed preparation that eliminates head movement-related sensory cues. We found that tactile stimuli evoke directionally biased saccades coincident with attempted head rotations. Differences in saccade endpoints across stimuli are associated with distinct stimulus-dependent relationships between initial eye position and saccade direction and amplitude. Optogenetic perturbations revealed that the SC drives these gaze shifts. Thus, head-fixed mice make sensory-guided, SC-dependent gaze shifts involving coincident, directionally biased saccades and attempted head movements. Our findings uncover flexibility in mouse gaze shifts and provide a foundation for studying head-eye coupling.
Data availability
Annotated data and model code have been uploaded to a Dryad repository (https://doi.org/10.7272/Q6V69GTV).
- A new type of mouse gaze shift is led by directed saccades. Dryad Digital Repository, doi:10.7272/dryad.Q6V69GTV.
Article and author information
Author details
Funding
National Institute of Mental Health (DP2MH119426)
- Evan H Feinberg
National Institute of Neurological Disorders and Stroke (R01NS109060)
- Evan H Feinberg
Simons Foundation Autism Research Initiative (574347)
- Evan H Feinberg
Esther A. and Joseph Klingenstein Fund
- Evan H Feinberg
E. Matilda Ziegler Foundation for the Blind
- Evan H Feinberg
Whitehall Foundation
- Evan H Feinberg
Brain and Behavior Research Foundation (25337)
- Evan H Feinberg
Brain and Behavior Research Foundation (27320)
- Evan H Feinberg
Sandler Foundation
- Evan H Feinberg
The funders had no role in study design, data collection and interpretation, or the decision to submit the work for publication.
Ethics
Animal experimentation: This study was performed in strict accordance with the recommendations in the Guide for the Care and Use of Laboratory Animals of the National Institutes of Health. All animal procedures were approved by the University of California San Francisco Institutional Animal Care and Use Committee (IACUC) (protocol number AN176625), and were conducted in agreement with the Association for Assessment and Accreditation of Laboratory Animal Care (AAALAC).
Copyright
© 2021, Zahler et al.
This article is distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use and redistribution provided that the original author and source are credited.