Dynamics of gaze control during prey capture in freely moving mice
Abstract
Many studies of visual processing are conducted in constrained conditions such as head- and gaze-fixation, and therefore less is known about how animals actively acquire visual information in natural contexts. To determine how mice target their gaze during natural behavior, we measured head and bilateral eye movements in mice performing prey capture, an ethological behavior that engages vision. We found that the majority of eye movements are compensatory for head movements, thereby serving to stabilize the visual scene. During movement, however, periods of stabilization are interspersed with non-compensatory saccades that abruptly shift gaze position. Notably, these saccades do not preferentially target the prey location. Rather, orienting movements are driven by the head, with the eyes following in coordination to sequentially stabilize and recenter the gaze. These findings relate eye movements in the mouse to other species, and provide a foundation for studying active vision during ethological behaviors in the mouse.
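The compensatory/saccadic distinction can be made concrete with a simple decomposition: gaze position in the world is the sum of the eye-in-head and head-in-world angles, so a movement is compensatory when the eye counter-rotates against the head and gaze velocity stays near zero. The sketch below illustrates this kind of classification in Python; it is an illustration under stated assumptions, not the study's analysis pipeline, and the sampling rate and the 60 deg/s threshold are illustrative values.

```python
import numpy as np

def classify_eye_movements(eye_pos, head_yaw, fs, gaze_thresh=60.0):
    """Label each sample as 'compensatory' or 'saccadic'.

    eye_pos     : horizontal eye-in-head position (degrees)
    head_yaw    : head-in-world yaw position (degrees)
    fs          : sampling rate (Hz)
    gaze_thresh : gaze velocity (deg/s) above which a sample is treated
                  as part of a non-compensatory saccade (illustrative)
    """
    # Gaze-in-world is the sum of eye-in-head and head-in-world angles.
    gaze_pos = eye_pos + head_yaw

    # Angular velocities (deg/s) via finite differences.
    gaze_vel = np.gradient(gaze_pos) * fs

    # Compensatory movements: the eye counter-rotates against the head,
    # so gaze velocity stays near zero even when head velocity is large.
    # Saccades: gaze shifts abruptly, producing high gaze velocity.
    saccadic = np.abs(gaze_vel) > gaze_thresh
    labels = np.where(saccadic, "saccadic", "compensatory")
    return labels, gaze_vel
```

With traces like these, "most eye movements are compensatory" corresponds to most samples falling below the gaze-velocity threshold even while head velocity is high.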
Data availability
Behavioral data have been deposited in Dryad (doi:10.5061/dryad.8cz8w9gmw).
- Data from: Dynamics of gaze control during prey capture in freely moving mice. Dryad Digital Repository, doi:10.5061/dryad.8cz8w9gmw.
Article and author information
Funding
National Institutes of Health (R34NS111669)
- Cristopher M Niell
University of Oregon (Promising Scholar Award)
- Angie M Michaiel
The funders had no role in study design, data collection and interpretation, or the decision to submit the work for publication.
Ethics
Animal experimentation: All procedures were conducted in accordance with the guidelines of the National Institutes of Health and were approved by the University of Oregon Institutional Animal Care and Use Committee (Protocol number: 17-27).
Reviewing Editor
- Miriam Spering, The University of British Columbia, Canada
Version history
- Received: April 1, 2020
- Accepted: July 23, 2020
- Accepted Manuscript published: July 24, 2020 (version 1)
- Version of Record published: August 19, 2020 (version 2)
Copyright
© 2020, Michaiel et al.
This article is distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use and redistribution provided that the original author and source are credited.
Metrics
- 5,906 page views
- 746 downloads
- 45 citations
Article citation count generated by polling the highest count across the following sources: Scopus, Crossref, PubMed Central.
Further reading
- Neuroscience
How does the human brain combine information across the eyes? It has been known for many years that cortical normalization mechanisms implement ‘ocularity invariance’: equalizing neural responses to spatial patterns presented either monocularly or binocularly. Here, we used a novel combination of electrophysiology, psychophysics, pupillometry, and computational modeling to ask whether this invariance also holds for flickering luminance stimuli with no spatial contrast. We find dramatic violations of ocularity invariance for these stimuli, both in the cortex and also in the subcortical pathways that govern pupil diameter. Specifically, we find substantial binocular facilitation in both pathways with the effect being strongest in the cortex. Near-linear binocular additivity (instead of ocularity invariance) was also found using a perceptual luminance matching task. Ocularity invariance is, therefore, not a ubiquitous feature of visual processing, and the brain appears to repurpose a generic normalization algorithm for different visual functions by adjusting the amount of interocular suppression.
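As a rough illustration of how a single normalization algorithm can trade off ocularity invariance against binocular facilitation, the sketch below implements a generic two-channel divisive normalization model in which an interocular suppression weight is the only knob. The exponents, semisaturation constant, and weights are illustrative assumptions, not the fitted values from this work.

```python
import numpy as np

def binocular_response(c_left, c_right, n=2.0, s=1.0, omega=1.0):
    """Generic two-channel divisive normalization of left/right inputs.

    omega sets the strength of interocular suppression. With omega near 1,
    each eye's channel is suppressed by the other, pulling the binocular
    response down toward the monocular level (ocularity invariance).
    With small omega, the two channels sum more nearly linearly,
    producing binocular facilitation. All parameters are illustrative.
    """
    r_left = c_left**n / (s + c_left**n + omega * c_right**n)
    r_right = c_right**n / (s + c_right**n + omega * c_left**n)
    return r_left + r_right

# Strong suppression: binocular output stays close to the monocular level.
print(binocular_response(1, 0), binocular_response(1, 1, omega=1.0))
# Weak suppression: binocular output approaches the sum of the two eyes.
print(binocular_response(1, 0, omega=0.2), binocular_response(1, 1, omega=0.2))
```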
- Neuroscience
Tastes typically evoke innate behavioral responses that can be broadly categorized as acceptance or rejection. However, research in Drosophila melanogaster indicates that taste responses also exhibit plasticity through experience-dependent changes in mushroom body circuits. In this study, we develop a novel taste learning paradigm using closed-loop optogenetics. We find that appetitive and aversive taste memories can be formed by pairing gustatory stimuli with optogenetic activation of sensory neurons or dopaminergic neurons encoding reward or punishment. As with olfactory memories, distinct dopaminergic subpopulations drive the parallel formation of short- and long-term appetitive memories. Long-term memories are protein synthesis-dependent and have energetic requirements that are satisfied by a variety of caloric food sources or by direct stimulation of MB-MP1 dopaminergic neurons. Our paradigm affords new opportunities to probe plasticity mechanisms within the taste system and understand the extent to which taste responses depend on experience.
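The closed-loop pairing logic can be sketched as follows; the sensor and LED interfaces here are hypothetical placeholders, not the apparatus used in the study, and the pulse and polling parameters are illustrative.

```python
import time

def closed_loop_pairing(detect_taste_contact, trigger_led,
                        pulse_s=0.5, session_s=300.0):
    """Pair taste contact with optogenetic activation in closed loop.

    detect_taste_contact : callable returning True when the fly contacts
                           the tastant (hypothetical sensor interface)
    trigger_led          : callable that drives the optogenetic LED
                           (hypothetical hardware interface)
    """
    start = time.monotonic()
    while time.monotonic() - start < session_s:
        if detect_taste_contact():
            # Deliver light coincident with the gustatory stimulus to
            # activate sensory or dopaminergic neurons encoding reward
            # or punishment.
            trigger_led(duration=pulse_s)
        time.sleep(0.01)  # ~100 Hz polling
```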