Rapid stimulus-driven modulation of slow ocular position drifts

  1. Tatiana Malevich
  2. Antimo Buonocore
  3. Ziad M Hafed (corresponding author)
  1. Tuebingen University, Germany

Abstract

The eyes are never still during maintained gaze fixation. When microsaccades are not occurring, ocular position exhibits continuous slow changes, often referred to as drifts. Unlike microsaccades, drifts are still widely viewed as largely random eye movements. Here we found that ocular position drifts can instead be highly systematic and stimulus-driven, with very short latencies. Using highly precise eye tracking in three well-trained macaque monkeys, we found that even fleeting (~8 ms duration) stimulus presentations robustly trigger transient, stimulus-specific modulations of ocular position drifts, with a latency of only approximately 60 ms. Such drift responses are binocular, and they are most effectively elicited by large stimuli of low spatial frequency. Intriguingly, the drift responses exhibit some image pattern selectivity, and they are not explained by convergence responses, pupil constrictions, head movements, or starting eye positions. Ocular position drifts thus have very rapid access to exogenous visual information.
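The measurement at the heart of this result is a small, stimulus-locked change in eye position during otherwise saccade-free fixation. As a rough illustration of how such a drift response could be quantified from raw gaze traces, the Python sketch below aligns eye position to stimulus onsets, discards epochs contaminated by (micro)saccades using a simple velocity threshold, and averages baseline-subtracted vertical position across trials. This is a minimal sketch under assumed conventions (1 kHz sampling, positions in degrees, integer-sample onset times, hypothetical function names and thresholds); it is not the authors' analysis pipeline.

```python
"""Illustrative sketch (not the authors' pipeline): estimating a
stimulus-aligned ocular drift response from eye-position traces."""
import numpy as np


def saccade_mask(eye_pos, fs=1000.0, vel_thresh=8.0, pad_ms=20):
    """Flag samples near (micro)saccades with a simple speed threshold.

    eye_pos: (N, 2) array of horizontal/vertical position in degrees.
    vel_thresh is in deg/s; real microsaccade detection is more elaborate.
    """
    vel = np.gradient(eye_pos, axis=0) * fs        # deg/s per axis
    speed = np.linalg.norm(vel, axis=1)
    mask = speed > vel_thresh
    pad = int(pad_ms * fs / 1000)
    for i in np.flatnonzero(mask):                 # pad around each detection
        mask[max(0, i - pad):i + pad] = True
    return mask                                    # True = contaminated sample


def drift_response(eye_pos, onsets, fs=1000.0, pre_ms=100, post_ms=300):
    """Average baseline-subtracted vertical position around stimulus onsets,
    keeping only saccade-free epochs. onsets are integer sample indices."""
    contaminated = saccade_mask(eye_pos, fs)
    pre, post = int(pre_ms * fs / 1000), int(post_ms * fs / 1000)
    epochs = []
    for t in onsets:
        sl = slice(t - pre, t + post)
        if sl.start < 0 or sl.stop > len(eye_pos) or contaminated[sl].any():
            continue                               # skip contaminated epochs
        trace = eye_pos[sl, 1].copy()              # vertical component only
        trace -= trace[:pre].mean()                # subtract pre-stimulus baseline
        epochs.append(trace)
    return np.mean(epochs, axis=0) if epochs else None
```

On such an averaged trace, a response latency could then be read off as the first post-onset time at which the trace deviates reliably from its pre-stimulus baseline; the numbers above (thresholds, epoch windows) are placeholders chosen for illustration only.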

Data availability

All data generated or analyzed during this study are included in the manuscript and supporting files. Source data for figures will be uploaded upon acceptance.

Article and author information

Author details

  1. Tatiana Malevich

    Werner Reichardt Centre for Integrative Neuroscience, Tuebingen University, Tuebingen, Germany
    Competing interests
    The authors declare that no competing interests exist.
  2. Antimo Buonocore

    Werner Reichardt Centre for Integrative Neuroscience, Tuebingen University, Tuebingen, Germany
    Competing interests
    The authors declare that no competing interests exist.
    ORCID: 0000-0003-3917-510X
  3. Ziad M Hafed

    Werner Reichardt Centre for Integrative Neuroscience, Tuebingen University, Tuebingen, Germany
    For correspondence
    ziad.m.hafed@cin.uni-tuebingen.de
    Competing interests
    The authors declare that no competing interests exist.
    ORCID: 0000-0001-9968-119X

Funding

Deutsche Forschungsgemeinschaft (HA6749/2-1)

  • Antimo Buonocore
  • Ziad M Hafed

Deutsche Forschungsgemeinschaft (EXC307)

  • Tatiana Malevich
  • Antimo Buonocore
  • Ziad M Hafed

The funders had no role in study design, data collection and interpretation, or the decision to submit the work for publication.

Ethics

Animal experimentation: We tracked eye movements in 3 male rhesus macaque monkeys trained on behavioral eye movement tasks under head-stabilized conditions. The experiments were part of a larger neurophysiological investigation in the laboratory. All procedures and behavioral paradigms were approved (CIN3/13 and CIN4/19G) by ethics committees at the Regierungspräsidium Tübingen, and they complied with European Union directives on animal research.

Reviewing Editor

  1. Emilio Salinas, Wake Forest School of Medicine, United States

Version history

  1. Received: April 5, 2020
  2. Accepted: August 5, 2020
  3. Accepted Manuscript published: August 6, 2020 (version 1)
  4. Version of Record published: August 21, 2020 (version 2)

Copyright

© 2020, Malevich et al.

This article is distributed under the terms of the Creative Commons Attribution License permitting unrestricted use and redistribution provided that the original author and source are credited.

Cite this article

  1. Tatiana Malevich
  2. Antimo Buonocore
  3. Ziad M Hafed
(2020)
Rapid stimulus-driven modulation of slow ocular position drifts
eLife 9:e57595.
https://doi.org/10.7554/eLife.57595
