Voluntary and involuntary contributions to perceptually guided saccadic choices resolved with millisecond precision
Abstract
In the antisaccade task, which is considered a sensitive assay of cognitive function, a salient visual cue appears and the participant must look away from it. This requires sensory, motor-planning, and cognitive neural mechanisms, but what are their unique contributions to performance, and when exactly are they engaged? Here, by manipulating task urgency, we generate a psychophysical curve that tracks the evolution of the saccadic choice process with millisecond precision, and resolve the distinct contributions of reflexive (exogenous) and voluntary (endogenous) perceptual mechanisms to antisaccade performance over time. Both progress extremely rapidly, the former driving the eyes toward the cue early on (∼100 ms after cue onset) and the latter directing them away from the cue ∼40 ms later. The behavioral and modeling results provide a detailed, dynamical characterization of attentional and oculomotor capture that is not only qualitatively consistent across participants, but also indicative of their individual perceptual capacities.
Data availability
The psychophysical data are provided in a supplementary data file (Source Data 1). Matlab scripts for running the model are provided in a supplementary source code file (Source code 1).
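For orientation only, the MATLAB sketch below (not the authors' released model code) illustrates how a psychophysical curve of the kind described in the abstract, fraction of correct antisaccades as a function of how long the cue was visible before the saccade, could be assembled from urgent-task trials. The trial variables (rt, gap, correct), the rPT = rt − gap convention, and the simulated data are all assumptions made for illustration.

```matlab
% Illustrative only: bin trial outcomes by raw processing time (rPT) to
% obtain accuracy as a function of cue viewing time. All names and the
% simulated data below are placeholders, not the authors' code or results.

nTrials = 5000;
gapVals = 0:25:350;                              % possible go-to-cue delays (ms)
gap     = gapVals(randi(numel(gapVals), nTrials, 1));
gap     = gap(:);                                % force column orientation
rt      = 200 + 100*rand(nTrials, 1);            % saccadic reaction times (ms)
rpt     = rt - gap;                              % raw processing time (ms)

% Toy accuracy profile: near chance at short rPT, approaching 1 at long rPT
pCorrectTrue = 0.5 + 0.5 ./ (1 + exp(-(rpt - 120)/20));
correct      = rand(nTrials, 1) < pCorrectTrue;  % 1 = looked away from cue

% Tachometric-style curve: fraction correct in 20-ms bins of rPT
edges   = -100:20:300;
centers = edges(1:end-1) + 10;
fc      = nan(size(centers));
for i = 1:numel(centers)
    inBin = rpt >= edges(i) & rpt < edges(i+1);
    if nnz(inBin) >= 20                          % skip sparsely sampled bins
        fc(i) = mean(correct(inBin));
    end
end

plot(centers, fc, 'o-');
xlabel('Raw processing time (ms)');
ylabel('Fraction correct');
```

The logistic accuracy profile used for the simulated trials is a stand-in; in the experiments described in the abstract, such a curve shows an early cue-driven dip followed by a later goal-driven rise.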
Article and author information
Author details
Funding
National Eye Institute (R01EY025172)
- Emilio Salinas
- Terrence R Stanford
National Institute of Neurological Disorders and Stroke (T32NS073553-01)
- Christopher K Hauser
National Science Foundation (Graduate research fellowship)
- Christopher K Hauser
Tab Williams Family Endowment
- Emilio Salinas
- Terrence R Stanford
National Eye Institute (R01EY021228)
- Emilio Salinas
- Terrence R Stanford
The funders had no role in study design, data collection and interpretation, or the decision to submit the work for publication.
Ethics
Human subjects: All participants provided informed written consent before the experiment. All experimental procedures were conducted with the approval of the Institutional Review Board (IRB) of Wake Forest School of Medicine.
Copyright
© 2019, Salinas et al.
This article is distributed under the terms of the Creative Commons Attribution License permitting unrestricted use and redistribution provided that the original author and source are credited.
Metrics
- 1,667 views
- 259 downloads
- 22 citations

Views, downloads and citations are aggregated across all versions of this paper published by eLife.
Further reading
- Neuroscience

Intelligent behavior requires acting in accordance with one's goals despite competing action tendencies triggered by stimuli in the environment. For eye movements, it has recently been discovered that this ability is briefly reduced in urgent situations (Salinas et al., 2019). In a time window before an urgent response, participants could not help but look at a suddenly appearing visual stimulus, even though their goal was to look away from it. Urgency seemed to provoke a new visual–oculomotor phenomenon: a period in which saccadic eye movements are dominated by external stimuli and cannot be controlled by current goals. This period was assumed to arise from brain mechanisms controlling eye movements and spatial attention, such as those of the frontal eye field. Here, we show that the phenomenon is more general than previously thought. We found that urgency made goal-conflicting stimulus features dominate behavioral responses in well-investigated manual tasks as well. This dominance of behavior followed established trial-to-trial signatures of cognitive control mechanisms that replicate across a variety of tasks. Together, these findings reveal that urgency temporarily forces stimulus-driven action by overcoming cognitive control in general, not only in the brain mechanisms controlling eye movements.
- Neuroscience

Locomotion in mammals is directly controlled by the spinal neuronal network, operating under the control of supraspinal signals and somatosensory feedback that interact with each other. However, the functional architecture of the spinal locomotor network, its operation regimes, and the role of supraspinal and sensory feedback in different locomotor behaviors, including at different speeds, remain unclear. We developed a computational model of spinal locomotor circuits receiving supraspinal drives and limb sensory feedback that could reproduce multiple experimental findings obtained in intact and spinal-transected cats during tied-belt and split-belt treadmill locomotion. We provide evidence that the spinal locomotor network operates in different regimes depending on locomotor speed. In an intact system, at slow speeds (<0.4 m/s), the spinal network operates in a non-oscillating state-machine regime and requires sensory feedback or external inputs for phase transitions. Removing sensory feedback related to limb extension prevents locomotor oscillations at slow speeds. With increasing speed and supraspinal drives, the spinal network switches to a flexor-driven oscillatory regime and then to a classical half-center regime. Following spinal transection, the model predicts that the spinal network can only operate in the state-machine regime. Our results suggest that the spinal network operates in different regimes for slow exploratory and fast escape locomotor behaviors, making use of different control mechanisms.