High-fidelity musculoskeletal modeling reveals a motor planning contribution to the speed-accuracy tradeoff
Abstract
A long-standing challenge in motor neuroscience is to understand the relationship between movement speed and accuracy, known as the speed-accuracy tradeoff. Here, we introduce a biomechanically realistic computational model of three-dimensional upper extremity movements that reproduces well-known features of reaching movements. This model revealed that the speed-accuracy tradeoff, as described by Fitts' law, emerges even without the presence of motor noise, which is commonly believed to underlie the speed-accuracy tradeoff. Next, we analyzed motor cortical neural activity from monkeys reaching to targets of different sizes. We found that the contribution of preparatory neural activity to movement duration variability is greater for smaller targets than larger targets, and that movements to smaller targets exhibit less variability in population-level preparatory activity, but greater movement duration variability. These results suggest a new theory underlying the speed-accuracy tradeoff: Fitts' law emerges from greater task demands constraining the optimization landscape in a fashion that reduces the number of 'good' control solutions (i.e., faster reaches). Thus, contrary to current beliefs, the speed-accuracy tradeoff could be a consequence of motor planning variability and not exclusively signal-dependent noise.
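Fitts' law, the empirical relationship at the center of the abstract, predicts movement time from the distance to a target and the target's width. A minimal sketch of the law follows; it uses the common Shannon formulation, and the coefficients `a` and `b` are illustrative values (not fitted to the data in this study):

```python
import math

def index_of_difficulty(distance, width):
    """Shannon formulation of Fitts' index of difficulty, in bits."""
    return math.log2(distance / width + 1)

def movement_time(distance, width, a=0.1, b=0.15):
    """Fitts' law: MT = a + b * ID.

    a (intercept, s) and b (slope, s/bit) are illustrative
    coefficients; in practice they are fitted per subject and task.
    """
    return a + b * index_of_difficulty(distance, width)

# Halving target width raises the index of difficulty,
# so the predicted movement time increases (the tradeoff).
wide = movement_time(0.20, 0.04)    # 20 cm reach, 4 cm target
narrow = movement_time(0.20, 0.02)  # 20 cm reach, 2 cm target
print(wide < narrow)  # smaller targets -> slower movements
```

The study's point is that this relationship can arise even without signal-dependent motor noise; the sketch above only expresses the empirical regularity, not any particular mechanistic account.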
Data availability
The source code for the computer simulations and our data are available at https://simtk.org/projects/ue-reaching. Users must first create a free account (https://simtk.org/account/register.php) before they can download the datasets from the site.
Article and author information
Author details
Funding
National Institutes of Health (U54EB020405)
- Scott L Delp
National Institute of Neurological Disorders and Stroke (R01NS076460)
- Krishna V Shenoy
National Institute of Mental Health (R01MH09964703)
- Krishna V Shenoy
Defense Advanced Research Projects Agency (N66001-10-C-2010)
- Krishna V Shenoy
National Institutes of Health (8DP1HD075623)
- Krishna V Shenoy
Simons Foundation (325380 and 543045)
- Krishna V Shenoy
National Institutes of Health (5F31NS103409-02)
- Saurabh Vyas
National Science Foundation (Graduate Fellowship)
- Saurabh Vyas
Stanford University (Ric Weiland Stanford Graduate Fellowship)
- Saurabh Vyas
The funders had no role in study design, data collection and interpretation, or the decision to submit the work for publication.
Ethics
Animal experimentation: All surgical and animal care procedures were performed in accordance with National Institutes of Health guidelines and were approved by the Stanford University Institutional Animal Care and Use Committee (8856).
Human subjects: Subjects gave written informed consent, including consent to publish, as approved by the Stanford University Institutional Review Board (42787). The guidelines followed are specified in the Human Research Protection Program (HRPP), Stanford University.
Copyright
© 2020, Al Borno et al.
This article is distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use and redistribution provided that the original author and source are credited.
Metrics
- 1,949 views
- 248 downloads
- 10 citations
Views, downloads and citations are aggregated across all versions of this paper published by eLife.