Mechanosensory neurons control the timing of spinal microcircuit selection during locomotion
Despite numerous physiological studies of reflexes in the spinal cord, the contribution of mechanosensory feedback to active locomotion and the nature of the underlying spinal circuits remain elusive. Here we investigate how mechanosensory feedback shapes active locomotion in a genetic model organism exhibiting simple locomotion—the zebrafish larva. We show that mechanosensory feedback enhances the recruitment of motor pools during active locomotion. Furthermore, we demonstrate that inputs from mechanosensory neurons increase locomotor speed by prolonging fast swimming at the expense of slow swimming during stereotyped acoustic escape responses. This effect could be mediated by distinct mechanosensory neurons. In the spinal cord, we show that connections compatible with monosynaptic inputs from mechanosensory Rohon-Beard neurons onto ipsilateral V2a interneurons selectively recruited at high speed can contribute to the observed enhancement of speed. Altogether, our study reveals the basic principles and a circuit diagram enabling speed modulation by mechanosensory feedback in the vertebrate spinal cord.
Article and author information
European Research Council (311673)
- Claire Wyart
The funders had no role in study design, data collection and interpretation, or the decision to submit the work for publication.
Animal experimentation: All procedures were approved by the Institutional Ethics Committee at the Institut du Cerveau et de la Moelle épinière (ICM), Paris, France, and by the Ethical Committee Charles Darwin, and received subsequent approval from the EEC (2010/63/EU).
- Ronald L Calabrese, Emory University, United States
- Received: January 19, 2017
- Accepted: June 17, 2017
- Accepted Manuscript published: June 17, 2017 (version 1)
- Accepted Manuscript updated: June 19, 2017 (version 2)
- Version of Record published: July 6, 2017 (version 3)
© 2017, Wyart et al.
This article is distributed under the terms of the Creative Commons Attribution License, permitting unrestricted use and redistribution provided that the original author and source are credited.