Scanned optogenetic control of mammalian somatosensory input to map input-specific behavioral outputs

  1. Ara Schorscher-Petcu
  2. Flóra Takács
  3. Liam E Browne (corresponding author)
  1. University College London, United Kingdom

Abstract

Somatosensory stimuli guide and shape behavior, from immediate protective reflexes to longer-term learning and higher-order processes related to pain and touch. However, somatosensory inputs are challenging to control in awake mammals due to the diversity and nature of contact stimuli. Current methods for applying cutaneous stimuli are relatively imprecise and rely on subjective behavioral measures. The strategy we present here overcomes these difficulties, achieving 'remote touch' with spatiotemporally precise and dynamic optogenetic stimulation by projecting light onto a small, defined area of skin. We mapped behavioral responses to specific nociceptor and low-threshold mechanoreceptor inputs in freely behaving mice. In nociceptors, sparse recruitment of single action potentials shapes rapid protective pain-related behaviors, including coordinated head orientation and body repositioning that depend on the initial body pose. In contrast, activation of low-threshold mechanoreceptors elicits slow-onset and more subtle whole-body behaviors. The strategy can be used to define specific behavioral repertoires, examine the timing and nature of reflexes, and dissect the sensory, motor, cognitive, and motivational processes guiding behavior.

Data availability

All components necessary to assemble the optical system are listed in Figure 1 - table 1. A Solidworks assembly, the optical system control and acquisition software and behavioral analysis toolkit are available at https://github.com/browne-lab/throwinglight. The data that support the findings of this study are provided as source data files.
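As a minimal sketch of how the provided source data files could be loaded for reanalysis (the file name and column names below are hypothetical placeholders, not the actual file layout; the acquisition software and analysis toolkit in the repository above are the authoritative tools):

```python
# Minimal sketch: load one source data file for reanalysis.
# "figure1_source_data.csv", "condition", and "response_latency_ms" are
# hypothetical placeholders; consult the actual source data files and the
# toolkit at https://github.com/browne-lab/throwinglight for real layouts.
import pandas as pd

df = pd.read_csv("figure1_source_data.csv")

# Summarize a behavioral measure per stimulation condition.
summary = df.groupby("condition")["response_latency_ms"].describe()
print(summary)
```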

Article and author information

Author details

  1. Ara Schorscher-Petcu

    Wolfson Institute for Biomedical Research, University College London, London, United Kingdom
    Competing interests
    The authors declare that no competing interests exist.
    ORCID iD: 0000-0001-5808-5172
  2. Flóra Takács

    Wolfson Institute for Biomedical Research, University College London, London, United Kingdom
    Competing interests
    The authors declare that no competing interests exist.
  3. Liam E Browne

    Wolfson Institute for Biomedical Research, University College London, London, United Kingdom
    For correspondence
    liam.browne@ucl.ac.uk
    Competing interests
    The authors declare that no competing interests exist.
    ORCID iD: 0000-0002-5693-7703

Funding

Wellcome Trust (109372/Z/15/Z)

  • Liam E Browne

The funders had no role in study design, data collection and interpretation, or the decision to submit the work for publication.

Ethics

Animal experimentation: All animal procedures were approved by University College London ethical review committees and conformed to UK Home Office regulations.

Copyright

© 2021, Schorscher-Petcu et al.

This article is distributed under the terms of the Creative Commons Attribution License permitting unrestricted use and redistribution provided that the original author and source are credited.


Cite this article

Ara Schorscher-Petcu, Flóra Takács, Liam E Browne (2021) Scanned optogenetic control of mammalian somatosensory input to map input-specific behavioral outputs. eLife 10:e62026. https://doi.org/10.7554/eLife.62026

