Neuroscience

High-resolution imaging of skin deformation shows that afferents from human fingertips signal slip onset

  1. Benoit P Delhaye (corresponding author)
  2. Ewa Jarocka
  3. Allan Barrea
  4. Jean-Louis Thonnard
  5. Benoni Edin
  6. Philippe Lefèvre
  1. Université catholique de Louvain, Belgium
  2. Umeå University, Sweden
Research Article
Cite this article as: eLife 2021;10:e64679 doi: 10.7554/eLife.64679

Abstract

Human tactile afferents provide essential feedback for grasp stability during dexterous object manipulation. Interacting forces between an object and the fingers induce slip events that are thought to provide information about grasp stability. To gain insight into this phenomenon, we made a transparent surface slip against a fixed fingerpad while monitoring skin deformation at the contact. Using microneurography, we simultaneously recorded the activity of single tactile afferents innervating the fingertips. This unique combination allowed us to describe how afferents respond to slip events and to relate their responses to surface deformations taking place inside their receptive fields. We found that all afferents were sensitive to slip events, but FA-I afferents in particular faithfully encoded compressive strain rates resulting from those slips. Given the high density of FA-I afferents in fingerpads, they are well suited to detect incipient slips and to provide essential information for the control of grip force during manipulation.
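To make the kind of computation described above concrete, the following minimal Python sketch (with an entirely hypothetical data layout: tracked fingerprint-feature positions and afferent spike times as plain arrays, not the study's actual formats) shows how a 2-D surface strain-rate tensor can be estimated by least squares inside a receptive field and how its compressive component can be compared against an afferent's firing rate. It illustrates the analysis idea only and is not the authors' implementation.

```python
"""
Illustrative sketch, not the authors' code: estimate the surface strain-rate
tensor inside a receptive field from tracked fingerprint features, then relate
its compressive component to an afferent's firing rate.

Assumed (hypothetical) data layout:
  positions   : array (T, N, 2), feature coordinates [mm] in each video frame
  spike_times : 1-D array of afferent spike times [s]
  fps         : imaging frame rate [Hz]
"""
import numpy as np


def strain_rate_tensor(p_prev, p_next, dt):
    """Least-squares velocity gradient (2x2) from feature displacements.

    Fits u ~= G @ x (up to a rigid translation); the symmetric part of G
    is the 2-D surface strain-rate tensor.
    """
    x = p_prev - p_prev.mean(axis=0)           # centred positions (N, 2)
    u = (p_next - p_prev) / dt                 # feature velocities (N, 2)
    G, *_ = np.linalg.lstsq(x, u - u.mean(axis=0), rcond=None)
    G = G.T                                    # velocity-gradient tensor
    return 0.5 * (G + G.T)                     # symmetric (strain-rate) part


def compressive_strain_rate(positions, fps):
    """Most negative principal strain rate in each inter-frame interval."""
    dt = 1.0 / fps
    return np.array([
        np.linalg.eigvalsh(
            strain_rate_tensor(positions[t], positions[t + 1], dt)
        ).min()
        for t in range(positions.shape[0] - 1)
    ])


def firing_rate(spike_times, n_frames, fps):
    """Spike counts per inter-frame interval, expressed in Hz."""
    edges = np.arange(n_frames) / fps
    counts, _ = np.histogram(spike_times, bins=edges)
    return counts * fps


# Usage with synthetic data, only to show the intended call pattern:
rng = np.random.default_rng(0)
T, N, fps = 200, 30, 100
positions = rng.normal(size=(1, N, 2)) + 0.001 * rng.normal(size=(T, N, 2)).cumsum(axis=0)
spike_times = np.sort(rng.uniform(0, T / fps, size=150))

eps_dot = compressive_strain_rate(positions, fps)   # length T - 1
rate = firing_rate(spike_times, T, fps)             # length T - 1
print("corr(-compressive strain rate, firing rate):",
      round(float(np.corrcoef(-eps_dot, rate)[0, 1]), 2))
```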

Data availability

All the data used to create the figures in the manuscript are available for download via the following permanent Dropbox link: https://www.dropbox.com/sh/vhozyj03o401sud/AADGmJeXj4zAjL8RsSb5OInja?dl=0
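As a practical note, a Dropbox shared-folder link of this form can be fetched programmatically by changing the dl=0 query parameter to dl=1, which serves the folder as a single zip archive. The short sketch below uses the standard requests package; the local filename is arbitrary, and the snippet is not part of the published materials.

```python
# Sketch only (not part of the paper's materials): download the shared data.
# Switching the Dropbox link's dl=0 parameter to dl=1 requests the folder
# as a single zip archive; the local filename below is arbitrary.
import requests

URL = ("https://www.dropbox.com/sh/vhozyj03o401sud/"
       "AADGmJeXj4zAjL8RsSb5OInja?dl=1")

response = requests.get(URL, timeout=120)
response.raise_for_status()
with open("delhaye_2021_slip_data.zip", "wb") as f:
    f.write(response.content)
```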

Article and author information

Author details

  1. Benoit P Delhaye

    ICTEAM, Université catholique de Louvain, Louvain-la-Neuve, Belgium
    For correspondence
    delhayeben@gmail.com
    Competing interests
    The authors declare that no competing interests exist.
    ORCID iD: 0000-0003-3974-7921
  2. Ewa Jarocka

    Department of Integrative Medical Biology, Umeå University, Umeå, Sweden
    Competing interests
    The authors declare that no competing interests exist.
  3. Allan Barrea

    ICTEAM, Université catholique de Louvain, Louvain-la-Neuve, Belgium
    Competing interests
    The authors declare that no competing interests exist.
    ORCID iD: 0000-0002-1094-4596
  4. Jean-Louis Thonnard

    Institute of Neuroscience, Université catholique de Louvain, Brussels, Belgium
    Competing interests
    The authors declare that no competing interests exist.
  5. Benoni Edin

    Department of Integrative Medical Biology, Umeå University, Umeå, Sweden
    Competing interests
    The authors declare that no competing interests exist.
  6. Philippe Lefèvre

    ICTEAM; IoNS, Université catholique de Louvain, Louvain-la-Neuve, Belgium
    Competing interests
    The authors declare that no competing interests exist.
    ORCID iD: 0000-0003-2032-3635

Funding

European Space Agency

  • Jean-Louis Thonnard
  • Philippe Lefèvre

PRODEX

  • Jean-Louis Thonnard
  • Philippe Lefèvre

Swedish Research Council (VR 2016-01635)

  • Benoni Edin

Fonds de la Recherche Scientifique - FNRS

  • Benoit P Delhaye

The funders had no role in study design, data collection and interpretation, or the decision to submit the work for publication.

Ethics

Human subjects: Each subject provided written informed consent to the procedures, and the study was approved by the local ethics committee at the host institution (Institute of Neuroscience, Université catholique de Louvain, Brussels, Belgium).

Reviewing Editor

  1. Cornelius Schwarz

Publication history

  1. Received: November 6, 2020
  2. Accepted: April 13, 2021
  3. Accepted Manuscript published: April 22, 2021 (version 1)

Copyright

© 2021, Delhaye et al.

This article is distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use and redistribution provided that the original author and source are credited.

Further reading

    1. Neuroscience
    Eun Ju Shin et al.
    Research Article (updated)

    Studies in rats, monkeys, and humans have found action-value signals in multiple regions of the brain. These findings suggest that action-value signals encoded in these brain structures bias choices toward higher expected rewards. However, previous estimates of action-value signals might have been inflated by serial correlations in neural activity and also by activity related to other decision variables. Here, we applied several statistical tests based on permutation and surrogate data to analyze neural activity recorded from the striatum, frontal cortex, and hippocampus. The results show that previously identified action-value signals in these brain areas cannot be entirely accounted for by concurrent serial correlations in neural activity and action value. We also found that neural activity related to action value is intermixed with signals related to other decision variables. Our findings provide strong evidence for broadly distributed neural signals related to action value throughout the brain.
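    As a rough, generic illustration of the permutation and surrogate-data logic summarized above (a sketch under assumed data, not the code used in that study), one can regress trial-by-trial spike counts on an action-value series and compare the observed slope with a null distribution built from circularly shifted copies of the value series, which preserve its serial correlation while breaking its alignment with the neural activity.

```python
"""
Generic sketch, not the analysis code of Shin et al.: test whether a neuron's
trial-by-trial spike count is related to an action-value series while taking
serial correlation into account. The null distribution comes from circularly
shifted copies of the value series, which keep its autocorrelation but break
its alignment with the neural data.
"""
import numpy as np


def ols_slope(x, y):
    """Ordinary least-squares slope of y regressed on x."""
    xc = x - x.mean()
    return np.dot(xc, y - y.mean()) / np.dot(xc, xc)


def circular_shift_pvalue(values, spikes, n_surrogates=2000, seed=0):
    """Two-sided p-value of the observed slope against shift surrogates."""
    rng = np.random.default_rng(seed)
    observed = ols_slope(values, spikes)
    n = len(values)
    null = np.array([
        ols_slope(np.roll(values, rng.integers(1, n)), spikes)
        for _ in range(n_surrogates)
    ])
    return (np.sum(np.abs(null) >= np.abs(observed)) + 1) / (n_surrogates + 1)


# Toy usage with an autocorrelated (AR(1)) value series and Poisson spiking:
rng = np.random.default_rng(1)
n_trials = 300
value = np.zeros(n_trials)
for t in range(1, n_trials):
    value[t] = 0.9 * value[t - 1] + rng.normal(scale=0.1)
spikes = rng.poisson(lam=np.clip(5.0 + 2.0 * value, 0.1, None))
print("p =", circular_shift_pvalue(value, spikes))
```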

    2. Neuroscience
    Gonçalo Lopes et al.
    Research Article (updated)

    Real-time rendering of closed-loop visual environments is important for next-generation understanding of brain function and behaviour, but is often prohibitively difficult for non-experts to implement and is limited to few laboratories worldwide. We developed BonVision as an easy-to-use open-source software for the display of virtual or augmented reality, as well as standard visual stimuli. BonVision has been tested on humans and mice, and is capable of supporting new experimental designs in other animal models of vision. As the architecture is based on the open-source Bonsai graphical programming language, BonVision benefits from native integration with experimental hardware. BonVision therefore enables easy implementation of closed-loop experiments, including real-time interaction with deep neural networks, and communication with behavioural and physiological measurement and manipulation devices.