Active tactile discrimination is coupled with and modulated by the cardiac cycle
Abstract
Perception and cognition are modulated by the phase of the cardiac cycle in which stimuli are presented. This has been shown by locking the presentation of stimuli to distinct cardiac phases. However, in everyday life sensory information is not presented in this passive, phase-locked manner; instead, we actively move and control our sensors to perceive the world. Whether active sensing is coupled to and modulated by the cardiac cycle remains largely unknown. Here we recorded the electrocardiograms of human participants while they actively performed a tactile grating orientation task. We show that the duration of participants' touches varied as a function of the cardiac phase in which they were initiated. Touches initiated in the systole phase were held for longer periods of time than touches initiated in the diastole phase. This effect was most pronounced when participants elongated their touches to sense the most difficult gratings. Conversely, while touches in the control condition were also coupled to the cardiac cycle, their duration did not vary as a function of the phase in which they were initiated. Our results reveal that we actively spend more time sensing during systole, the cardiac phase associated with lower perceptual sensitivity (vs. diastole). In line with interoceptive inference accounts, these results indicate that we actively adjust the acquisition of sense data to our internal bodily cycles.
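The core analysis described above — assigning each touch onset to a cardiac phase relative to the preceding ECG R-peak, then comparing touch durations between phases — can be sketched as follows. This is an illustrative sketch, not the authors' code: the function names, the fixed 0.3 s systole window, and the toy timestamps are all assumptions for demonstration.

```python
import numpy as np

def phase_of_onsets(r_peaks, onsets, systole_s=0.3):
    """Label each touch onset 'systole' if it falls within `systole_s`
    seconds after the preceding R-peak, else 'diastole'.
    All timestamps in seconds; onsets are assumed to occur after the
    first R-peak. A fixed-length systole window is a simplification."""
    r_peaks = np.asarray(r_peaks)
    onsets = np.asarray(onsets)
    # Index of the R-peak immediately preceding each onset
    idx = np.searchsorted(r_peaks, onsets, side="right") - 1
    lag = onsets - r_peaks[idx]
    return np.where(lag <= systole_s, "systole", "diastole")

def mean_duration_by_phase(r_peaks, onsets, durations):
    """Mean touch duration for systole- vs. diastole-initiated touches."""
    labels = phase_of_onsets(r_peaks, onsets)
    durations = np.asarray(durations)
    return {ph: durations[labels == ph].mean()
            for ph in ("systole", "diastole")}

# Toy example: regular R-peaks every 0.8 s, two touches
r = np.arange(0.0, 10.0, 0.8)
onsets = [1.0, 2.0]   # 0.2 s and 0.4 s after their preceding R-peaks
durs = [1.2, 0.9]
print(mean_duration_by_phase(r, onsets, durs))
```

In practice, R-peak times would come from an ECG peak-detection pipeline, and the systole window would typically be defined per heartbeat (e.g., from the T-wave) rather than as a fixed constant.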
Data availability
All data generated or analysed during this study are included in the manuscript and supporting files; Source Data files for all figures have been provided in an OSF repository: https://osf.io/d7x3g/
Article and author information
Author details
Funding
Leverhulme Trust (RPG-2016-120)
- James Kilner
Autonomous Community of the Balearic Islands, Postdoctoral Grant Margalida Comas (PD/036/2019)
- Alejandro Galvez-Pol
The funders had no role in study design, data collection and interpretation, or the decision to submit the work for publication.
Ethics
Human subjects: All participants volunteered to take part in the experiment, gave informed consent, and were reimbursed for their participation. Ethical approval for the study was obtained from the University College London research ethics committee (ID 10857/002).
Copyright
© 2022, Galvez-Pol et al.
This article is distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use and redistribution provided that the original author and source are credited.
Metrics
- 1,924 views
- 383 downloads
- 30 citations
Views, downloads and citations are aggregated across all versions of this paper published by eLife.
Further reading
- Neuroscience
Most visual tasks involve looking for specific object features. But we also often perform property-based tasks, where we look for a specific property in an image, such as finding an odd item, deciding whether two items are the same, or whether an object has symmetry. How do we solve such tasks? These tasks do not fit into standard models of decision making because their underlying feature space and decision process are unclear. Using well-known principles governing multiple object representations, we show that displays with repeating elements can be distinguished from heterogeneous displays using a property we define as visual homogeneity. In behavior, visual homogeneity predicted response times on visual search, same-different, and symmetry tasks. Brain imaging during visual search and symmetry tasks revealed that visual homogeneity was localized to a region in the object-selective cortex. Thus, property-based visual tasks are solved in a localized region in the brain by computing visual homogeneity.
- Neuroscience
Electrophysiology has proven invaluable for recording neural activity, and the development of Neuropixels probes dramatically increased the number of recorded neurons. These probes are often implanted acutely, but acute recordings cannot be performed in freely moving animals, and the recorded neurons cannot be tracked across days. To study key behaviors such as navigation, learning, and memory formation, the probes must be implanted chronically. An ideal chronic implant should (1) allow stable recordings of neurons for weeks; (2) allow reuse of the probes after explantation; (3) be light enough for use in mice. Here, we present the ‘Apollo Implant’, an open-source and editable device that meets these criteria and accommodates up to two Neuropixels 1.0 or 2.0 probes. The implant comprises a ‘payload’ module, which is attached to the probe and is recoverable, and a ‘docking’ module, which is cemented to the skull. The design is adjustable, making it easy to change the distance between probes, the angle of insertion, and the depth of insertion. We tested the implant across eight labs in head-fixed mice, freely moving mice, and freely moving rats. The number of neurons recorded across days was stable, even after repeated implantations of the same probe. The Apollo implant provides an inexpensive, lightweight, and flexible solution for reusable chronic Neuropixels recordings.