Interval timing is a fundamental component of action, and is susceptible to motor-related temporal distortions. Previous studies have shown that concurrent movement biases temporal estimates, but have primarily considered self-modulated movement. However, in real-world settings, movement is often restricted or perturbed by environmental factors. In the following experiments, we introduced viscous movement environments to externally modulate movement and investigated the resulting effects on temporal perception. In two separate tasks, participants timed auditory intervals while moving a robotic arm that randomly applied four levels of viscosity. Results demonstrated that higher viscosity led to shorter perceived durations. Using a drift-diffusion model and a Bayesian observer model, we confirmed that these biasing effects arose from perceptual mechanisms rather than from biases in decision making. These findings suggest that environmental perturbations are an important factor in movement-related temporal distortions, and enhance the current understanding of the interactions between motor activity and cognitive processes.
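The drift-diffusion account mentioned above can be illustrated with a minimal simulation. This is a sketch under stated assumptions, not the authors' fitted model: the drift rates and noise level are hypothetical, and the mapping "higher viscosity, lower drift" is assumed for illustration. In this framing, evidence accumulates noisily over a fixed interval, and the final accumulator value serves as a proxy for perceived duration, so a lower drift rate yields shorter duration estimates.

```python
import numpy as np

rng = np.random.default_rng(0)

def perceived_duration(drift, duration=1.0, dt=0.001, noise=0.1, n_trials=2000):
    """Accumulate noisy evidence at rate `drift` over a fixed interval.

    Returns the mean final accumulator value across trials, used here
    as a proxy for perceived duration (expected value: drift * duration).
    """
    n_steps = int(duration / dt)
    # Euler–Maruyama increments: deterministic drift plus Gaussian noise
    increments = drift * dt + noise * np.sqrt(dt) * rng.standard_normal((n_trials, n_steps))
    return increments.sum(axis=1).mean()

# Hypothetical drift rates (assumption: higher viscosity lowers drift)
for label, drift in [("low viscosity ", 1.0), ("high viscosity", 0.8)]:
    print(label, round(perceived_duration(drift), 3))
```

Averaged over trials, the accumulator's final value converges to drift × duration, so the high-viscosity condition in this sketch produces systematically smaller values, mirroring the reported compression of perceived time.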
All source data have been deposited in Dryad: Slowing the Body Slows Down Time (Perception). Dryad Digital Repository. doi:10.25338/B8S913.
- Martin Wiener
The funders had no role in study design, data collection and interpretation, or the decision to submit the work for publication.
Human subjects: Informed consent was obtained from all subjects. All protocols were approved by the Institutional Review Board at the University of California, Davis (IRB Protocol #1336438-6).
- Hugo Merchant, National Autonomous University of Mexico, Mexico
© 2021, De Kock et al.
This article is distributed under the terms of the Creative Commons Attribution License, permitting unrestricted use and redistribution provided that the original author and source are credited.
Neural activity in the mammalian cortex has been studied extensively during decision tasks, and recent work aims to identify under what conditions cortex is actually necessary for these tasks. We discovered that mice with distinct cognitive experiences, beyond sensory and motor learning, use different cortical areas and neural activity patterns to solve the same navigation decision task, revealing past learning as a critical determinant of whether cortex is necessary for goal-directed navigation. We used optogenetics and calcium imaging to study the necessity and neural activity of multiple cortical areas in mice with different training histories. Posterior parietal cortex and retrosplenial cortex were mostly dispensable for accurate performance of a simple navigation task. In contrast, these areas were essential for the same simple task when mice were previously trained on complex tasks with delay periods or association switches. Multiarea calcium imaging showed that, in mice with complex-task experience, single-neuron activity had higher selectivity and neuron–neuron correlations were weaker, leading to codes with higher task information. Therefore, past experience is a key factor in determining whether cortical areas have a causal role in goal-directed navigation.
During flight maneuvers, insects exhibit compensatory head movements, which are essential for stabilizing the visual field on their retina, reducing motion blur, and supporting visual self-motion estimation. In Diptera, such head movements are mediated via visual feedback from their compound eyes that detect retinal slip, as well as rapid mechanosensory feedback from their halteres – the modified hindwings that sense the angular rates of body rotations. Because non-Dipteran insects lack halteres, it is not known whether mechanosensory feedback about body rotations plays any role in their head stabilization response. Diverse non-Dipteran insects are known to rely on visual and antennal mechanosensory feedback for flight control. In hawkmoths, for instance, reduction of antennal mechanosensory feedback severely compromises their ability to control flight. Similarly, when the head movements of freely flying moths are restricted, their flight ability is also severely impaired. The role of compensatory head movements as well as multimodal feedback in insect flight raises an interesting question: in insects that lack halteres, what sensory cues are required for head stabilization? Here, we show that in the nocturnal hawkmoth Daphnis nerii, compensatory head movements are mediated by combined visual and antennal mechanosensory feedback. We subjected tethered moths to open-loop body roll rotations under different lighting conditions, and measured their ability to maintain head angle in the presence or absence of antennal mechanosensory feedback. Our study suggests that head stabilization in moths is mediated primarily by visual feedback during roll movements at lower frequencies, whereas antennal mechanosensory feedback is required when roll occurs at higher frequencies. These findings are consistent with the hypothesis that control of head angle results from a multimodal feedback loop that integrates both visual and antennal mechanosensory feedback, albeit at different latencies.
At adequate light levels, visual feedback is sufficient for head stabilization primarily at low frequencies of body roll. However, under dark conditions, antennal mechanosensory feedback is essential for the control of head movements at high frequencies of body roll.