Integration of locomotion and auditory signals in the mouse inferior colliculus
Abstract
The inferior colliculus (IC) is the major midbrain auditory integration center, where virtually all ascending auditory inputs converge. Although the IC has been extensively studied for sound processing, little is known about IC neural activity in moving subjects, a condition that frequently arises in natural hearing. Here, by recording neural activity in walking mice, we show that the activity of IC neurons is strongly modulated by locomotion, even in the absence of sound stimuli. Similar modulation was also found in hearing-impaired mice, demonstrating that IC neurons receive non-auditory, locomotion-related neural signals. Sound-evoked activity was attenuated during locomotion, and this attenuation increased frequency selectivity across the neuronal population while maintaining preferred frequencies. Our results suggest that during behavior, integrating movement-related and auditory information is an essential aspect of sound processing in the IC.
Data availability
All data generated or analysed during this study are included in the manuscript and supporting files. Source data files have been provided for Figures 1 through 4.
Article and author information
Author details
Funding
Institute for Basic Science (IBS-R015-D1)
- Gunsoo Kim
Institute for Basic Science (IBS-R015-D1)
- Joonyeol Lee
The funders had no role in study design, data collection and interpretation, or the decision to submit the work for publication.
Ethics
Animal experimentation: This study was performed in strict accordance with the recommendations in the Guide for the Care and Use of Laboratory Animals of the National Institutes of Health. All of the animals were handled according to the protocol (SKKUIACUC2018-02-09-1) approved by the institutional animal care and use committee (IACUC) of Sungkyunkwan University. Surgeries were performed under isoflurane or ketamine/xylazine anesthesia, and every effort was made to minimize suffering.
Copyright
© 2020, Yang et al.
This article is distributed under the terms of the Creative Commons Attribution License permitting unrestricted use and redistribution provided that the original author and source are credited.
Metrics
- 3,771 views
- 519 downloads
- 47 citations
Views, downloads and citations are aggregated across all versions of this paper published by eLife.
Further reading
- Neuroscience
When observing others’ behaviors, we continuously integrate their movements with the corresponding sounds to enhance perception and develop adaptive responses. However, how the human brain integrates these complex audiovisual cues based on their natural temporal correspondence remains unclear. Using electroencephalogram (EEG), we demonstrated that rhythmic cortical activity tracked the hierarchical rhythmic structures in audiovisually congruent human walking movements and footstep sounds. Remarkably, the cortical tracking effects exhibit distinct multisensory integration modes at two temporal scales: an additive mode in a lower-order, narrower temporal integration window (step cycle) and a super-additive enhancement in a higher-order, broader temporal window (gait cycle). Furthermore, while neural responses at the lower-order timescale reflect a domain-general audiovisual integration process, cortical tracking at the higher-order timescale is exclusively engaged in the integration of biological motion cues. In addition, only this higher-order, domain-specific cortical tracking effect correlates with individuals’ autistic traits, highlighting its potential as a neural marker for autism spectrum disorder. These findings unveil the multifaceted mechanism whereby rhythmic cortical activity supports the multisensory integration of human motion, shedding light on how neural coding of hierarchical temporal structures orchestrates the processing of complex, natural stimuli across multiple timescales.
- Neuroscience
Early-life stress can have lifelong consequences, enhancing stress susceptibility and resulting in behavioural and cognitive deficits. While the effects of early-life stress on neuronal function have been well described, we still know very little about the contribution of non-neuronal brain cells. Investigating the complex interactions between distinct brain cell types is critical to fully understand how cellular changes manifest as behavioural deficits following early-life stress. Here, using male and female mice, we report that early-life stress induces anxiety-like behaviour and fear generalisation in an amygdala-dependent learning and memory task. These behavioural changes were associated with impaired synaptic plasticity, increased neural excitability, and astrocyte hypofunction. Genetic perturbation of amygdala astrocyte function, by reducing either astrocyte calcium activity or astrocyte network function, was sufficient to replicate the cellular, synaptic, and fear memory generalisation phenotypes associated with early-life stress. Our data reveal a role of astrocytes in tuning emotionally salient memory and provide mechanistic links between early-life stress, astrocyte hypofunction, and behavioural deficits.