Functional brain alterations following mild-to-moderate sensorineural hearing loss in children
Abstract
Auditory deprivation in the form of deafness during development leads to lasting changes in central auditory system function. However, less is known about the effects of mild-to-moderate sensorineural hearing loss (MMHL) during development. Here, we used a longitudinal design to examine late auditory evoked responses and mismatch responses to nonspeech and speech sounds in children with MMHL. At Time 1, younger children with MMHL (8-12 years; n = 23) showed age-appropriate mismatch negativities (MMNs) to sounds, but older children (12-16 years; n = 23) did not. Six years later, we re-tested a subset of the younger (now older) children with MMHL (n = 13). Children who had shown significant MMNs at Time 1 showed MMNs that were reduced and, for nonspeech, absent at Time 2. Our findings demonstrate that even a mild-to-moderate hearing loss during early-to-mid childhood can lead to changes in the neural processing of sounds in late childhood/adolescence.
Data availability
De-identified data, stimuli, and statistical analysis scripts are available at https://github.com/acalcus/MMHL.git
Article and author information
Funding
H2020 Marie Skłodowska-Curie Actions (FP7-607139)
- Axelle Calcus
ESRC National Centre for Research Methods, University of Southampton (RES-061-25-0440)
- Lorna F Halliday
The funders had no role in study design, data collection and interpretation, or the decision to submit the work for publication.
Ethics
Human subjects: Informed consent and consent to publish were obtained from the parents/guardians of the children included in this study. Ethical approval for this study was provided by the UCL Research Ethics Committee (Project ID number: 2109/004).
Copyright
© 2019, Calcus et al.
This article is distributed under the terms of the Creative Commons Attribution License permitting unrestricted use and redistribution provided that the original author and source are credited.
Metrics
- 3,824 views
- 428 downloads
- 14 citations
Views, downloads and citations are aggregated across all versions of this paper published by eLife.
Further reading
- Neuroscience
Sensory signals from the body’s visceral organs (e.g. the heart) can robustly influence the perception of exteroceptive sensations. This interoceptive–exteroceptive interaction has been argued to underlie self-awareness by situating one’s perceptual awareness of exteroceptive stimuli in the context of one’s internal state, but studies probing cardiac influences on visual awareness have yielded conflicting findings. In this study, we presented separate grating stimuli to each of subjects’ eyes as in a classic binocular rivalry paradigm – measuring the duration for which each stimulus dominates in perception. However, we caused the gratings to ‘pulse’ at specific times relative to subjects’ real-time electrocardiogram, manipulating whether pulses occurred during cardiac systole, when baroreceptors signal to the brain that the heart has contracted, or in diastole when baroreceptors are silent. The influential ‘Baroreceptor Hypothesis’ predicts the effect of baroreceptive input on visual perception should be uniformly suppressive. In contrast, we observed that dominance durations increased for systole-entrained stimuli, inconsistent with the Baroreceptor Hypothesis. Furthermore, we show that this cardiac-dependent rivalry effect is preserved in subjects who are at-chance discriminating between systole-entrained and diastole-presented stimuli in a separate interoceptive awareness task, suggesting that our results are not dependent on conscious access to heartbeat sensations.
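The cardiac-gating manipulation described in this abstract can be illustrated with a short sketch. The snippet below is a conceptual illustration only, not the study's code: it detects R-peaks in a synthetic ECG trace with a simple threshold rule and schedules stimulus pulses at an assumed latency after each peak so that they fall in systole or diastole. The sampling rate, latencies, detection rule, and function names are all assumptions for illustration.

```python
# Conceptual sketch of cardiac-entrained stimulus timing (not the study's code).
import numpy as np

FS = 1000                 # ECG sampling rate in Hz (assumed)
SYSTOLE_DELAY_MS = 250    # assumed latency after the R-peak falling within systole
DIASTOLE_DELAY_MS = 550   # assumed latency falling later in the cycle, in diastole

def detect_r_peaks(ecg, fs=FS, refractory_s=0.3):
    """Very simple R-peak detector: threshold crossings separated by a refractory period."""
    thr = ecg.mean() + 2.5 * ecg.std()
    above = np.where(ecg > thr)[0]
    peaks, last = [], -np.inf
    for i in above:
        if i - last > refractory_s * fs:
            peaks.append(i)
            last = i
    return np.array(peaks)

def pulse_onsets(r_peaks, condition, fs=FS):
    """Sample indices at which to pulse the grating for a given cardiac condition."""
    delay_ms = SYSTOLE_DELAY_MS if condition == "systole" else DIASTOLE_DELAY_MS
    return r_peaks + int(delay_ms * fs / 1000)

# Synthetic ECG-like trace: 10 s of low-amplitude noise with a "beat" every ~0.8 s.
t = np.arange(0, 10, 1 / FS)
ecg = 0.02 * np.random.default_rng(1).standard_normal(t.size)
ecg[np.arange(0, t.size, int(0.8 * FS))] += 1.0
r = detect_r_peaks(ecg)
print("systole pulse onsets (samples):", pulse_onsets(r, "systole")[:5])
```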
- Neuroscience
Decoding the activity of individual neural cells during natural behaviours allows neuroscientists to study how the nervous system generates and controls movements. Contrary to other neural cells, the activity of spinal motor neurons can be determined non-invasively (or minimally invasively) from the decomposition of electromyographic (EMG) signals into motor unit firing activities. For some interfacing and neuro-feedback investigations, EMG decomposition needs to be performed in real time. Here, we introduce an open-source software that performs real-time decoding of motor neurons using a blind-source separation approach for multichannel EMG signal processing. Separation vectors (motor unit filters) are optimised for each motor unit from baseline contractions and then re-applied in real time during test contractions. In this way, the firing activity of multiple motor neurons can be provided through different forms of visual feedback. We provide a complete framework with guidelines and examples of recordings to guide researchers who aim to study movement control at the motor neuron level. We first validated the software with synthetic EMG signals generated during a range of isometric contraction patterns. We then tested the software on data collected using either surface or intramuscular electrode arrays from five lower limb muscles (gastrocnemius lateralis and medialis, vastus lateralis and medialis, and tibialis anterior). We assessed how the muscle or variation of contraction intensity between the baseline contraction and the test contraction impacted the accuracy of the real-time decomposition. This open-source software provides a set of tools for neuroscientists to design experimental paradigms where participants can receive real-time feedback on the output of the spinal cord circuits.
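The real-time decomposition step described in this abstract can be sketched in a few lines. The code below is a rough illustration under stated assumptions, not the authors' open-source software: it extends a multichannel EMG window with delayed copies, projects it onto precomputed separation vectors (motor unit filters), and thresholds the peak-normalised sources to report candidate firing samples. The extension factor, threshold, and function names are illustrative choices.

```python
# Minimal sketch of real-time motor unit decoding via precomputed separation
# vectors (illustrative only; not the software described in the abstract).
import numpy as np

def extend(window, ext=8):
    # Stack each channel with `ext` delayed copies, a standard step in
    # convolutive blind-source separation of EMG. Shape: (n_ch * ext, n_samples).
    return np.vstack([np.roll(window, d, axis=1) for d in range(ext)])

def decode_firings(window, sep_vectors, threshold=0.5):
    # Project the extended window onto each motor unit filter and report
    # samples whose squared, peak-normalised source exceeds the threshold.
    x = extend(window)
    firings = {}
    for unit, w in enumerate(sep_vectors):  # each w learned offline from a baseline contraction
        source = (w @ x) ** 2
        source /= source.max() + 1e-12
        firings[unit] = np.where(source > threshold)[0]
    return firings

# Toy example: 64 channels, a 100 ms window at 2048 Hz, 3 hypothetical motor units.
rng = np.random.default_rng(0)
emg_window = rng.standard_normal((64, 205))
unit_filters = rng.standard_normal((3, 64 * 8))
print({unit: idx.size for unit, idx in decode_firings(emg_window, unit_filters).items()})
```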