During speech perception, when auditory speech is uninformative, the frontal cortex enhances responses in visual regions that represent the talker's mouth, improving perception.
Vocabulary encoded during sleep influences decision-making in the waking state 36 hr later, particularly when word presentation is timed to slow-wave troughs, suggesting that unconscious episodic memory formation during deep sleep is possible.
Human perception and brain responses differ between words in which mouth movements are visible before the voice is heard and words in which the reverse is true.
The perception of ambiguous steps in relative tone height is predicted by direction-selective cells in the auditory cortex rather than by the brain's representation of the distance between the tone heights.
Distinct frequency-based interactions between cortical and subcortical regions highlight the cerebellum’s crucial role in speech production and perception.
An oscillating computational model combined with a predictive internal linguistic model can track naturally timed speech in which pseudo-rhythmicity is related to the predictability of words within their sentence context.
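The sketch below is a minimal, hypothetical illustration of the idea in the entry above, not the authors' actual model: a fixed-frequency oscillator whose sensitivity waxes and wanes over time is paired with a "linguistic prediction" bonus that lets predictable words be tracked even when they arrive earlier than a strictly rhythmic onset would. The sampling rate, predictability values, timing-shift rule, and thresholds are all assumptions chosen only to make the mechanism visible.

```python
# Hypothetical sketch: oscillator-based tracking of pseudo-rhythmic word onsets,
# where word predictability both shifts onsets away from strict isochrony and
# adds a prediction bonus that compensates for the resulting loss of oscillator gain.
# All parameters are illustrative assumptions, not values from the cited study.

import numpy as np

RATE_HZ = 5.0            # assumed theta-range speech rate (5 words per second)
PERIOD = 1.0 / RATE_HZ   # nominal inter-word interval


def oscillator_gain(t):
    """Sensitivity of the tracking oscillator at time t (peaks once per period)."""
    return 0.5 * (1.0 + np.cos(2.0 * np.pi * RATE_HZ * t))


def word_onset(i, predictability, max_shift=0.06):
    """Pseudo-rhythmic onsets: more predictable words are assumed to come earlier."""
    return i * PERIOD - max_shift * predictability


def tracked(t_onset, predictability, threshold=0.6, prediction_weight=0.4):
    """A word is 'tracked' if oscillator gain plus a prediction bonus clears threshold."""
    gain = oscillator_gain(t_onset)
    return gain >= threshold, gain + prediction_weight * predictability >= threshold


if __name__ == "__main__":
    predictabilities = [0.1, 0.9, 0.3, 0.8, 0.5]   # toy per-word predictability
    for i, p in enumerate(predictabilities):
        t = word_onset(i, p)
        no_pred, with_pred = tracked(t, p)
        print(f"word {i}: onset {t:.3f}s, predictability {p:.1f}, "
              f"tracked without prediction={no_pred}, with prediction={with_pred}")
```

Running this toy script shows that highly predictable (and therefore early) words fall outside the oscillator's high-sensitivity window yet are still tracked once the prediction term is added, which is the qualitative relationship the summary describes.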
The juvenile barn owl displays wide heterogeneity in tuning properties in the auditory midbrain; this heterogeneity is shaped during development to match the sensory statistics experienced early in life and is maintained into adulthood.