The different laminar profiles observed across the cortical depth for multisensory and attentional influences indicate partly distinct neural circuits for controlling information flow.
Human perception and brain responses differ between words in which mouth movements are visible before the voice is heard and words in which the reverse is true.
During speech perception, when auditory speech is uninformative, the frontal cortex enhances responses in visual regions representing the talker's mouth, thereby improving perception.
The comprehension of acoustic and visual speech depends on modality-specific pathways in the brain, which explains why auditory speech ability and lip-reading skill are not correlated in typical adults.
Preverbal infants demonstrate an implicit sensitivity to interoceptive sensations, one that fluctuates spontaneously during emotional processing and guides their audiovisual preferences in the environment.
For perceptual inference, human observers do not estimate sensory uncertainty instantaneously from the current sensory signals alone, but combine past and current sensory inputs, consistent with a Bayesian learner.
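The contrast between instantaneous and history-dependent uncertainty estimation can be sketched computationally. The toy model below is a hypothetical illustration, not the authors' actual model: it tracks sensory variance by exponentially blending past estimates with the current squared deviation, so the uncertainty estimate reflects both past and current inputs, as a Bayesian learner would.

```python
import random

def estimate_uncertainty(signals, weight=0.9):
    """Track sensory uncertainty (variance) by combining the past
    estimate with the current squared deviation, rather than using
    the current signal alone. 'weight' (hypothetical parameter)
    controls how strongly past inputs shape the estimate."""
    mean, var = signals[0], 1.0
    history = []
    for x in signals[1:]:
        instant = (x - mean) ** 2                    # uncertainty from current signal alone
        var = weight * var + (1 - weight) * instant  # blend with past estimate
        mean = weight * mean + (1 - weight) * x      # running estimate of the signal
        history.append(var)
    return history

random.seed(0)
low_noise = [random.gauss(0, 0.1) for _ in range(200)]   # reliable sensory channel
high_noise = [random.gauss(0, 1.0) for _ in range(200)]  # unreliable sensory channel
# The learner's uncertainty estimate ends up lower for the reliable channel.
print(estimate_uncertainty(low_noise)[-1] < estimate_uncertainty(high_noise)[-1])
```

Because the estimate is a weighted average over history, a single noisy sample cannot swing it abruptly, mirroring the observers' gradual, experience-dependent uncertainty estimates.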
Keeping flexible, adaptable representations of speech categories at multiple time scales allows the brain to maintain stable perception in the face of varying speech sound characteristics.
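One way to picture representations at different time scales is with two trackers of a category boundary that update at different rates. This is a minimal hypothetical sketch (the learning rates and signals are invented): the fast tracker absorbs short-term variation while the slow tracker preserves a stable long-term representation.

```python
def adapt_boundary(samples, fast=0.5, slow=0.02):
    """Track a speech-category boundary at two time scales:
    a fast tracker follows short-term variation in the input,
    while a slow tracker changes only with sustained shifts.
    (Hypothetical learning rates for illustration.)"""
    fast_est = slow_est = samples[0]
    for x in samples[1:]:
        fast_est += fast * (x - fast_est)  # adapts within a few samples
        slow_est += slow * (x - slow_est)  # drifts only with sustained change
    return fast_est, slow_est

# A brief shift in speech-sound statistics moves the fast tracker,
# but the slow representation stays near its long-term value,
# keeping category perception stable.
baseline = [0.0] * 100   # long run of typical speech sounds
shifted = [1.0] * 5      # short burst of atypical sounds
f, s = adapt_boundary(baseline + shifted)
print(f > 0.9, s < 0.2)  # fast tracker moved, slow tracker barely did
```

The combination gives both flexibility (the fast component) and stability (the slow component), matching the idea that adaptation at several time scales underpins robust speech perception.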