27 results found
    Neuroscience

    Lip movements entrain the observers’ low-frequency brain oscillations to facilitate speech intelligibility

    Hyojin Park et al.
    Rhythms of lip movements entrain low-frequency brain oscillations and facilitate audiovisual speech processing.
    Neuroscience

    Resolving multisensory and attentional influences across cortical depth in sensory cortices

    Remi Gau et al.
    The different laminar profiles observed across the cortical depth for multisensory and attentional influences indicate partly distinct neural circuitries of information-flow control.
    Neuroscience

    The visual speech head start improves perception and reduces superior temporal cortex responses to auditory speech

    Patrick J Karas et al.
    Human perception and brain responses differ between words in which mouth movements are visible before the voice is heard and words in which the reverse is true.
    Neuroscience

    Frontal cortex selects representations of the talker’s mouth to aid in speech perception

    Muge Ozker et al.
    During speech perception, if auditory speech is uninformative, the frontal cortex enhances responses in visual regions that represent the talker's mouth, improving perception.
    Computational and Systems Biology
    Neuroscience

    Shared and modality-specific brain regions that mediate auditory and visual word comprehension

    Anne Keitel et al.
    The comprehension of acoustic and visual speech depends on modality-specific pathways in the brain, which explains why auditory speech abilities and lip reading are not associated in typical adults.
    Neuroscience

    Neurobehavioral evidence of interoceptive sensitivity in early infancy

    Lara Maister et al.
    Pre-verbal infants demonstrate an implicit sensitivity to interoceptive sensations, which fluctuates spontaneously during emotional processing and guides audiovisual preferences in the environment.
    Neuroscience

    Using the past to estimate sensory uncertainty

    Ulrik Beierholm et al.
    For perceptual inference, human observers estimate sensory uncertainty not from the current sensory signals alone, but by combining past and current sensory inputs, consistent with a Bayesian learner.
    Neuroscience

    CAPS-1 promotes fusion competence of stationary dense-core vesicles in presynaptic terminals of mammalian neurons

    Margherita Farina et al.
    Stationary dense-core vesicles depend on the CAPS-1 protein to fuse with the presynaptic membrane.
    Neuroscience

    Contributions of local speech encoding and functional connectivity to audio-visual speech perception

    Bruno L Giordano et al.
    Seeing a speaker's face aids comprehension by facilitating functional connectivity between the temporal and frontal lobes.
    Neuroscience

    Integrating prediction errors at two time scales permits rapid recalibration of speech sound categories

    Itsaso Olasagasti, Anne-Lise Giraud
    Maintaining flexible, adaptable representations of speech categories at different time scales allows the brain to keep perception stable in the face of varying speech sound characteristics.
