17 results found
    Neuroscience

    Frontal cortex selects representations of the talker’s mouth to aid in speech perception

    Muge Ozker et al.
    During speech perception, if auditory speech is not informative, the frontal cortex enhances responses in visual regions that represent the talker's mouth to improve perception.
    Neuroscience

    Language: I see what you are saying

    Gregory B Cogan
    Insight
    Neuroscience

    Push-pull competition between bottom-up and top-down auditory attention to natural soundscapes

    Nicholas Huang, Mounya Elhilali
    Everyday soundscapes dynamically direct attention towards target sounds or salient ambient events, with both forms of attention engaging the same fronto-parietal network in a push-pull competition for limited neural resources.
    Neuroscience

    Altered functional connectivity during speech perception in congenital amusia

    Kyle Jasmin et al.
    Individuals with amusia, who have unreliable pitch processing, show decreased functional connectivity between right auditory and left language-related cortex during speech perception, demonstrating a neural basis for compensatory dimensional weighting.
    Neuroscience

    Integrating prediction errors at two time scales permits rapid recalibration of speech sound categories

    Itsaso Olasagasti, Anne-Lise Giraud
    Keeping flexible, adaptable representations of speech categories at different time scales allows the brain to maintain stable perception in the face of varying speech sound characteristics.
    Neuroscience

    Resolving multisensory and attentional influences across cortical depth in sensory cortices

    Remi Gau et al.
    The different laminar profiles observed across the cortical depth for multisensory and attentional influences indicate partly distinct neural circuitries of information-flow control.
    Neuroscience

    Neuronal populations in the occipital cortex of the blind synchronize to the temporal dynamics of speech

    Markus Johannes Van Ackeren et al.
    Blind people re-purpose the brain's visual areas for tracking speech rhythm.
    Neuroscience

    Hyperalignment: Modeling shared information encoded in idiosyncratic cortical topographies

    James V Haxby et al.
    Hyperalignment provides a conceptual framework for cortical architecture that captures how shared information is encoded in idiosyncratic cortical topographies that preserve vector geometry for population response and connectivity patterns.
    Neuroscience

    Neural ensemble dynamics in dorsal motor cortex during speech in people with paralysis

    Sergey D Stavisky et al.
    Neurons in human dorsal motor cortex, an area involved in controlling arm and hand movements, are also active – and show similar ensemble dynamics – during speaking.
    Computational and Systems Biology
    Neuroscience

    Shared and modality-specific brain regions that mediate auditory and visual word comprehension

    Anne Keitel et al.
    The comprehension of acoustic and visual speech depends on modality-specific pathways in the brain, which explains why auditory speech abilities and lip reading are not associated in typical adults.
