2,587 results found
    1. Neuroscience

    Contributions of local speech encoding and functional connectivity to audio-visual speech perception

    Bruno L Giordano et al.
    Seeing a speaker's face aids comprehension by facilitating functional connectivity between the temporal and frontal lobes.
    1. Computational and Systems Biology
    2. Neuroscience

    Shared and modality-specific brain regions that mediate auditory and visual word comprehension

    Anne Keitel et al.
    The comprehension of acoustic and visual speech depends on modality-specific pathways in the brain, which explains why auditory speech abilities and lip reading are not associated in typical adults.
    1. Neuroscience

    The visual speech head start improves perception and reduces superior temporal cortex responses to auditory speech

    Patrick J Karas et al.
    Human perception and brain responses differ between words in which mouth movements are visible before the voice is heard and words for which the reverse is true.
    1. Neuroscience

    Effects of visual inputs on neural dynamics for coding of location and running speed in medial entorhinal cortex

    Holger Dannenberg et al.
    The spatial accuracy of grid cell firing correlates with the slope of the relationship between local field potential theta frequency and running speed, and integrates velocity signals over past time.
    1. Neuroscience

    Learning speed and detection sensitivity controlled by distinct cortico-fugal neurons in visual cortex

    Sarah Ruediger, Massimo Scanziani
    Neural pathways from mammalian visual cortex to ancestral brain structures control learning speed and performance of a simple detection task.
    1. Neuroscience

    Frontal cortex selects representations of the talker’s mouth to aid in speech perception

    Muge Ozker et al.
    During speech perception, if auditory speech is not informative, the frontal cortex enhances responses in visual regions that represent the talker's mouth to improve perception.
    1. Neuroscience

    Neuronal populations in the occipital cortex of the blind synchronize to the temporal dynamics of speech

    Markus Johannes Van Ackeren et al.
    Blind people re-purpose the brain's visual areas for tracking speech rhythm.
    1. Computational and Systems Biology
    2. Ecology

    TRex, a fast multi-animal tracking system with markerless identification, and 2D estimation of posture and visual fields

    Tristan Walter, Iain D Couzin
    Fast, memory-efficient, easy to use, and equipped with powerful integrated tools, TRex lowers barriers of entry to, and enables more ambitious approaches to, the quantitative study of behavior.
    1. Neuroscience

    Foggy perception slows us down

    Paolo Pretto et al.
    Virtual reality experiments show that motorists slow down when driving in fog, but they speed up when visibility is reduced equally at all distances.
    1. Neuroscience

    Songbirds can learn flexible contextual control over syllable sequencing

    Lena Veit et al.
    Songbirds can use arbitrary visual cues to immediately, flexibly, and adaptively control the syntax of learned song vocalizations in a manner that parallels human cognitive control over syllable sequencing in speech.
