A series of experiments suggests that the fronto-parietal attentional network is involved in controlling eye-based attention, with the frontal eye field (FEF) playing a crucial causal role in generating the attention-induced ocular dominance shift.
Jacob A Westerberg, Michelle S Schall ... Jeffrey D Schall
Simultaneous recording of electrical voltages outside the brain and neural signals within the cerebral cortex reveals how electrical currents in mosaics of cortical columns produce a signal that can be measured noninvasively to assess the allocation of attention.
During speech perception, if auditory speech is not informative, the frontal cortex enhances responses in visual regions that represent the talker's mouth to improve perception.
Lucas Y Tian, Timothy L Warren ... Michael S Brainard
Multi-area recordings reveal how communication between two songbird brain areas conveys a top-down bias that adaptively modifies ongoing singing to support context-specific vocal learning.
Ryan J Morrill, James Bigelow ... Andrea R Hasenstaub
The brain processes the same multisensory stimulus differently when the way it sounds, as opposed to the way it looks, is useful for making a decision.
W Marcus Lambert, Martin T Wells ... Linnie M Golightly
Mentorship, financial security and a positive sense of self-worth increase the likelihood that underrepresented minority and female postdocs will pursue a career in academia.
Allison T Goldstein, Terrence R Stanford, Emilio Salinas
The detection of a salient stimulus triggers a stereotypical oculomotor response, an impulse to look toward it, whose timing and strength are largely independent of behavioral significance and of top-down control.
Hannah M Oberle, Alexander N Ford ... Pierre F Apostolides
Optogenetic and electrophysiological experiments show how a behaviorally relevant 'feedback' pathway from auditory cortex controls neural activity in the inferior colliculus, an auditory midbrain region important for processing complex time-varying sounds such as speech.
A novel neural marker for the integration of top-down predictions and bottom-up signals in perception elucidates uncertainty in perceptual inference and provides evidence for the predictive coding account of perception.
Both bottom-up and top-down processing are involved in the occipital-temporal face network, with top-down modulation engaged more extensively when the information available in face images is sparse.