The representations of the number, size, density and surface area of sets of objects in a visual image are separable along the occipito-parietal cortex and independently modulated by attention.
Implementing neural changes associated with attention in a deep neural network causes performance changes that mimic those observed in humans and macaques.
Faced with multiple sources of sound, humans can better perceive all of a target sound's features when one of those features changes in time with a visual stimulus.
A combined behavioural and electroencephalographic approach investigating the covert allocation of attention shows evidence for distributed, periodic attentional sampling away from a consciously perceived visual image.
Spatial attention and saccadic processing are co-ordinated so that attention is available at a task-relevant location soon after the beginning of each eye fixation.
Looking away from a salient visual stimulus pits the reflexive urge to look toward it against the voluntary intention not to, and this conflict is resolved within tens of milliseconds.