For perceptual inference, human observers do not estimate sensory uncertainty instantaneously from the current sensory signals alone but instead combine past and current sensory inputs, consistent with a Bayesian learner.
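A minimal sketch of the idea behind this finding, assuming a Gaussian generative model: a Bayesian learner carries an uncertainty estimate forward from past inputs and reconciles it with each new noisy observation, so uncertainty is never read off the current signal alone. All values and the update rule shown are illustrative assumptions, not the paper's model.

```python
def bayes_update(prior_mean, prior_var, obs, obs_var):
    """Combine a Gaussian prior (from past inputs) with a Gaussian observation."""
    k = prior_var / (prior_var + obs_var)   # weight given to the new evidence
    post_mean = prior_mean + k * (obs - prior_mean)
    post_var = (1 - k) * prior_var          # uncertainty shrinks with each input
    return post_mean, post_var

mean, var = 0.0, 1.0                        # initial belief before any input
for obs in [0.8, 1.1, 0.9, 1.0]:            # stream of noisy sensory inputs
    mean, var = bayes_update(mean, var, obs, obs_var=0.5)

print(round(mean, 3), round(var, 3))
```

Note that the posterior variance depends on how much evidence has accumulated, not just on the most recent sample, which is what distinguishes this learner from an instantaneous estimator.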
Temporal uncertainty interferes with the timely onset of evidence accumulation in perceptual decision making, prompting the brain to rely instead on statistical regularities in the temporal structure of the environment.
Neural correlates of somatosensory target detection are restricted to secondary somatosensory cortex, whereas activity in insular, cingulate, and motor regions reflects stimulus uncertainty and overt reports.
A novel neural marker for the integration of top-down predictions and bottom-up signals in perception elucidates how uncertainty shapes perceptual inference and provides evidence for the predictive coding account of perception.
Hierarchical modeling of internalizing symptoms and task performance reveals that difficulty adapting probabilistic learning to second-order uncertainty is common to anxiety and depression and holds across rewarding and punishing outcomes.
When Rhesus monkeys plan reaching movements about which they are not fully confident, a particular area of the brain represents both the chosen action and alternative movements, perhaps as an aid to error correction or learning.