Decisions about the behavioral significance of sensory stimuli often require comparing sensory inference of what we are looking at to internal models of what we are looking for. Here, we test how neuronal selectivity for visual features is transformed into decision-related signals in posterior parietal cortex (area LIP). Monkeys performed a visual matching task that required them to detect target stimuli composed of conjunctions of color and motion direction. Neuronal recordings from area LIP revealed two main findings. First, the sequential processing of visual features and the selection of target stimuli suggest that LIP is involved in transforming sensory information into decision-related signals. Second, the patterns of color and motion selectivity and their impact on decision-related encoding suggest that LIP plays a role in detecting target stimuli by comparing bottom-up sensory inputs (what the monkeys were looking at) with top-down cognitive inputs (what the monkeys were looking for).
- David J Freedman
The funders had no role in study design, data collection and interpretation, or the decision to submit the work for publication.
Animal experimentation: This study was performed in strict accordance with the recommendations in the Guide for the Care and Use of Laboratory Animals of the National Institutes of Health. All experimental procedures were approved by the University of Chicago Institutional Animal Care and Use Committee (IACUC) under protocol #71887 and were in accordance with National Institutes of Health guidelines.
- Tatiana Pasternak, University of Rochester, United States
© 2017, Ibos & Freedman
This article is distributed under the terms of the Creative Commons Attribution License permitting unrestricted use and redistribution provided that the original author and source are credited.
Body weight is regulated by interoceptive neural circuits that track energy need, but how the activity of these circuits is altered in obesity remains poorly understood. Here we describe the in vivo dynamics of hunger-promoting AgRP neurons during the development of diet-induced obesity in mice. We show that a high-fat diet attenuates the response of AgRP neurons to an array of nutritionally relevant stimuli, including food cues, intragastric nutrients, cholecystokinin, and ghrelin. These alterations are specific to dietary fat but not carbohydrate or protein. Subsequent weight loss restores the responsiveness of AgRP neurons to exteroceptive cues but fails to rescue their sensitivity to gastrointestinal hormones or nutrients. These findings reveal that obesity triggers a broad dysregulation of hypothalamic hunger neurons that is incompletely reversed by weight loss and may contribute to the difficulty of maintaining a reduced weight.
Previously (Hermundstad et al., 2014), we showed that when sampling is limiting, the efficient coding principle leads to a 'variance is salience' hypothesis, and that this hypothesis accounts for visual sensitivity to binary image statistics. Here, using extensive new psychophysical data and image analysis, we show that this hypothesis accounts for visual sensitivity to a large set of grayscale image statistics at a striking level of detail, and we also identify the limits of the prediction. We define a 66-dimensional space of local grayscale light-intensity correlations and measure the relevance of each direction to natural scenes. The 'variance is salience' hypothesis predicts that two-point correlations are most salient, and it predicts their relative salience. We tested these predictions in a texture-segregation task using un-natural, synthetic textures. As predicted, correlations beyond second order are not salient, and predicted thresholds for over 300 second-order correlations match psychophysical thresholds closely (median fractional error < 0.13).