Receptor-driven, multimodal mapping of cortical areas in the macaque monkey intraparietal sulcus
Abstract
The intraparietal sulcus (IPS) is structurally and functionally heterogeneous. We performed a quantitative cyto- and receptor architectonical analysis to provide a multimodal map of the macaque IPS. We identified 17 cortical areas, including novel areas PEipe, PEipi (external and internal subdivisions of PEip), and MIPd. Multivariate analyses of receptor densities resulted in a grouping of areas based on the degree of (dis)similarity of their receptor architecture: a cluster encompassing areas located in the posterior portion of the IPS and associated mainly with the processing of visual information, a cluster including areas found in the anterior portion of the IPS and involved in sensorimotor processing, and an 'intermediate' cluster of multimodal association areas. Thus, differences in cyto- and receptor architecture segregate the cortical ribbon within the IPS, and receptor fingerprints provide novel insights into the relationship between the structural and functional segregation of this brain region in the macaque monkey.
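The grouping of areas by the (dis)similarity of their receptor architecture described above is, in general terms, a clustering of per-area receptor fingerprints. The sketch below is illustrative only: the area names are drawn from the IPS nomenclature, but the density values, receptor set, z-score normalization, Ward linkage, and three-cluster cut are assumptions for demonstration, not the authors' actual analysis pipeline.

```python
# Hypothetical sketch: grouping cortical areas by the (dis)similarity of their
# receptor fingerprints via hierarchical clustering. Density values below are
# randomly generated placeholders, not measured data.
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

rng = np.random.default_rng(0)

# Rows: cortical areas; columns: mean densities (e.g. fmol/mg protein) of a
# set of receptor types (AMPA, kainate, NMDA, GABA-A, M1, M2, alpha-1, 5-HT1A).
areas = ["LIP", "VIP", "MIP", "AIP", "PEip", "PIP"]
fingerprints = rng.uniform(100, 600, size=(len(areas), 8))  # placeholder densities

# Z-score each receptor across areas so no single receptor dominates the
# distance metric (an assumed, commonly used normalization choice).
z = (fingerprints - fingerprints.mean(axis=0)) / fingerprints.std(axis=0)

# Ward linkage on Euclidean distances between the normalized fingerprints.
tree = linkage(z, method="ward")

# Cut the dendrogram into three clusters, mirroring the visual / sensorimotor /
# intermediate (multimodal association) grouping reported in the abstract.
labels = fcluster(tree, t=3, criterion="maxclust")
for area, lab in zip(areas, labels):
    print(f"{area}: cluster {lab}")
```

With real receptor-density data, the same three-step pattern (normalize fingerprints, compute pairwise dissimilarity, cluster hierarchically) would yield the dendrogram from which cluster membership is read off.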
Data availability
All data generated or analysed during this study are included in the manuscript and supporting files.
Article and author information
Author details
Funding
European Commission (785907)
- Nicola Palomero-Gallagher
- Karl Zilles
Bundesministerium für Bildung und Forschung (01GQ1902)
- Nicola Palomero-Gallagher
European Commission (945539)
- Nicola Palomero-Gallagher
- Karl Zilles
The funders had no role in study design, data collection and interpretation, or the decision to submit the work for publication.
Ethics
Animal experimentation: The present study did not include experimental procedures with live animals. Brains were obtained when animals were sacrificed to reduce the size of the colony, which was maintained in accordance with Directive 2010/63/EU of the European Parliament and of the Council on the protection of animals used for scientific purposes.
Reviewing Editor
- Timothy E Behrens, University of Oxford, United Kingdom
Publication history
- Received: February 12, 2020
- Accepted: July 1, 2020
- Accepted Manuscript published: July 2, 2020 (version 1)
- Version of Record published: July 16, 2020 (version 2)
- Version of Record updated: June 18, 2021 (version 3)
Copyright
© 2020, Niu et al.
This article is distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use and redistribution provided that the original author and source are credited.
Metrics
- 1,035 page views
- 176 downloads
- 8 citations
Article citation count generated by polling the highest count across the following sources: Crossref, PubMed Central, Scopus.