Chimpanzee brain morphometry utilizing standardized MRI preprocessing and macroanatomical annotations

  1. Sam Vickery (corresponding author)
  2. William D Hopkins
  3. Chet C Sherwood
  4. Steven J Schapiro
  5. Robert D Latzman
  6. Svenja Caspers
  7. Christian Gaser
  8. Simon B Eickhoff
  9. Robert Dahnke (corresponding author)
  10. Felix Hoffstaedter (corresponding author)
  1. Research Centre Jülich, Germany
  2. MD Anderson Center, United States
  3. The George Washington University, United States
  4. Georgia State University, United States
  5. University of Jena, Germany
  6. Jena University Hospital, Germany

Abstract

Chimpanzees are among the closest living relatives to humans and, as such, provide a crucial comparative model for investigating primate brain evolution. In recent years, human brain mapping has strongly benefited from enhanced computational models and image processing pipelines that could also improve data analyses in animals by using species-specific templates. In this study, we use structural MRI data from the National Chimpanzee Brain Resource (NCBR) to develop the chimpanzee brain reference template Juna.Chimp for spatial registration and the macroanatomical brain parcellation Davi130 for standardized whole-brain analysis. Additionally, we introduce a ready-to-use image processing pipeline built upon the CAT12 toolbox in SPM12, implementing a standard human image preprocessing framework in chimpanzees. Applying this approach to data from 194 subjects, we find strong evidence for human-like age-related gray matter atrophy in multiple regions of the chimpanzee brain, as well as a general rightward asymmetry in brain regions.

Data availability

The T1-weighted MRIs are available from the National Chimpanzee Brain Resource website, as is the direct-to-download dataset we used for our example workflow. The code used in the manuscript can be found in the GitHub repository https://github.com/viko18/JunaChimp
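For readers who want to fetch the analysis code, the following is a minimal sketch (not part of the original article) that clones the Juna.Chimp repository linked above and lists its top-level contents; it assumes Python 3 with git available on the PATH, and the destination directory name is an arbitrary choice.

    # Minimal sketch (assumption: git is installed and on the PATH).
    # Clones the Juna.Chimp code repository from the Data availability statement
    # and prints its top-level contents for manual inspection.
    import subprocess
    from pathlib import Path

    REPO_URL = "https://github.com/viko18/JunaChimp"  # URL given in the Data availability statement
    target = Path("JunaChimp")  # arbitrary local directory name, not prescribed by the authors

    if not target.exists():
        subprocess.run(["git", "clone", REPO_URL, str(target)], check=True)

    for entry in sorted(target.iterdir()):
        print(entry.name)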

Article and author information

Author details

  1. Sam Vickery

    Institute of Neuroscience and Medicine (INM-7: Brain and Behaviour), Research Centre Jülich, Jülich, Germany
    For correspondence
    s.vickery@fz-juelich.de
    Competing interests
    The authors declare that no competing interests exist.
    ORCID iD: 0000-0001-6732-7014
  2. William D Hopkins

    MD Anderson Center, Bastrop, United States
    Competing interests
    The authors declare that no competing interests exist.
  3. Chet C Sherwood

    Department of Anthropology, The George Washington University, Washington, DC, United States
    Competing interests
    The authors declare that no competing interests exist.
    ORCID iD: 0000-0001-6711-449X
  4. Steven J Schapiro

    MD Anderson Center, Bastrop, United States
    Competing interests
    The authors declare that no competing interests exist.
  5. Robert D Latzman

    Psychology, Georgia State University, Atlanta, United States
    Competing interests
    The authors declare that no competing interests exist.
    ORCID iD: 0000-0002-1175-8090
  6. Svenja Caspers

    Institute of Neuroscience and Medicine (INM-1), Research Centre Jülich, Jülich, Germany
    Competing interests
    The authors declare that no competing interests exist.
  7. Christian Gaser

    University of Jena, Jena, Germany
    Competing interests
    The authors declare that no competing interests exist.
  8. Simon B Eickhoff

    Institute of Neuroscience and Medicine (INM-7: Brain and Behaviour), Research Centre Jülich, Jülich, Germany
    Competing interests
    The authors declare that no competing interests exist.
    ORCID iD: 0000-0001-6363-2759
  9. Robert Dahnke

    Department of Neurology; Department of Psychiatry and Psychotherapy, Jena University Hospital, Jena, Germany
    For correspondence
    robert.dahnke@uni-jena.de
    Competing interests
    The authors declare that no competing interests exist.
  10. Felix Hoffstaedter

    Institute of Neuroscience and Medicine (INM-7: Brain and Behaviour), Research Centre Jülich, Jülich, Germany
    For correspondence
    f.hoffstaedter@fz-juelich.de
    Competing interests
    The authors declare that no competing interests exist.
    ORCID iD: 0000-0001-7163-3110

Funding

Helmholtz Association (Helmholtz Portfolio Theme 'Supercomputing and Modelling for the Human Brain')

  • Sam Vickery
  • Simon B Eickhoff
  • Felix Hoffstaedter

Deutsche Forschungsgemeinschaft (417649423)

  • Robert Dahnke

European Commission Horizon 2020 (945539 (HBP SGA 3))

  • Sam Vickery
  • Simon B Eickhoff
  • Felix Hoffstaedter

Helmholtz Association (Initiative and Networking Fund)

  • Svenja Caspers

European Commission Horizon 2020 (785907 (HBP SGA 2))

  • Svenja Caspers

National Institutes of Health (NS-42867, NS-73134, NS-92988)

  • William D Hopkins

National Institutes of Health (NS092988)

  • Chet C Sherwood

James S. McDonnell Foundation (220020293)

  • Chet C Sherwood

Inspire Foundation (SMA-1542848)

  • Chet C Sherwood

National Institutes of Health (U42-OD011197)

  • Steven J Schapiro

The funders had no role in study design, data collection and interpretation, or the decision to submit the work for publication.

Reviewing Editor

  1. Jonathan Erik Peelle, Washington University in St. Louis, United States

Ethics

Animal experimentation: The chimpanzee imaging data were acquired under protocols approved by the Institutional Animal Care and Use Committee of the Yerkes National Primate Research Center (YNPRC) at Emory University (approval number YER2001206).

Version history

  1. Received: June 17, 2020
  2. Accepted: November 20, 2020
  3. Accepted Manuscript published: November 23, 2020 (version 1)
  4. Version of Record published: December 8, 2020 (version 2)

Copyright

© 2020, Vickery et al.

This article is distributed under the terms of the Creative Commons Attribution License permitting unrestricted use and redistribution provided that the original author and source are credited.

Metrics

  • 2,949 views
  • 212 downloads
  • 21 citations

Views, downloads and citations are aggregated across all versions of this paper published by eLife.

Cite this article

Sam Vickery, William D Hopkins, Chet C Sherwood, Steven J Schapiro, Robert D Latzman, Svenja Caspers, Christian Gaser, Simon B Eickhoff, Robert Dahnke, Felix Hoffstaedter (2020) Chimpanzee brain morphometry utilizing standardized MRI preprocessing and macroanatomical annotations. eLife 9:e60136. https://doi.org/10.7554/eLife.60136

Further reading

    Neuroscience · Research Article
    Jack W Lindsey, Elias B Issa

    Object classification has been proposed as a principal objective of the primate ventral visual stream and has been used as an optimization target for deep neural network models (DNNs) of the visual system. However, visual brain areas represent many different types of information, and optimizing for classification of object identity alone does not constrain how other information may be encoded in visual representations. Information about different scene parameters may be discarded altogether (‘invariance’), represented in non-interfering subspaces of population activity (‘factorization’) or encoded in an entangled fashion. In this work, we provide evidence that factorization is a normative principle of biological visual representations. In the monkey ventral visual hierarchy, we found that factorization of object pose and background information from object identity increased in higher-level regions and strongly contributed to improving object identity decoding performance. We then conducted a large-scale analysis of factorization of individual scene parameters – lighting, background, camera viewpoint, and object pose – in a diverse library of DNN models of the visual system. Models which best matched neural, fMRI, and behavioral data from both monkeys and humans across 12 datasets tended to be those which factorized scene parameters most strongly. Notably, invariance to these parameters was not as consistently associated with matches to neural and behavioral data, suggesting that maintaining non-class information in factorized activity subspaces is often preferred to dropping it altogether. Thus, we propose that factorization of visual scene information is a widely used strategy in brains and DNN models thereof.

    Neuroscience · Research Article
    Zhaoran Zhang, Huijun Wang ... Kunlin Wei

    The sensorimotor system can recalibrate itself without our conscious awareness, a type of procedural learning whose computational mechanism remains undefined. Recent findings on implicit motor adaptation, such as over-learning from small perturbations and fast saturation for increasing perturbation size, challenge existing theories based on sensory errors. We argue that perceptual error, arising from the optimal combination of movement-related cues, is the primary driver of implicit adaptation. Central to our theory is the increasing sensory uncertainty of visual cues with increasing perturbations, which was validated through perceptual psychophysics (Experiment 1). Our theory predicts the learning dynamics of implicit adaptation across a spectrum of perturbation sizes on a trial-by-trial basis (Experiment 2). It explains proprioception changes and their relation to visual perturbation (Experiment 3). By modulating visual uncertainty in perturbation, we induced unique adaptation responses in line with our model predictions (Experiment 4). Overall, our perceptual error framework outperforms existing models based on sensory errors, suggesting that perceptual error in locating one’s effector, supported by Bayesian cue integration, underpins the sensorimotor system’s implicit adaptation.