Visualizing anatomically registered data with Brainrender

  1. Federico Claudi (corresponding author)
  2. Adam L Tyson
  3. Luigi Petrucco
  4. Troy W Margrie
  5. Ruben Portugues
  6. Tiago Branco (corresponding author)

Affiliations:
  1. UCL, United Kingdom
  2. Technical University of Munich, Germany

Abstract

Three-dimensional (3D) digital brain atlases and high-throughput, brain-wide imaging techniques generate large multidimensional datasets that can be registered to a common reference frame. Generating insights from such datasets depends critically on visualization and interactive data exploration, but this is a challenging task. Currently available software is dedicated to single atlases, model species, or data types, and generating 3D renderings that merge anatomically registered data from diverse sources requires extensive development and programming skills. Here, we present brainrender: an open-source Python package for interactive visualization of multidimensional datasets registered to brain atlases. Brainrender facilitates the creation of complex renderings with different data types in the same visualization and enables seamless use of different atlas sources. High-quality visualizations can be used interactively and exported as high-resolution figures and animated videos. By facilitating the visualization of anatomically registered data, brainrender should accelerate the analysis, interpretation, and dissemination of brain-wide multidimensional data.
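
For readers who want a concrete sense of the workflow the abstract describes, below is a minimal usage sketch based on brainrender's public Python API: a Scene wraps a reference atlas, anatomical structures are added to it by their atlas acronyms, and the result is rendered interactively. The atlas name and region acronym here are illustrative choices, and exact argument names may vary between package versions.

```python
# Minimal brainrender sketch: render one brain region in the Allen
# Mouse Brain Atlas. The atlas name and region acronym are example
# values; argument names may differ across brainrender versions.
from brainrender import Scene

# A Scene wraps a reference atlas fetched through the BrainGlobe atlas API.
scene = Scene(atlas_name="allen_mouse_25um")

# Add a brain region by its atlas acronym ("TH" = thalamus),
# semi-transparent so structures inside the root mesh stay visible.
scene.add_brain_region("TH", alpha=0.4)

# Open the interactive 3D viewer; the same scene can also be exported
# as a high-resolution screenshot or an animated video.
scene.render()
```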

Data availability

All code has been deposited on GitHub and is freely accessible.

The following previously published data sets were used

Article and author information

Author details

  1. Federico Claudi
     Sainsbury Wellcome Centre, UCL, London, United Kingdom
     For correspondence: federico.claudi.17@ucl.ac.uk
     Competing interests: the authors declare that no competing interests exist.
  2. Adam L Tyson
     Sainsbury Wellcome Centre, UCL, London, United Kingdom
     Competing interests: the authors declare that no competing interests exist.
     ORCID: 0000-0003-3225-1130
  3. Luigi Petrucco
     Institute of Neuroscience, Technical University of Munich, Munich, Germany
     Competing interests: the authors declare that no competing interests exist.
  4. Troy W Margrie
     Sainsbury Wellcome Centre, UCL, London, United Kingdom
     Competing interests: the authors declare that no competing interests exist.
     ORCID: 0000-0002-5526-4578
  5. Ruben Portugues
     Institute of Neuroscience, Technical University of Munich, Munich, Germany
     Competing interests: the authors declare that no competing interests exist.
     ORCID: 0000-0002-1495-9314
  6. Tiago Branco
     Sainsbury Wellcome Centre, UCL, London, United Kingdom
     For correspondence: t.branco@ucl.ac.uk
     Competing interests: the authors declare that no competing interests exist.
     ORCID: 0000-0001-5087-3465

Funding

Gatsby Charitable Foundation (GAT3361)

  • Troy W Margrie
  • Tiago Branco

Wellcome (214333/Z/18/Z)

  • Troy W Margrie

Wellcome (214352/Z/18/Z)

  • Tiago Branco

Wellcome (090843/F/09/Z)

  • Troy W Margrie
  • Tiago Branco

Deutsche Forschungsgemeinschaft (390857198)

  • Ruben Portugues

The funders had no role in study design, data collection and interpretation, or the decision to submit the work for publication.

Copyright

© 2021, Claudi et al.

This article is distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use and redistribution provided that the original author and source are credited.



Cite this article

Federico Claudi, Adam L Tyson, Luigi Petrucco, Troy W Margrie, Ruben Portugues, Tiago Branco (2021) Visualizing anatomically registered data with Brainrender. eLife 10:e65751. https://doi.org/10.7554/eLife.65751

