Visualizing anatomically registered data with Brainrender
Abstract
Three-dimensional (3D) digital brain atlases and high-throughput brain-wide imaging techniques generate large multidimensional datasets that can be registered to a common reference frame. Generating insights from such datasets depends critically on visualization and interactive data exploration, but this is a challenging task. Currently available software is dedicated to single atlases, model species, or data types, and generating 3D renderings that merge anatomically registered data from diverse sources requires extensive development and programming skills. Here, we present brainrender: an open-source Python package for interactive visualization of multidimensional datasets registered to brain atlases. Brainrender facilitates the creation of complex renderings with different data types in the same visualization and enables seamless use of different atlas sources. High-quality visualizations can be used interactively and exported as high-resolution figures and animated videos. By facilitating the visualization of anatomically registered data, brainrender should accelerate the analysis, interpretation, and dissemination of brain-wide multidimensional data.
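As a concrete illustration of the workflow the abstract describes, the minimal sketch below builds a scene registered to a brain atlas, adds a region mesh, and renders it. It assumes brainrender's public Scene API from the BrainGlobe project; the atlas name and region acronym are illustrative choices, not values taken from this article.

```python
# Minimal sketch of a brainrender visualization (assumes `pip install brainrender`).
# The atlas name and region acronym below are illustrative, not from the article.
from brainrender import Scene

# Create a scene registered to a BrainGlobe atlas (here, the 25 um Allen mouse
# atlas); changing atlas_name is how different atlas sources are selected.
scene = Scene(atlas_name="allen_mouse_25um", title="thalamus")

# Add a brain region mesh by its atlas acronym, rendered semi-transparent.
scene.add_brain_region("TH", alpha=0.4)

# Open the interactive viewer; alternatively, scene.screenshot(name="thalamus")
# exports a high-resolution figure instead.
scene.render()
```

Because BrainGlobe atlases share a common interface, the same script can target a different species' atlas (for example, a zebrafish or human atlas) by changing only the atlas_name argument.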
Data availability
All code has been deposited on GitHub and is freely accessible.
Article and author information
Author details
Funding
Gatsby Charitable Foundation (GAT3361)
- Troy W Margrie
- Tiago Branco
Wellcome (214333/Z/18/Z)
- Troy W Margrie
Wellcome (214352/Z/18/Z)
- Tiago Branco
Wellcome (090843/F/09/Z)
- Troy W Margrie
- Tiago Branco
Deutsche Forschungsgemeinschaft (390857198)
- Ruben Portugues
The funders had no role in study design, data collection and interpretation, or the decision to submit the work for publication.
Reviewing Editor
- Mackenzie W Mathis, EPFL, Switzerland
Publication history
- Received: December 15, 2020
- Accepted: March 17, 2021
- Accepted Manuscript published: March 19, 2021 (version 1)
- Version of Record published: April 27, 2021 (version 2)
- Version of Record updated: May 13, 2021 (version 3)
Copyright
© 2021, Claudi et al.
This article is distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use and redistribution provided that the original author and source are credited.
Metrics
- 7,760 page views
- 687 downloads
- 28 citations
Article citation count generated by polling the highest count across the following sources: Scopus, PubMed Central, Crossref.