The BigBrainWarp toolbox for integration of BigBrain 3D histology with multimodal neuroimaging
Abstract
Neuroimaging stands to benefit from emerging ultrahigh-resolution 3D histological atlases of the human brain, the first of which is 'BigBrain'. Here, we review recent methodological advances for the integration of BigBrain with multimodal neuroimaging and introduce a toolbox, 'BigBrainWarp', that combines these developments. The aim of BigBrainWarp is to simplify workflows and support the adoption of best practices. This is accomplished with a simple wrapper function that allows users to easily map data between BigBrain and standard MRI spaces. The function automatically pulls specialised transformation procedures, based on ongoing research from a wide collaborative network of researchers. Additionally, the toolbox improves the accessibility of histological information through the dissemination of ready-to-use cytoarchitectural features. Finally, we demonstrate the utility of BigBrainWarp with three tutorials and discuss the potential of the toolbox to support multi-scale investigations of brain organisation.
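To illustrate the kind of workflow the wrapper function described above supports, the minimal sketch below shells out to the bigbrainwarp command from Python to move a volumetric map between a standard MRI space and BigBrain space. The executable name comes from the toolbox, but the specific flags (--in_space, --out_space, --in_vol, --interp, --wd, --out_name), their accepted values, and the example filenames are assumptions made for illustration; the authoritative usage is documented in the BigBrainWarp repository.

```python
# Minimal sketch: transforming a volumetric map from a standard MRI space to
# BigBrain space by calling the BigBrainWarp wrapper from Python.
# NOTE: the flag names and file names below are illustrative assumptions and
# may differ from the interface documented in the BigBrainWarp repository.
import subprocess
from pathlib import Path

work_dir = Path("/tmp/bbw_example")  # hypothetical working directory
work_dir.mkdir(parents=True, exist_ok=True)

cmd = [
    "bigbrainwarp",                            # wrapper script shipped with the toolbox
    "--in_space", "icbm",                      # assumed label for the input (MNI/ICBM) space
    "--out_space", "bigbrain",                 # assumed label for BigBrain space
    "--in_vol", "my_statistical_map.nii.gz",   # hypothetical input volume
    "--interp", "linear",                      # assumed interpolation option
    "--wd", str(work_dir),                     # assumed working-directory flag
    "--out_name", "my_statistical_map",        # assumed output prefix
]

# Run the transformation; outputs would be written to the working directory.
result = subprocess.run(cmd, capture_output=True, text=True)
print(result.stdout)
```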
Data availability
All data generated or analysed during this study are included in the BigBrainWarp repository (https://github.com/caseypaquola/BigBrainWarp).
Article and author information
Author details
Funding
Helmholtz Association
- Casey Paquola
- Lindsay B Lewis
- Claude Lepage
- Jordan DeKraker
- Paule-Joanne Toussaint
- Sofie Louise Valk
- D Louis Collins
- Katrin Amunts
- Alan C Evans
- Timo Dickscheid
- Boris C Bernhardt
Fonds de Recherche du Québec - Santé
- Casey Paquola
- Boris C Bernhardt
Natural Sciences and Engineering Research Council of Canada
- Ali Khan
- Boris C Bernhardt
Canadian Institutes of Health Research
- Jessica Royer
- Ali Khan
- Boris C Bernhardt
SickKids Foundation
- Boris C Bernhardt
Azrieli Center for Autism Research
- Boris C Bernhardt
The funders had no role in study design, data collection and interpretation, or the decision to submit the work for publication.
Copyright
© 2021, Paquola et al.
This article is distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use and redistribution provided that the original author and source are credited.