Constructing and optimizing 3D atlases from 2D data with application to the developing mouse brain
Abstract
3D imaging data necessitate 3D reference atlases for accurate quantitative interpretation. Existing computational methods to generate 3D atlases from 2D-derived atlases result in extensive artifacts, while manual curation approaches are labor-intensive. We present a computational approach for 3D atlas construction that substantially reduces artifacts by identifying anatomical boundaries in the underlying imaging data and using these to guide 3D transformation. Anatomical boundaries also allow extension of atlases to complete edge regions. Applying these methods to the eight developmental stages in the Allen Developing Mouse Brain Atlas (ADMBA) led to more comprehensive and accurate atlases. We generated imaging data from fifteen whole mouse brains to validate atlas performance and observed qualitative and quantitative improvement (37% greater alignment between atlas and anatomical boundaries). We provide the pipeline as the MagellanMapper software and the eight 3D reconstructed ADMBA atlases. These resources facilitate whole-organ quantitative analysis between samples and across development.
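The summary above is high level; as a rough, non-authoritative illustration of the edge-alignment idea (measuring how closely atlas label boundaries follow anatomical boundaries in the imaging data), the Python sketch below uses generic NumPy/SciPy operations. The function names, smoothing sigma, and percentile threshold are illustrative assumptions rather than the settings used in the paper, and the resulting score is an edge-distance proxy, not the exact metric behind the 37% figure.

```python
# Minimal sketch (not the published MagellanMapper pipeline): derive an
# anatomical edge map from the imaging volume, derive boundary voxels from
# the atlas labels, and score how far label boundaries sit from the nearest
# anatomical edge.
import numpy as np
from scipy import ndimage


def intensity_edges(intensity, sigma=1.0, percentile=90):
    """Binary anatomical edge map from a smoothed gradient magnitude."""
    smoothed = ndimage.gaussian_filter(intensity.astype(float), sigma)
    grad = ndimage.generic_gradient_magnitude(smoothed, ndimage.sobel)
    return grad >= np.percentile(grad, percentile)


def label_edges(labels):
    """Boundary voxels where the atlas label ID changes between neighbors."""
    eroded = ndimage.grey_erosion(labels, size=3)
    dilated = ndimage.grey_dilation(labels, size=3)
    return (eroded != labels) | (dilated != labels)


def mean_edge_distance(atlas_edges, anat_edges, spacing=(1.0, 1.0, 1.0)):
    """Mean physical distance from atlas label boundaries to the nearest
    anatomical edge; lower values indicate tighter alignment."""
    dist_to_anat = ndimage.distance_transform_edt(~anat_edges, sampling=spacing)
    return float(dist_to_anat[atlas_edges].mean())


# Hypothetical usage with co-registered volumes `brain` and `atlas_labels`:
# score = mean_edge_distance(label_edges(atlas_labels), intensity_edges(brain),
#                            spacing=(0.02, 0.02, 0.02))  # e.g., 20 µm voxels
```

Comparing such a score for an original versus a refined atlas against the same imaging volumes would give a relative alignment improvement in the spirit of the comparison reported above.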
Data availability
The full set of 3D-reconstructed atlases and wild-type brain images are being deposited with the Human Brain Project EBRAINS data platform; a minimal loading sketch follows the dataset list below. All data analyses are included in the manuscript and supporting files.
- E11.5 3D Edge-Aware Refined Atlas Derived from the Allen Developing Mouse Brain Atlas. EBRAINS, 10.25493/H9A3-GFT.
- E13.5 3D Edge-Aware Refined Atlas Derived from the Allen Developing Mouse Brain Atlas. EBRAINS, 10.25493/YP9K-YMW.
- E15.5 3D Edge-Aware Refined Atlas Derived from the Allen Developing Mouse Brain Atlas. EBRAINS, 10.25493/EXET-XND.
- E18.5 3D Edge-Aware Refined Atlas Derived from the Allen Developing Mouse Brain Atlas. EBRAINS, 10.25493/X4ZT-ARE.
- P4 3D Edge-Aware Refined Atlas Derived from the Allen Developing Mouse Brain Atlas. EBRAINS, 10.25493/QYP4-5VQ.
- P14 3D Edge-Aware Refined Atlas Derived from the Allen Developing Mouse Brain Atlas. EBRAINS, 10.25493/QYP4-5VQ.
- P28 3D Edge-Aware Refined Atlas Derived from the Allen Developing Mouse Brain Atlas. EBRAINS, 10.25493/YW1E-6BW.
- P56 3D Edge-Aware Refined Atlas Derived from the Allen Developing Mouse Brain Atlas. EBRAINS, 10.25493/MYPD-QB8.
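For orientation only, the sketch below shows one way these deposited atlases might be used for per-region volume measurement. The file name, NIfTI format, and the convention that label 0 is background are assumptions; consult each EBRAINS dataset's documentation for the actual file layout.

```python
# Sketch: per-region volumes from a 3D atlas annotation volume.
# The file name and format below are hypothetical placeholders.
import numpy as np
import SimpleITK as sitk

annotation = sitk.ReadImage("e18pt5_annotation.nii.gz")  # hypothetical path
labels = sitk.GetArrayFromImage(annotation)
voxel_volume = float(np.prod(annotation.GetSpacing()))  # in spacing units^3

# Count voxels per label ID, skipping 0 (assumed background).
region_ids, counts = np.unique(labels[labels != 0], return_counts=True)
for region_id, count in zip(region_ids, counts):
    print(f"region {int(region_id)}: {count * voxel_volume:.3f}")
```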
Article and author information
Author details
Funding
Brain and Behavior Research Foundation (NARSAD Young Investigator Grant)
- Stephan J Sanders
National Institute of Mental Health (U01 MH122681)
- Stephan J Sanders
National Institute of Mental Health (R01 MH109901)
- Stephan J Sanders
National Institute of Neurological Disorders and Stroke (R01 NS099099)
- John LR Rubenstein
The authors declare that the funders had no role in study design, data collection and interpretation, or the decision to submit the work for publication.
Ethics
Animal experimentation: All procedures and animal care were approved and performed in accordance with institutional guidelines from the University of California San Francisco Laboratory Animal Research Center (LARC). All animal handling complied with the approved Institutional Animal Care and Use Committee (IACUC) protocol (AN180174-02) at the University of California San Francisco.
Copyright
© 2021, Young et al.
This article is distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use and redistribution provided that the original author and source are credited.
Metrics
- 4,471 views
- 331 downloads
- 20 citations
Views, downloads and citations are aggregated across all versions of this paper published by eLife.