A generalizable brain extraction net (BEN) for multimodal MRI data from rodents, nonhuman primates, and humans

  1. Ziqi Yu
  2. Xiaoyang Han
  3. Wenjing Xu
  4. Jie Zhang
  5. Carsten Marr
  6. Dinggang Shen
  7. Tingying Peng  Is a corresponding author
  8. Xiao-Yong Zhang  Is a corresponding author
  9. Jianfeng Feng
  1. Fudan University, China
  2. Helmholtz Zentrum München, Germany
  3. ShanghaiTech University, China

Abstract

Accurate brain tissue extraction on magnetic resonance imaging (MRI) data is crucial for analyzing brain structure and function. While several conventional tools have been optimized to handle human brain data, no generalizable method has been available to extract brain tissue from multimodal MRI data of rodents, nonhuman primates, and humans. Therefore, developing a flexible and generalizable method for extracting whole brain tissue across species would allow researchers to analyze and compare experimental results more efficiently. Here, we propose a domain-adaptive and semi-supervised deep neural network, named the Brain Extraction Net (BEN), to extract brain tissue across species, MRI modalities, and MR scanners. We have evaluated BEN on 18 independent datasets, including 783 rodent MRI scans, 246 nonhuman primate MRI scans, and 4,601 human MRI scans, covering five species, four modalities, and six MR scanners with various magnetic field strengths. Compared to conventional toolboxes, the superiority of BEN is illustrated by its robustness, accuracy, and generalizability. Our proposed method not only provides a generalized solution for extracting brain tissue across species but also significantly improves the accuracy of atlas registration, thereby benefiting downstream processing tasks. As a novel fully automated deep-learning method, BEN is designed as open-source software to enable high-throughput processing of neuroimaging data across species in preclinical and clinical applications.
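Conceptually, brain extraction (also called skull stripping) predicts a binary brain mask that is then applied to the MRI volume so that only brain voxels remain. The following is a minimal NumPy sketch of that masking step only; it is illustrative and does not reflect BEN's actual API (the function and variable names here are hypothetical):

```python
import numpy as np

def apply_brain_mask(volume: np.ndarray, mask: np.ndarray) -> np.ndarray:
    """Zero out non-brain voxels. `mask` is a binary array with the same shape as `volume`."""
    if volume.shape != mask.shape:
        raise ValueError("volume and mask must have the same shape")
    return volume * (mask > 0)

# Toy 4x4x4 "volume" whose central 2x2x2 region plays the role of the brain
volume = np.ones((4, 4, 4))
mask = np.zeros((4, 4, 4), dtype=np.uint8)
mask[1:3, 1:3, 1:3] = 1  # hypothetical mask, e.g. as predicted by a segmentation network

extracted = apply_brain_mask(volume, mask)
print(int(extracted.sum()))  # 8 brain voxels retained, all others zeroed
```

In practice the mask predicted by a network like BEN is applied to real NIfTI volumes, and the quality of this mask is what the paper's Dice-style evaluations measure.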

Data availability

All data (MRI data, source code, pretrained weights, and demo notebooks to replicate Figures 1-7) are included in the manuscript or available at https://github.com/yu02019/BEN.


Article and author information

Author details

  1. Ziqi Yu

    Institute of Science and Technology for Brain-Inspired Intelligence, Fudan University, Shanghai, China
    Competing interests
    No competing interests declared.
    ORCID iD: 0000-0001-8201-5481
  2. Xiaoyang Han

    Institute of Science and Technology for Brain-Inspired Intelligence, Fudan University, Shanghai, China
    Competing interests
    No competing interests declared.
    ORCID iD: 0000-0002-3007-6079
  3. Wenjing Xu

    Institute of Science and Technology for Brain-Inspired Intelligence, Fudan University, Shanghai, China
    Competing interests
    No competing interests declared.
  4. Jie Zhang

    Institute of Science and Technology for Brain-Inspired Intelligence, Fudan University, Shanghai, China
    Competing interests
    No competing interests declared.
  5. Carsten Marr

    Institute of AI for Health, Helmholtz Zentrum München, Neuherberg, Germany
    Competing interests
    No competing interests declared.
    ORCID iD: 0000-0003-2154-4552
  6. Dinggang Shen

    School of Biomedical Engineering, ShanghaiTech University, Shanghai, China
    Competing interests
    Dinggang Shen is affiliated with Shanghai United Imaging Intelligence Co., Ltd. and has financial interests to declare.
  7. Tingying Peng

    Helmholtz AI, Helmholtz Zentrum München, Neuherberg, Germany
    For correspondence
    tingying.peng@helmholtz-muenchen.de
    Competing interests
    No competing interests declared.
  8. Xiao-Yong Zhang

    Institute of Science and Technology for Brain-Inspired Intelligence, Fudan University, Shanghai, China
    For correspondence
    xiaoyong_zhang@fudan.edu.cn
    Competing interests
    No competing interests declared.
    ORCID iD: 0000-0001-8965-1077
  9. Jianfeng Feng

    Institute of Science and Technology for Brain-Inspired Intelligence, Fudan University, Shanghai, China
    Competing interests
    No competing interests declared.
    ORCID iD: 0000-0001-5987-2258

Funding

National Natural Science Foundation of China (81873893, 82171903, 92043301)

  • Xiao-Yong Zhang

Fudan University (the Office of Global Partnerships (Key Projects Development Fund))

  • Xiao-Yong Zhang

Shanghai Municipal Science and Technology Major Project (No. 2018SHZDZX01)

  • Xiao-Yong Zhang

The funders had no role in study design, data collection and interpretation, or the decision to submit the work for publication.

Ethics

Animal experimentation: Part of the rodent MRI data collection was approved by the Animal Care and Use Committee of Fudan University, China. The remaining rodent data (Rat-T2WI-9.4T and Rat-EPI-9.4T datasets) are publicly available (CARMI: https://openneuro.org/datasets/ds002870/versions/1.0.0). Marmoset MRI data collection was approved by the Animal Care and Use Committee of the Institute of Neuroscience, Chinese Academy of Sciences, China. Macaque MRI data are publicly available from the nonhuman PRIMatE Data Exchange (PRIME-DE) (https://fcon_1000.projects.nitrc.org/indi/indiPRIME.html).

Human subjects: The Zhangjiang International Brain Biobank (ZIB) protocols were approved by the Ethics Committee of Fudan University (AF/SC-03/20200722), and written informed consent was obtained from all volunteers. The UK Biobank (UKB) and Adolescent Brain Cognitive Development (ABCD) datasets are publicly available.

Reviewing Editor

  1. Saad Jbabdi, University of Oxford, United Kingdom

Publication history

  1. Received: June 20, 2022
  2. Accepted: December 21, 2022
  3. Accepted Manuscript published: December 22, 2022 (version 1)

Copyright

© 2022, Yu et al.

This article is distributed under the terms of the Creative Commons Attribution License permitting unrestricted use and redistribution provided that the original author and source are credited.


Download links

A two-part list of links to download the article, or parts of the article, in various formats.

Downloads (link to download the article as PDF)

Open citations (links to open the citations from this article in various online reference manager services)

Cite this article (links to download the citations from this article in formats compatible with various reference manager tools)

  1. Ziqi Yu
  2. Xiaoyang Han
  3. Wenjing Xu
  4. Jie Zhang
  5. Carsten Marr
  6. Dinggang Shen
  7. Tingying Peng
  8. Xiao-Yong Zhang
  9. Jianfeng Feng
(2022)
A generalizable brain extraction net (BEN) for multimodal MRI data from rodents, nonhuman primates, and humans
eLife 11:e81217.
https://doi.org/10.7554/eLife.81217
