Human Neocortical Neurosolver (HNN), a new software tool for interpreting the cellular and network origin of human MEG/EEG data

  1. Samuel A Neymotin (corresponding author)
  2. Dylan S Daniels
  3. Blake Caldwell
  4. Robert A McDougal
  5. Nicholas T Carnevale
  6. Mainak Jas
  7. Christopher I Moore
  8. Michael L Hines
  9. Matti Hämäläinen
  10. Stephanie R Jones (corresponding author)
  1. Brown University, United States
  2. Yale University, United States
  3. Massachusetts General Hospital, United States

Abstract

Magneto- and electro-encephalography (MEG/EEG) non-invasively record human brain activity with millisecond resolution, providing reliable markers of healthy and disease states. However, the difficulty of relating these macroscopic signals to their underlying cellular- and circuit-level generators constrains the use of MEG/EEG to reveal novel principles of information processing or to translate findings into new therapies for neuropathology. To address this problem, we built the Human Neocortical Neurosolver (HNN, https://hnn.brown.edu) software. HNN has a graphical user interface designed to help researchers and clinicians interpret the neural origins of MEG/EEG. HNN's core is a neocortical circuit model that accounts for the biophysical origins of the electrical currents generating MEG/EEG. Recorded data can be compared directly to simulated signals, and parameters can be easily manipulated to develop and test hypotheses about a signal's origin. Tutorials teach users to simulate commonly measured signals, including event-related potentials and brain rhythms. HNN's ability to associate signals across scales makes it a unique tool for translational neuroscience research.

Data availability

All source code, model parameters, and associated data are provided in a permanent, publicly accessible repository on GitHub (https://github.com/jonescompneurolab/hnn).

Article and author information

Author details

  1. Samuel A Neymotin

    Department of Neuroscience, Brown University, Providence, United States
    For correspondence
    samuel.neymotin@nki.rfmh.org
    Competing interests
    The authors declare that no competing interests exist.
    ORCID: 0000-0003-3646-5195
  2. Dylan S Daniels

    Department of Neuroscience, Brown University, Providence, United States
    Competing interests
    The authors declare that no competing interests exist.
  3. Blake Caldwell

    Department of Neuroscience, Brown University, Providence, United States
    Competing interests
    The authors declare that no competing interests exist.
    ORCID: 0000-0002-6882-6998
  4. Robert A McDougal

    Department of Neuroscience, Yale University, New Haven, United States
    Competing interests
    The authors declare that no competing interests exist.
    ORCID: 0000-0001-6394-3127
  5. Nicholas T Carnevale

    Department of Neuroscience, Yale University, New Haven, United States
    Competing interests
    The authors declare that no competing interests exist.
  6. Mainak Jas

    Athinoula A Martinos Center for Biomedical Imaging, Massachusetts General Hospital, Charlestown, United States
    Competing interests
    The authors declare that no competing interests exist.
  7. Christopher I Moore

    Department of Neuroscience, Brown University, Providence, United States
    Competing interests
    The authors declare that no competing interests exist.
    ORCID: 0000-0003-4534-1602
  8. Michael L Hines

    Department of Neuroscience, Yale University, New Haven, United States
    Competing interests
    The authors declare that no competing interests exist.
  9. Matti Hämäläinen

    Athinoula A Martinos Center for Biomedical Imaging, Massachusetts General Hospital, Charlestown, United States
    Competing interests
    The authors declare that no competing interests exist.
  10. Stephanie R Jones

    Department of Neuroscience, Brown University, Providence, United States
    For correspondence
    Stephanie_Jones@brown.edu
    Competing interests
    The authors declare that no competing interests exist.
    ORCID: 0000-0001-6760-5301

Funding

National Institute of Biomedical Imaging and Bioengineering (BRAIN Award 5-R01-EB022889-02)

  • Samuel A Neymotin
  • Dylan S Daniels
  • Blake Caldwell
  • Robert A McDougal
  • Nicholas T Carnevale
  • Mainak Jas
  • Christopher I Moore
  • Michael L Hines
  • Matti Hämäläinen
  • Stephanie R Jones

National Institute of Biomedical Imaging and Bioengineering (BRAIN Award Supplement R01EB022889-02S1)

  • Samuel A Neymotin
  • Dylan S Daniels
  • Blake Caldwell
  • Robert A McDougal
  • Nicholas T Carnevale
  • Mainak Jas
  • Christopher I Moore
  • Michael L Hines
  • Matti Hämäläinen
  • Stephanie R Jones

National Institute on Deafness and Other Communication Disorders (5-R01DC012947-07)

  • Samuel A Neymotin

Army Research Office (W911NF-19-1-0402)

  • Samuel A Neymotin

The funders had no role in study design, data collection and interpretation, or the decision to submit the work for publication. The views and conclusions contained in this document are those of the authors and should not be interpreted as representing the official policies, either expressed or implied, of the Army Research Office or the U.S. Government. The U.S. Government is authorized to reproduce and distribute reprints for Government purposes notwithstanding any copyright notation herein.

Copyright

© 2020, Neymotin et al.

This article is distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use and redistribution provided that the original author and source are credited.


Cite this article

Neymotin SA, Daniels DS, Caldwell B, McDougal RA, Carnevale NT, Jas M, Moore CI, Hines ML, Hämäläinen M, Jones SR (2020) Human Neocortical Neurosolver (HNN), a new software tool for interpreting the cellular and network origin of human MEG/EEG data. eLife 9:e51214.
https://doi.org/10.7554/eLife.51214
