Globular bushy cells (GBCs) of the cochlear nucleus play central roles in the temporal processing of sound. Despite investigation over many decades, fundamental questions remain about their dendrite structure, afferent innervation, and integration of synaptic inputs. Here, we use volume electron microscopy (EM) of the mouse cochlear nucleus to construct synaptic maps that precisely specify convergence ratios and synaptic weights for auditory-nerve innervation and accurate surface areas of all postsynaptic compartments. Detailed biophysically-based compartmental models can help develop hypotheses regarding how GBCs integrate inputs to yield their recorded responses to sound. We established a pipeline to export a precise reconstruction of auditory nerve axons and their endbulb terminals, together with high-resolution dendrite, soma, and axon reconstructions, into biophysically-detailed compartmental models that could be activated by a standard cochlear transduction model. With these constraints, the models predict auditory nerve input profiles whereby all endbulbs onto a GBC are subthreshold (coincidence detection mode), or one or two inputs are suprathreshold (mixed mode). The models also predict the relative importance of dendrite geometry, soma size, and axon initial segment length in setting action potential threshold and generating heterogeneity in sound-evoked responses, and thereby propose mechanisms by which GBCs may homeostatically adjust their excitability. Volume EM also reveals new dendritic structures and dendrites that lack innervation. This framework defines a pathway from subcellular morphology to synaptic connectivity, and facilitates investigation into the roles of specific cellular features in sound encoding. We also clarify the need for new experimental measurements to provide missing cellular parameters, and predict responses to sound for further in vivo studies, thereby serving as a template for investigation of other neuron classes.
The serial blockface electron microscope volume will be uploaded to BossDB (bossdb.org). The modeling code is publicly available on GitHub (https://github.com/pbmanis/vcnmodel; https://github.com/cnmodel). The main simulation result files used to generate the figures in this manuscript have been uploaded to Dryad and can be accessed at https://doi.org/10.5061/dryad.4j0zpc8g1. Simulation figures and figure panels can be generated using the DataTables script in the VCNModel package after downloading the simulation result files. All simulations shown in the paper, and/or their analyses, are included in the Dryad repository; they can be regenerated from the VCNModel package (above, on GitHub) using the supplied scripts. Code and data for Figure 2-Figure Supplement 1 are in the file Figure2_Suppl1.py in the VCNModel GitHub repository. Code and data for Figure 5-Figure Supplement 2 are in pattern_summary.py in the VCNModel GitHub repository. Figures 1E, F, 2C, D, 3C, D, F, G, 7H, K, and 8H were generated using Matlab code. The tables (Excel) and Matlab code are at www.github.com/gaspirou/pub_file_share.
Data from: High-resolution volumetric imaging constrains compartmental models to explore synaptic integration and temporal processing by cochlear nucleus globular bushy cells. Dryad Digital Repository, doi:10.5061/dryad.4j0zpc8g1.
- George A Spirou
- Mark H Ellisman
- Paul B Manis
The content is solely the responsibility of the authors and does not necessarily represent the official views of the National Institutes of Health.
Animal experimentation: All procedures involving animals were approved by the West Virginia University (WVU) Institutional Animal Care and Use Committee, protocol #15-1201 (G.A. Spirou, PI), and were in accordance with policies of the United States Public Health Service. No animal procedures in this study were performed at other institutions. The perfusion of the mouse was performed under avertin anesthesia.
- Catherine Emily Carr, University of Maryland, United States
© 2023, Spirou et al.
This article is distributed under the terms of the Creative Commons Attribution License permitting unrestricted use and redistribution provided that the original author and source are credited.