Automated task training and longitudinal monitoring of mouse mesoscale cortical circuits using home cages
Abstract
We report an improved automated open-source methodology for head-fixed mesoscale cortical imaging and/or behavioral training of home-cage mice using Raspberry Pi-based hardware. Staged partial and probabilistic restraint allows mice to adjust to self-initiated head-fixation over 3 weeks' time with a ~50% participation rate. We support a cue-based behavioral licking task monitored by a capacitive touch-sensor water spout. While mice are automatically head-fixed, we acquire spontaneous, movement-triggered, or licking task-evoked GCaMP6 cortical signals. An analysis pipeline marked behavioral events and related brain fluorescence signals to spontaneous and/or task-evoked behavioral activity. Mice were trained to suppress licking and wait for cues that marked the delivery of water. Correct rewarded go-trials were associated with widespread activation of midline and lateral barrel cortex areas following a vibration cue, and with delayed frontal and lateral motor cortex activation. Cortical GCaMP signals predicted trial success and correlated strongly with trial-outcome-dependent body movements.
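To make the lick-monitoring hardware concrete, the sketch below shows one way a capacitive touch sensor with a digital output could be polled on a Raspberry Pi to time-stamp spout contacts. This is a minimal illustration only, not the authors' AutoHeadFix implementation: the GPIO pin number, debounce interval, and use of a simple digital touch output (rather than a dedicated capacitive-sensing chip) are assumptions.

```python
# Minimal sketch: time-stamping licks from a capacitive touch sensor whose
# digital output is wired to a Raspberry Pi GPIO pin. Illustrative only; the
# pin, debounce interval, and sensor wiring are assumptions, not the authors'
# AutoHeadFix implementation.
import time
import RPi.GPIO as GPIO

TOUCH_PIN = 17        # hypothetical GPIO pin carrying the touch-sensor output
DEBOUNCE_MS = 50      # ignore re-triggers within 50 ms

GPIO.setmode(GPIO.BCM)
GPIO.setup(TOUCH_PIN, GPIO.IN, pull_up_down=GPIO.PUD_DOWN)

lick_times = []

def on_touch(channel):
    """Record the time of each spout contact (one putative lick)."""
    lick_times.append(time.time())

GPIO.add_event_detect(TOUCH_PIN, GPIO.RISING,
                      callback=on_touch, bouncetime=DEBOUNCE_MS)

try:
    time.sleep(60)    # monitor the spout for one minute
    print(f"Detected {len(lick_times)} licks")
finally:
    GPIO.cleanup()
```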
Data availability
The name of each brain imaging file, which contains both the mouse ID and the time stamp (RFIDtag_xxxx_timestamp.raw; see the parsing sketch after the list below), can be found in the SQL database (see Methods for the URL). The database is hosted as a full text-file archive on Zenodo (https://doi.org/10.5281/zenodo.3268838) for the 5 cages of male mice that compose figures 1-7, and for the cage 6 female mice at https://doi.org/10.5683/SP2/9RFXRP. All text-file behavioral data are included online; image data for figure 8 and supplemental figure 2 are found at https://doi.org/10.5281/zenodo.3243572, all data files and code for figures 9 and 10 are found at https://doi.org/10.5683/SP2/ZTOPUM, and female mouse behavioral data are at https://doi.org/10.5683/SP2/9RFXRP. All Python data acquisition code can be found at https://github.com/jamieboyd/AutoHeadFix/ and https://github.com/ubcbraincircuits/AutoHeadFix.
- Code and .mat files for automated homecage paper (Figs 9 and 10): https://doi.org/10.5683/SP2/ZTOPUM
- Home cage data, female mice (cage 6), all text-file and database information: https://doi.org/10.5683/SP2/9RFXRP
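The file-naming convention noted above (mouse RFID tag plus acquisition time stamp) can be split apart programmatically when matching imaging files to database records. The sketch below is a minimal illustration, not part of the AutoHeadFix codebase, and the assumption that the time stamp is encoded as Unix epoch seconds may not match the actual format (see Methods).

```python
# Minimal sketch: recover the mouse RFID tag and acquisition time from an
# imaging file named RFIDtag_xxxx_timestamp.raw. The epoch-seconds timestamp
# encoding is an assumption made for illustration only.
import re
from datetime import datetime
from pathlib import Path

FILENAME_RE = re.compile(r"(?P<tag>\w+)_(?P<timestamp>\d+(?:\.\d+)?)\.raw$")

def parse_imaging_filename(path):
    """Return (rfid_tag, acquisition_datetime) for one .raw imaging file."""
    match = FILENAME_RE.search(Path(path).name)
    if match is None:
        raise ValueError(f"Unrecognized file name: {path}")
    tag = match.group("tag")
    when = datetime.fromtimestamp(float(match.group("timestamp")))
    return tag, when

# Hypothetical usage:
# tag, when = parse_imaging_filename("201608423_1467927566.raw")
```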
Article and author information
Author details
Funding
Canadian Institutes of Health Research (FDN-143209)
- Timothy H Murphy
The funders had no role in study design, data collection and interpretation, or the decision to submit the work for publication.
Ethics
Animal experimentation: All procedures were conducted with approval from the University of British Columbia Animal Care Committee and in accordance with guidelines set forth by the Canadian Council for Animal Care.
Copyright
© 2020, Murphy et al.
This article is distributed under the terms of the Creative Commons Attribution License permitting unrestricted use and redistribution provided that the original author and source are credited.
Metrics
- 4,954 views
- 506 downloads
- 30 citations
Views, downloads and citations are aggregated across all versions of this paper published by eLife.
Further reading
- Neuroscience
Our propensity to materiality, which consists in using, making, creating, and passing on technologies, has enabled us to shape the physical world according to our ends. To explain this proclivity, scientists have calibrated their lens to either low-level skills such as motor cognition or high-level skills such as language or social cognition. Yet, little has been said about the intermediate-level cognitive processes that are directly involved in mastering this materiality, that is, technical cognition. We aim to focus on this intermediate level to provide new insights into the neurocognitive bases of human materiality. Here, we show that a technical-reasoning process might be specifically at work in physical problem-solving situations. We found via two distinct neuroimaging studies that the area PF (parietal F) within the left parietal lobe is central for this reasoning process in both tool-use and non-tool-use physical problem-solving and can work along with social-cognitive skills to resolve day-to-day interactions that combine social and physical constraints. Our results demonstrate the existence of a specific cognitive module in the human brain dedicated to materiality, which might be the supporting pillar allowing the accumulation of technical knowledge over generations. Intensifying research on technical cognition could nurture a comprehensive framework that has been missing in fields interested in how early and modern humans have interacted with the physical world through technology, and how this interaction has shaped our history and culture.
- Neuroscience
The question as to whether animals can taste cholesterol is not resolved. This study investigates whether the fruit fly, Drosophila melanogaster, is capable of detecting cholesterol through its gustatory system. We found that flies are indifferent to low levels of cholesterol and avoid higher levels. The avoidance is mediated by gustatory receptor neurons (GRNs), demonstrating that flies can taste cholesterol. The cholesterol-responsive GRNs comprise a subset that also responds to bitter substances. Cholesterol detection depends on five ionotropic receptor (IR) family members, and disrupting any of these genes impairs the flies' ability to avoid cholesterol. Ectopic expression of these IRs in GRNs reveals two classes of cholesterol receptors, each with three shared IRs and one unique subunit. Additionally, expressing cholesterol receptors in sugar-responsive GRNs confers attraction to cholesterol. This study reveals that flies can taste cholesterol, and that the detection depends on IRs in GRNs.