We report an improved, automated, open-source methodology for head-fixed mesoscale cortical imaging and behavioral training of home-cage mice using Raspberry Pi-based hardware. Staged partial and probabilistic restraint allows mice to adjust to self-initiated head-fixation over ~3 weeks, with an approximately 50% participation rate. We support a cue-based behavioral licking task monitored by a capacitive touch-sensor water spout. During automated head-fixation, we acquire spontaneous, movement-triggered, or licking task-evoked GCaMP6 cortical signals. An analysis pipeline marked behavioral events and related cortical fluorescence signals to spontaneous and task-evoked behavioral activity. Mice were trained to suppress licking and wait for cues that marked the delivery of water. Correct rewarded go-trials were associated with widespread activation of midline and lateral barrel cortex areas following a vibration cue, and with delayed frontal and lateral motor cortex activation. Cortical GCaMP signals predicted trial success and correlated strongly with trial-outcome-dependent body movements.
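The trial logic described above (withhold licking, wait for the cue, lick within a response window to receive water) can be sketched in a few lines of Python. The function name and window parameters below are illustrative assumptions, not the authors' published pipeline; lick timestamps are assumed to come from the capacitive touch sensor.

```python
# Hypothetical sketch of go-trial outcome classification from lick timestamps.
# Window durations (pre_window, response_window) are placeholder values, not
# the parameters used in the study.

def classify_trial(lick_times, cue_time, pre_window=2.0, response_window=1.5):
    """Label a go-trial from capacitive-sensor lick timestamps (seconds).

    A trial is "correct" only if the mouse withheld licking for `pre_window`
    seconds before the cue and then licked within `response_window` seconds
    after it; licking during the pre-cue window is "premature"; no response
    after the cue is a "miss".
    """
    if any(cue_time - pre_window <= t < cue_time for t in lick_times):
        return "premature"
    if any(cue_time <= t <= cue_time + response_window for t in lick_times):
        return "correct"
    return "miss"
```

Classifying trials this way lets each outcome label be aligned with the corresponding GCaMP6 imaging epoch downstream.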
- Timothy H Murphy
The funders had no role in study design, data collection and interpretation, or the decision to submit the work for publication.
Animal experimentation: All procedures were conducted with approval from the University of British Columbia Animal Care Committee and in accordance with guidelines set forth by the Canadian Council for Animal Care.
- David Kleinfeld, University of California, San Diego, United States
- Received: February 12, 2020
- Accepted: May 7, 2020
- Accepted Manuscript published: May 15, 2020 (version 1)
© 2020, Murphy et al.
This article is distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use and redistribution provided that the original author and source are credited.