Peer review process
Not revised: This Reviewed Preprint includes the authors’ original preprint (without revision), an eLife assessment, and public reviews.
Read more about eLife’s peer review process.
Editors
- Reviewing Editor: Laura Colgin, University of Texas at Austin, Austin, United States of America
- Senior Editor: Laura Colgin, University of Texas at Austin, Austin, United States of America
Reviewer #1 (Public Review):
Summary:
Horan et al. present a system for the chronic implantation of Neuropixels probes in mice and rats that allows repeated cycles of implantation, explantation, and reuse. A detailed protocol of the procedure, along with technical drawings for the parts of the system, is provided for potential users to adopt the technique in their own laboratories. The authors documented the adoption of this system in ten laboratories, demonstrating that the technique can be widely deployed. The number of neurons recorded over time is reported to indicate that the technique can maintain stable yields.
Strengths:
The authors provide compelling evidence that their technique can be widely deployed and acquired by different laboratories by documenting in detail the success rates at each step of the procedure and the common failure modes across ten laboratories. This is important because an impediment to a laboratory trying out a new technique is a lack of assurance that the technique will work outside the environment where it was originally developed. It is helpful that the authors show that even users who were not directly trained by the original developer of the technique can acquire it from the protocol and technical drawings alone.
Weaknesses:
I would have liked to see more evidence demonstrating the purported advantages of the Repix design ("We found that the key advantage of Repix is robustness and simplicity.") relative to other techniques already available for chronic implantation that allow reuse (Juavinett 2019, Luo 2020, van Daal 2021, Bimbard 2023, Melin 2023). While it is commendable that the authors demonstrate the durability of their design during social interactions, I would have liked to see evidence that aluminum construction (compared to plastic) is necessary to withstand the "rough-and-tumble fights of male mice."
Aluminum parts are typically more expensive than plastic parts, and because machining aluminum is typically slower than 3D printing in plastic, the commitment to aluminum can greatly slow adaptation of the Repix design to specific experimental needs or to future versions of Neuropixels probes. Also, as the authors state, aluminum parts are somewhat heavier than plastic parts. In addition, I am not fully convinced that the Repix design is significantly simpler than existing designs; I would be more convinced if the authors quantified the number of modular components of the Repix system relative to existing designs, or provided an estimate of the time needed to assemble a Repix system compared with an existing design.
The possibility of achieving greater yield using dexamethasone is intriguing, but the authors only show this for rats and one brain region. Were the surgeries that used dexamethasone performed after those that did not? If so, could the improved yield simply reflect improvement in surgical technique? As such, it remains unclear whether dexamethasone actually helps achieve greater yields.
Reviewer #2 (Public Review):
Summary:
This report describes a new "Repix" device for collecting stable, long-term recordings from chronically implanted Neuropixels probes in freely behaving rodents. The device follows the "docking module with payload" design of other similar devices that allows probe explantation and reuse but requires minimal components and is robust to a wide range of rodent behaviors. The docking module is a set of metal posts that are screwed into the payload module (cassette carrying the probe) at one end and cemented to the skull of the animal during surgery at the other end to reversibly anchor the probe to the skull. Loosening of the screws allows the cassette to travel off the posts for explantation. An additional headstage holder and cover are also available for further protection of the implant from mechanical damage during freely moving behaviors. Usage data from almost 200 procedures across multiple labs and users showcase high success rates at all stages of implementation (implantation, data collection, and explantation), even from users without direct training from the original developer of Repix. Device proficiency, defined by the authors as three successive full procedures without failure, was typically achieved within five attempts. Hundreds of neurons were consistently recorded from multiple brain regions, irrespective of animal behavior, Neuropixels probe type, and probe reuse. Impressively, neurophysiological data acquired using Repix have already been published in two studies (one in mice and the other in rats). These findings demonstrate the intended functioning of the device as well as its ease of adoption. The effort to make the Repix system as straightforward as possible (e.g., minimal components and detailed protocols) is evident and will likely be appreciated by new adopters. Furthermore, the cell yield and procedures-to-proficiency data collected from a variety of experiments provide useful data for new adopters to plan their own studies with realistic expectations.
Strengths:
The main claims that the Repix device is "reliable, reusable, [and] versatile" are well-supported.
Weaknesses:
(1) The methodology used to quantify cell yields is concerning, potentially leading to an overestimation of "good" units and a misleading number of "total" units. The authors define "good" unit yield as the number of simultaneously recorded neurons labeled "good" by the automated spike sorter Kilosort without post-hoc manual curation. This definition was used to standardize cell yield between users who would otherwise manually curate cells and introduce individual variability as to what is considered a "good" unit. However, manual curation of spike-sorted output is typically necessary to eliminate false positive units and "merge" spikes belonging to the same neuron that Kilosort identified as belonging to two separate neurons (i.e., spikes that share a refractory period and waveform shape and are localized to the same channels). As such, one may reasonably expect the yield for actual "good" units to be lower than what is reported. Furthermore, including units labeled by Kilosort as multi-unit activity in the "total" yield does not lend itself, by definition, to accurate quantification of individual neurons.
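To make the concern concrete, a label-only yield count of the kind described might look like the minimal sketch below. This is an illustration, not the authors' pipeline; it assumes the standard Kilosort/phy export in which cluster_KSLabel.tsv holds one automatic "good" or "mua" label per cluster, and it deliberately omits any manual curation step, which is exactly the point at issue.

```python
# Hypothetical sketch of a label-only yield count from Kilosort output.
# Assumes the standard Kilosort/phy export, where cluster_KSLabel.tsv lists
# one row per cluster with an automatic "good" or "mua" label; no curation.
import csv
from pathlib import Path

def count_units(sorting_dir):
    """Count clusters by Kilosort's automatic label, with no manual curation."""
    labels = {}
    with open(Path(sorting_dir) / "cluster_KSLabel.tsv", newline="") as f:
        for row in csv.DictReader(f, delimiter="\t"):
            labels[row["cluster_id"]] = row["KSLabel"]
    good = sum(1 for v in labels.values() if v == "good")
    mua = sum(1 for v in labels.values() if v == "mua")
    # A "total" that adds multi-unit clusters to "good" clusters mixes single
    # neurons with multi-unit activity, which is the reviewer's concern.
    return {"good": good, "mua": mua, "total": good + mua}

print(count_units("kilosort_output"))  # directory name is illustrative
```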
(2) For transparency, it would be helpful to restate in the figures or figure captions whether the cell yield data came from mice or rats, and from one lab or multiple labs. Based on the introduction of the paper, one gets the impression that the Repix system was designed for mice and rats and, therefore, that data from mice and rats would be roughly equally represented. This is not the case, as only one-third of the reported Repix implantations were in rats, and cell yield data were shown for only two brain regions in rats (compared with four in mice). The authors state that Repix was designed "... to record neural activities during social interaction of mice" in the Discussion section. It would be helpful for this statement to appear in the Introduction so that it is clear to the reader that Repix was designed for mice but also works well for rats.
(3) Regarding Figure 2, it would be informative to separate these data by species. Does Repix fail more often at a particular procedural stage depending on whether the user is working with mice or rats?
Reviewer #3 (Public Review):
Summary:
Recent work in systems neuroscience has highlighted the importance of studying populations of neurons during naturalistic behaviors, which necessitates the use of cutting-edge devices in freely moving animals. However, such experiments have been costly and experimentally difficult to conduct. In response to this need, Horan et al. developed and thoroughly tested a system called Repix, which allows neuroscientists to record from multiple brain areas in freely moving rodents over many days, even weeks. The authors show that this device enables reasonably stable long-term recordings and that the probe can be reused for different experiments.
Strengths:
I deeply appreciated how thoroughly the authors have tested this across labs and different versions of Neuropixels probes (and even other probes). This is unlike many other papers describing similar devices, which have almost always been developed and tested in a single lab. As such, I think that the Repix device and procedure are very likely to be adopted by even more labs given the robustness of the evidence provided here. The willingness of the authors to allow others to test their device, iterate on the design, and obtain feedback from users is a shining example of how open science and publication should be conducted: with patience and diligence. I'm grateful to the authors for providing this example to the research community.
On a related note, in the discussion, the authors nicely summarize their focus on ease of adoption and highlight other examples from the community that have been successful. I would encourage the authors to think about what else (culturally, economically, etc.) has been helpful in the open-science adoption of software and hardware for electrophysiology, and to think critically about what these movements are still lacking or missing. Given the authors' collective experience in this effort, I believe the broader community would benefit from their perspective.
The final strength of this manuscript is the highly detailed protocol that has itself been peer-reviewed by many users and can be adapted for multiple use cases. The authors also provide specific protocols from individual labs in the main manuscript.
Weaknesses:
(1) Claims about longevity. Given the clear drop-off in units in the amygdala and V1, I felt that the claims about long-term stability (particularly at the one-year mark) were oversold. Readers should note the differences in the lengths of the curves in Figure 3B and take these differences into consideration when setting expectations for the durability of these probes for recordings in V1 or the amygdala (and possibly nearby areas).
(2) Clarity around curve fitting, statistics, and the impact of surgical procedures. I believe the manuscript could benefit from more detail about the curve fitting that was implemented, as well as some of the statistical tests, particularly those related to the dexamethasone experiments. It seems the authors fit an exponential decay to the unit counts over time, but it is not clear that this kind of fit makes sense given the data, which are a bit hard to see. Relatedly, there is a claim on page 10 about the similarity between mouse and rat decay constants in the amygdala, which is hard to evaluate without quantitative evidence.
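As one minimal way to make such a comparison quantitative (a sketch under assumptions, not the authors' analysis), the unit counts per session could be fit to N(t) = N0 * exp(-t / tau) and the fitted decay constants tau reported with their uncertainties for each group; the function names and the data values below are illustrative only.

```python
# Illustrative sketch: fit an exponential decay to unit counts over days and
# report the decay constant tau with an approximate standard error.
# The day and unit-count arrays are made up for demonstration only.
import numpy as np
from scipy.optimize import curve_fit

def exp_decay(t, n0, tau):
    # N(t) = N0 * exp(-t / tau)
    return n0 * np.exp(-t / tau)

days = np.array([1, 5, 10, 20, 40, 80])           # days since implantation (example)
units = np.array([300, 270, 240, 200, 150, 100])  # "good" unit counts (example)

popt, pcov = curve_fit(exp_decay, days, units, p0=(units[0], 30.0))
n0_fit, tau_fit = popt
tau_se = np.sqrt(np.diag(pcov))[1]  # approximate standard error on tau

print(f"N0 = {n0_fit:.0f} units, tau = {tau_fit:.1f} +/- {tau_se:.1f} days")
```

Reporting tau (with confidence intervals) separately for mice and rats, or for dexamethasone and non-dexamethasone animals, would let readers judge the claimed similarities and differences directly.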
It is very useful to know that dexamethasone (an anti-inflammatory used by many labs) could improve stability; however, a more thorough explanation of these experiments is warranted. For example, it should be noted that the dexamethasone animals start with a much higher unit yield. Also, the decay in Figure 5e looks similar between dex and non-dex animals despite the claim in the text that the "decay of unit numbers was slower." Additional details about the curve fitting and statistical tests are needed for readers to evaluate this claim.