Scratch-AID: a deep-learning based system for automatic detection of mouse scratching behavior with high accuracy

  1. Huasheng Yu (corresponding author)
  2. Jingwei Xiong
  3. Adam Yongxin Ye
  4. Suna Li Cranfill
  5. Tariq Cannonier
  6. Mayank Gautam
  7. Marina Zhang
  8. Rayan Bilal
  9. Jong-Eun Park
  10. Yuji Xue
  11. Vidhur Polam
  12. Zora Vujovic
  13. Daniel Dai
  14. William Ong
  15. Jasper Ip
  16. Amanda Hsieh
  17. Nour Mimouni
  18. Alejandra Lozada
  19. Medhini Sosale
  20. Alex Ahn
  21. Minghong Ma
  22. Long Ding
  23. Javier Arsuaga
  24. Wenqin Luo (corresponding author)
  1. University of Pennsylvania, United States
  2. University of California, Davis, United States
  3. Howard Hughes Medical Institute, Harvard Medical School, United States
  4. Massachusetts Institute of Technology, United States

Abstract

Mice are the most commonly used model animals for itch research and for the development of anti-itch drugs. Most labs quantify mouse scratching behavior manually to assess itch intensity, a process that is labor-intensive and limits large-scale genetic or drug screens. In this study, we developed a new system, Scratch-AID (Automatic Itch Detection), which can automatically identify and quantify mouse scratching behavior with high accuracy. Our system comprises a custom-designed videotaping box, which ensures high-quality and replicable recording of mouse behavior, and a convolutional recurrent neural network (CRNN) trained on frame-labeled videos of scratching behavior induced by nape injection of chloroquine (CQ). The best trained network achieved 97.6% recall and 96.9% precision on previously unseen test videos. Remarkably, Scratch-AID reliably identified scratching behavior in other major mouse itch models, including the acute cheek model, the histaminergic model, and a chronic itch model. Moreover, our system detected significant differences in scratching behavior between control mice and mice treated with an anti-itch drug. Taken together, we have established a novel deep learning-based system that is ready to replace manual quantification of mouse scratching behavior in different itch models and in drug screening.
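The recall and precision figures reported above are standard frame-level classification metrics. As a minimal illustration of how such metrics are computed from per-frame scratching predictions (the label sequences and function below are hypothetical, not the authors' evaluation code):

```python
# Frame-level recall and precision for binary scratching detection.
# Labels: 1 = scratching frame, 0 = non-scratching frame.
# The short label sequences below are made up for illustration.

def recall_precision(truth, pred):
    tp = sum(1 for t, p in zip(truth, pred) if t == 1 and p == 1)
    fn = sum(1 for t, p in zip(truth, pred) if t == 1 and p == 0)
    fp = sum(1 for t, p in zip(truth, pred) if t == 0 and p == 1)
    recall = tp / (tp + fn) if tp + fn else 0.0       # fraction of true scratching frames detected
    precision = tp / (tp + fp) if tp + fp else 0.0    # fraction of detections that are correct
    return recall, precision

truth = [0, 1, 1, 1, 1, 0, 0, 1]
pred  = [0, 1, 1, 1, 0, 0, 1, 1]

r, p = recall_precision(truth, pred)
print(f"recall={r:.2f} precision={p:.2f}")  # recall=0.80 precision=0.80
```

High values of both metrics together indicate that the network misses few scratching frames while producing few false alarms.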

Data availability

The training and test videos generated during this study can be downloaded from DRYAD (https://doi.org/10.5061/dryad.mw6m9060s). The code for model training and testing can be downloaded from GitHub (https://github.com/taimeimiaole/Scratch-AID).

The following data sets were generated

Article and author information

Author details

  1. Huasheng Yu

    Department of Neuroscience, University of Pennsylvania, Philadelphia, United States
    For correspondence
    huasheng.yu@pennmedicine.upenn.edu
    Competing interests
    The authors declare that no competing interests exist.
  2. Jingwei Xiong

    Graduate Group in Biostatistics, University of California, Davis, Davis, United States
    Competing interests
    The authors declare that no competing interests exist.
  3. Adam Yongxin Ye

    Program in Cellular and Molecular Medicine, Howard Hughes Medical Institute, Harvard Medical School, Boston, United States
    Competing interests
    The authors declare that no competing interests exist.
  4. Suna Li Cranfill

    Department of Neuroscience, University of Pennsylvania, Philadelphia, United States
    Competing interests
    The authors declare that no competing interests exist.
    ORCID iD: 0000-0002-3431-0061
  5. Tariq Cannonier

    Department of Neuroscience, University of Pennsylvania, Philadelphia, United States
    Competing interests
    The authors declare that no competing interests exist.
  6. Mayank Gautam

    Department of Neuroscience, University of Pennsylvania, Philadelphia, United States
    Competing interests
    The authors declare that no competing interests exist.
    ORCID iD: 0000-0002-7257-5837
  7. Marina Zhang

    Department of Electrical Engineering and Computer Science, Massachusetts Institute of Technology, Cambridge, United States
    Competing interests
    The authors declare that no competing interests exist.
  8. Rayan Bilal

    Department of Neuroscience, University of Pennsylvania, Philadelphia, United States
    Competing interests
    The authors declare that no competing interests exist.
  9. Jong-Eun Park

    Department of Neuroscience, University of Pennsylvania, Philadelphia, United States
    Competing interests
    The authors declare that no competing interests exist.
  10. Yuji Xue

    Department of Neuroscience, University of Pennsylvania, Philadelphia, United States
    Competing interests
    The authors declare that no competing interests exist.
  11. Vidhur Polam

    Department of Neuroscience, University of Pennsylvania, Philadelphia, United States
    Competing interests
    The authors declare that no competing interests exist.
  12. Zora Vujovic

    Department of Neuroscience, University of Pennsylvania, Philadelphia, United States
    Competing interests
    The authors declare that no competing interests exist.
  13. Daniel Dai

    Department of Neuroscience, University of Pennsylvania, Philadelphia, United States
    Competing interests
    The authors declare that no competing interests exist.
  14. William Ong

    Department of Neuroscience, University of Pennsylvania, Philadelphia, United States
    Competing interests
    The authors declare that no competing interests exist.
  15. Jasper Ip

    Department of Neuroscience, University of Pennsylvania, Philadelphia, United States
    Competing interests
    The authors declare that no competing interests exist.
    ORCID iD: 0000-0001-9773-1544
  16. Amanda Hsieh

    Department of Neuroscience, University of Pennsylvania, Philadelphia, United States
    Competing interests
    The authors declare that no competing interests exist.
  17. Nour Mimouni

    Department of Neuroscience, University of Pennsylvania, Philadelphia, United States
    Competing interests
    The authors declare that no competing interests exist.
  18. Alejandra Lozada

    Department of Neuroscience, University of Pennsylvania, Philadelphia, United States
    Competing interests
    The authors declare that no competing interests exist.
  19. Medhini Sosale

    Department of Neuroscience, University of Pennsylvania, Philadelphia, United States
    Competing interests
    The authors declare that no competing interests exist.
  20. Alex Ahn

    Department of Neuroscience, University of Pennsylvania, Philadelphia, United States
    Competing interests
    The authors declare that no competing interests exist.
  21. Minghong Ma

    Department of Neuroscience, University of Pennsylvania, Philadelphia, United States
    Competing interests
    The authors declare that no competing interests exist.
  22. Long Ding

    Department of Neuroscience, University of Pennsylvania, Philadelphia, United States
    Competing interests
    The authors declare that no competing interests exist.
    ORCID iD: 0000-0002-1716-3848
  23. Javier Arsuaga

    Graduate Group in Biostatistics, University of California, Davis, Davis, United States
    Competing interests
    The authors declare that no competing interests exist.
  24. Wenqin Luo

    Department of Neuroscience, University of Pennsylvania, Philadelphia, United States
    For correspondence
    luow@pennmedicine.upenn.edu
    Competing interests
    The authors declare that no competing interests exist.
    ORCID iD: 0000-0002-2486-807X

Funding

National Science Foundation (DMS-1854770)

  • Javier Arsuaga

National Institutes of Health (R01 NS083702)

  • Wenqin Luo

National Institutes of Health (R34 NS118411)

  • Long Ding
  • Wenqin Luo

The funders had no role in study design, data collection and interpretation, or the decision to submit the work for publication.

Ethics

Animal experimentation: Mice were housed in the John Morgan animal facility at the University of Pennsylvania. All animal treatments were conducted in accordance with protocols approved by the Institutional Animal Care and Use Committee and the guidelines of the National Institutes of Health (Protocol No. 804886).

Copyright

© 2022, Yu et al.

This article is distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use and redistribution provided that the original author and source are credited.

Metrics

  • 3,292 views
  • 365 downloads
  • 3 citations

Views, downloads and citations are aggregated across all versions of this paper published by eLife.

Cite this article

Huasheng Yu, Jingwei Xiong, Adam Yongxin Ye, Suna Li Cranfill, Tariq Cannonier, Mayank Gautam, Marina Zhang, Rayan Bilal, Jong-Eun Park, Yuji Xue, Vidhur Polam, Zora Vujovic, Daniel Dai, William Ong, Jasper Ip, Amanda Hsieh, Nour Mimouni, Alejandra Lozada, Medhini Sosale, Alex Ahn, Minghong Ma, Long Ding, Javier Arsuaga, Wenqin Luo (2022) Scratch-AID: a deep-learning based system for automatic detection of mouse scratching behavior with high accuracy. eLife 11:e84042. https://doi.org/10.7554/eLife.84042

