TY  - JOUR
TI  - DeepFly3D, a deep learning-based approach for 3D limb and appendage tracking in tethered, adult Drosophila
AU  - Günel, Semih
AU  - Rhodin, Helge
AU  - Morales, Daniel
AU  - Campagnolo, João
AU  - Ramdya, Pavan
AU  - Fua, Pascal
A2  - O'Leary, Timothy
A2  - Calabrese, Ronald L
A2  - Shaevitz, Josh W
VL  - 8
PY  - 2019
DA  - 2019/10/04
SP  - e48571
C1  - eLife 2019;8:e48571
DO  - 10.7554/eLife.48571
UR  - https://doi.org/10.7554/eLife.48571
AB  - Studying how neural circuits orchestrate limbed behaviors requires the precise measurement of the positions of each appendage in three-dimensional (3D) space. Deep neural networks can estimate two-dimensional (2D) pose in freely behaving and tethered animals. However, the unique challenges associated with transforming these 2D measurements into reliable and precise 3D poses have not been addressed for small animals, including the fly, Drosophila melanogaster. Here, we present DeepFly3D, a software that infers the 3D pose of tethered, adult Drosophila using multiple camera images. DeepFly3D does not require manual calibration, uses pictorial structures to automatically detect and correct pose estimation errors, and uses active learning to iteratively improve performance. We demonstrate more accurate unsupervised behavioral embedding using 3D joint angles rather than commonly used 2D pose data. Thus, DeepFly3D enables the automated acquisition of Drosophila behavioral measurements at an unprecedented level of detail for a variety of biological applications.
KW  - 3D pose estimation
KW  - animal behavior
KW  - deep learning
KW  - computer vision
KW  - unsupervised classification
JF  - eLife
SN  - 2050-084X
PB  - eLife Sciences Publications, Ltd
ER  - 