Abstract

Zebrafish larvae show characteristic prey capture behavior in response to small moving objects, but the neural mechanism by which they recognize objects as prey remains largely unknown. We devised a machine-learning behavior classification system to quantify hunting kinematics in semi-restrained animals exposed to a range of virtual stimuli. Two-photon calcium imaging revealed a small visual area, AF7, that was activated specifically by the optimal prey stimulus. This pretectal region is innervated by two types of retinal ganglion cells, which also send collaterals to the optic tectum. Laser ablation of AF7 markedly reduced prey capture behavior. We identified neurons with arbors in AF7 and found that they projected to multiple sensory and premotor areas: the optic tectum, the nucleus of the medial longitudinal fasciculus (nMLF), and the hindbrain. These findings indicate that computations in the retina give rise to a visual stream that transforms sensory information into a directed prey capture response.