Ching Fang, Dmitriy Aronov ... Emily L Mackevicius
A recurrent network using a simple, biologically plausible learning rule can learn the successor representation, suggesting that long-horizon predictions are computations readily accessible to neural circuits.
A close approximation to the successor representation is learnt by a simple spike-timing-dependent learning rule between cells undergoing theta phase precession.
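For readers unfamiliar with the successor representation (SR), a minimal sketch may help. It is not the authors' model: the ring environment, discount factor, and learning rate below are illustrative assumptions. It contrasts the closed-form SR of a random-walk policy with an incremental, temporally asymmetric update of the kind a Hebbian rule acting on pre-before-post activity could in principle approximate.

```python
import numpy as np

# Illustrative sketch (not the paper's model): successor representation (SR)
# for an unbiased random walk on a ring, learned two ways.

n_states = 10
gamma = 0.9  # assumed discount factor

# Transition matrix T for a random walk on a ring of n_states.
T = np.zeros((n_states, n_states))
for s in range(n_states):
    T[s, (s - 1) % n_states] = 0.5
    T[s, (s + 1) % n_states] = 0.5

# Closed-form SR: M = (I - gamma * T)^{-1}.
# M[s, s'] is the expected discounted number of future visits to s' from s.
M_exact = np.linalg.inv(np.eye(n_states) - gamma * T)

# Incremental estimate from observed transitions via a TD(0)-style update,
# the kind of temporally asymmetric association a spike-timing-dependent
# rule could approximate.
M_est = np.eye(n_states)
lr = 0.05            # assumed learning rate
rng = np.random.default_rng(0)
s = 0
for _ in range(50_000):
    s_next = (s + rng.choice([-1, 1])) % n_states
    onehot = np.eye(n_states)[s]
    # Update the row of M associated with the current state.
    td_error = onehot + gamma * M_est[s_next] - M_est[s]
    M_est[s] += lr * td_error
    s = s_next

print("max |M_est - M_exact|:", np.abs(M_est - M_exact).max())
```

The incremental estimate converges toward the closed-form matrix, showing that long-horizon predictive maps can be built up from purely local, one-step experience.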