Ground texture, trajectories and retinal curl distributions across conditions.

(A) Snapshot of the ground texture based on simplex noise. Yellow lines indicate optic flow vectors computed using the Farnebäck algorithm. A clear rotational component (curl) is visible, consistent with the observer looking at a point located to the left of the simulated path (yellow dot). (B) Schematic of the three trajectories. Position (0, 0) represents the starting point of simulated locomotion. The five colored dots mark the fixation points in world coordinates at the beginning of each trial (20 m ahead of the observer). (C–E) Distributions of mean retinal curl across trials for left-, center-, and right-gaze conditions, respectively. Dark filled bars indicate the unaltered curl condition, while lighter bars and outlined steps represent the cancelled curl condition. The vertical grey line denotes zero curl.
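The rotational component visible in panel (A) can be quantified as the scalar curl of the dense flow field (dv/dx − du/dy). A minimal numpy sketch of that computation follows; the flow field itself would come from a dense estimator such as the Farnebäck algorithm, and the synthetic rotating field below is purely illustrative, not the study's stimulus:

```python
import numpy as np

def scalar_curl(u, v):
    """2D scalar curl of a flow field: dv/dx - du/dy.
    u, v: arrays of horizontal/vertical flow components (pixels/frame)."""
    dv_dx = np.gradient(v, axis=1)
    du_dy = np.gradient(u, axis=0)
    return dv_dx - du_dy

# Synthetic check: a rigid rotation with angular rate omega has curl 2*omega.
h, w = 64, 64
y, x = np.mgrid[0:h, 0:w].astype(float)
yc, xc = y - h / 2, x - w / 2
omega = 0.01                      # rad/frame, counter-clockwise
u, v = -omega * yc, omega * xc    # rotational flow field
mean_curl = float(np.mean(scalar_curl(u, v)))   # -> 0.02 (= 2 * omega)
```

Averaging this quantity over a trial gives a single "mean retinal curl" value of the kind summarised in panels (C–E).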

Perceived heading

Reported instantaneous heading directions while fixating eccentric points on the ground. Rows: fixation 2 m (top) or 4 m (bottom) to the side. Columns: different physical path conditions (centre column = straight path). Colours: initial gaze direction (left / centre / right). Note that on curved paths, gaze can reverse the direction sign relative to heading (e.g. a left curve with initial fixation to the left). Thick coloured lines denote the mean across observers; thin coloured lines denote individual observers. Dark grey lines show the physical paths. Axes show lateral position (x-axis) versus depth position (y-axis).

Average perceived heading for each gaze condition (columns) and each physical path curvature (rows), separately for the three retinal–flow manipulation conditions: unaltered curl (yellow-green), cancelled curl (cyan), and over-cancelled curl (purple).

Shaded envelopes indicate between-observer variability (95% CI). The right axis applies to the last column and shows the mean 2D displacement between observed and physical paths, i.e. the mean displacement required for the observed path to align with the physical one. The displacement is shown for the different flow manipulations (colour-coded). Error bars indicate between-observer variability (95% CI).
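One plausible reading of the path-alignment metric described above is the mean Euclidean distance between corresponding points of the observed and physical paths. The sketch below illustrates that reading only; the function name and the exact definition used in the analysis are assumptions:

```python
import numpy as np

def mean_path_displacement(observed, physical):
    """Mean 2D Euclidean distance between corresponding points of two
    paths, each of shape (n, 2). A plausible reading of the alignment
    metric; the exact definition used in the analysis may differ."""
    observed = np.asarray(observed, dtype=float)
    physical = np.asarray(physical, dtype=float)
    return float(np.mean(np.linalg.norm(observed - physical, axis=1)))

# Toy example: point-wise distances are 3 and 4, so the mean is 3.5.
d = mean_path_displacement([[0, 0], [1, 1]], [[0, 3], [1, 5]])  # -> 3.5
```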

Separate fits of the controller.

Average perceived heading across participants for each gaze condition (columns) and each physical path curvature (rows), shown separately for the three retinal-flow manipulation conditions (thinner solid lines): unaltered curl (yellow-green), cancelled curl (cyan), and over-cancelled curl (purple). Thicker red solid lines denote the best fit of the controller. The number in each panel indicates the average lateral deviation per step between the fit and the observed heading. Note that for centred gaze on the straight path, only the unaltered flow condition was fitted.

Neural network model of gaze-contingent heading bias.

The parameters used to produce these panels are: I0 = 0.03, Kp = 0.4, σp = 0.18, σk = 0.12. (A) Ring attractor connectivity showing synaptic weight (W) as a function of relative preferred heading (Δϕ) in pixels, featuring a central excitatory peak and an asymmetric inhibitory surround. While we present results using asymmetric connectivity, no significant differences were observed between asymmetric and symmetric configurations. (B) Heatmap of neural activity across neuron preferred headings (y-axis) over the frame index (x-axis), with a dashed white line indicating the decoded population heading estimate for the trial in which gaze was directed 4 m to the left. (C) Mechanism of sensory-prior competition, plotting normalized amplitude against ring position in pixels and illustrating the spatial alignment of the straight-ahead prior (blue dotted line), gaze-centered inhibition (red dashed line), and the resulting neural activity bump (black solid line). (D) Phase portrait showing the change in heading (dθ/dt) versus the heading estimate (θ) in pixels, with a horizontal line at zero marking the convergence of the trajectory toward a stable fixed point. The color bar denotes frame number.
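A centre-surround connectivity of the kind shown in panel (A) can be sketched as a difference of Gaussians whose inhibitory lobe is laterally shifted to produce the asymmetric surround. This is an illustration only: the parameter names and values below (exc_sigma, inh_shift, etc.) are assumptions and do not correspond to the model's fitted I0, Kp, σp, σk:

```python
import numpy as np

def ring_weights(n=360, exc_sigma=10.0, inh_sigma=40.0,
                 inh_gain=0.5, inh_shift=15.0):
    """Hypothetical ring-attractor connectivity W(dphi): a central
    excitatory Gaussian minus an inhibitory Gaussian shifted by
    `inh_shift` units (the asymmetric surround). All parameters are
    illustrative, not the model's fitted values."""
    # Relative preferred heading, wrapped to [-n/2, n/2)
    dphi = (np.arange(n) + n // 2) % n - n // 2
    exc = np.exp(-dphi**2 / (2 * exc_sigma**2))
    inh = inh_gain * np.exp(-(dphi - inh_shift)**2 / (2 * inh_sigma**2))
    return exc - inh

W = ring_weights()
```

With a shifted inhibitory lobe, the weight profile is excitatory at Δϕ = 0, inhibitory in the flanks, and unequal at ±Δϕ, matching the qualitative shape described for panel (A).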