Abstract
Prevailing models of heading perception assume that humans need to recover the Focus of Expansion (FOE) while filtering out rotational flow (curl) caused by eye movements. We propose an alternative: the visual system utilizes retinal curl directly to estimate heading, rendering the explicit recovery of the FOE unnecessary. Stationary participants viewed simulated walking paths on a large screen while fixating on ground targets at varying eccentricities—a natural behavior inducing sustained retinal curl. Participants continuously reported perceived heading. To isolate the role of rotational flow, we employed a real-time manipulation that kept translational flow constant while the foveal curl component was either intact, cancelled, or overcancelled. Under natural conditions, participants exhibited systematic heading biases opposite the direction of gaze. Crucially, these biases vanished in the ‘cancelled curl’ condition, identifying retinal curl as the specific driver of perceptual bias. We modeled these results using a simple feedback controller and a ring-attractor neural network featuring gaze-contingent inhibition and a ‘straight-ahead’ prior. These findings suggest the brain exploits the geometry of gaze stabilization to simplify navigation, treating retinal curl as a functional signal rather than noise to be filtered.
Introduction
Humans and many vertebrates rely on stabilizing gaze on regions of interest in the world to achieve accurate perception. This fixation strategy, combined with the continuous eye, head, and body movements that accompany natural behavior, generates highly structured patterns of motion on the retina that can, in principle, be exploited to control locomotion. A particularly simple and influential case arises when we move approximately in the same direction as we are looking: in this situation the focus of expansion (FOE), the point in the optic flow from which motion vectors appear to radiate, directly specifies the heading direction (Gibson, 1950). The idea that the visual system uses the FOE to recover heading has subsequently dominated both theoretical and empirical work, including the interpretation of neural activity in primate medial superior temporal (MST) area, where neurons exhibit FOE-like tuning (Bremmer et al., 2017; Britten, 2008; Duffy and Wurtz, 1991; Kaminiarz et al., 2014), as well as psychophysical findings showing that humans can make highly accurate heading judgements from optic flow (Li and Warren, 2002; Li and Warren, 2000; Warren and Hannon, 1990; Warren and Hannon, 1988). Building on these observations, computational models have proposed population codes and neural network mechanisms (Beintema et al., 2004; Berg and Beintema, 2000; Heeger and Jepson, 1992; Lappe and Rauschecker, 1993; Perrone and Stone, 1994) that operate on radial flow structure to recover the translation direction within this FOE-centric framework.
However, this FOE-centric view faces several fundamental challenges. In natural environments, observers rarely maintain fixation on their heading direction; instead, they typically look toward behaviorally relevant areas, sometimes located eccentrically in the visual field (Matthis et al., 2018). Under combined eye–head movements, the retinal FOE no longer coincides with the true heading direction, creating a computational problem for relying on the FOE for heading estimation (Royden et al., 1994, 1992), because in this case the retinal flow contains both translational and rotational components. To preserve FOE-based control of navigation, two main strategies have been proposed to decompose the flow field: retinal approaches that attempt to detect and compensate for the rotational component directly from the retinal flow itself (e.g., (Cutting et al., 1992; Heeger and Jepson, 1992; Lappe and Rauschecker, 1993; Longuet-Higgins and Prazdny, 1980)), and extra-retinal approaches that rely on eye movement signals to factor out the rotation (Lappe, 1998; Royden et al., 1994).
Despite these challenges, a functional role of rotational flow components has been largely neglected in both theoretical and experimental work on heading perception. Although some models have included rotation sensitivity or extra-retinal signals to account for eye-movement compensation (Beintema and van den Berg, 1998; Koenderink and Van Doorn, 1981; Lappe, 1998), the curl component of optic flow has typically been treated as something to be removed rather than as a potential source of heading information in its own right. A few exceptions include studies that have considered motion-parallax cues, implying rotation, to determine instantaneous heading (Li and Warren, 2000) or to estimate gaze direction (Cutting, 1986).
Recent evidence suggests that this oversight may be consequential. In real walking scenarios, the FOE expressed in a head-centred reference frame is too variable to be used reliably for heading recovery, whereas the magnitude of retinal curl in the fovea can specify the body trajectory relative to gaze (Matthis et al., 2022). Psychophysical studies have also shown that heading estimates exhibit systematic biases that FOE-based mechanisms cannot fully account for under simulated rotation — i.e. when the retinal flow is indistinguishable from that produced by actual eye or head rotations but lacks the accompanying vestibular or proprioceptive signals (Banks et al., 1996; van den Berg and Brenner, 1994). This bias suggests that rotation is not merely treated as noise, but instead provides information used in heading perception. In addition, neurophysiological work has shown that most MSTd neurons are sensitive to spiral motion (Graziano et al., 1994), consistent with a continuum of motion-pattern sensitivity (Layton and Browning, 2014) rather than a decomposition into separate channels (e.g. expansion–contraction vs. rotation components). This continuum matches the way the strength of rotational components in the retinal image varies with gaze eccentricity, making gaze-stabilisation or fixation strategies critical for understanding the use of retinal flow patterns (Angelaki and Hess, 2005; Calow and Lappe, 2008; Glennerster et al., 2001).
In this work, we provide empirical evidence that curl can serve as a control variable during locomotion. By relying on gaze stabilization, which generates a retinal curl component, we reveal a robust heading bias that emerges from prolonged exposure to this underlying curl. The direction of this bias differs from the previously reported simulated-rotation bias (see supplementary videos S1 and S2): when observers viewed flow consistent with linear walking while fixating an eccentric ground target, perceived heading shifted opposite to gaze (left fixation led to rightward heading, and vice versa). The magnitude of this bias matches the curl expected from gaze-centred flow geometry, and cancelling curl eliminates the bias. A simple controller based on mean image curl reproduces heading behaviour across path conditions. Finally, we present a neurally plausible implementation of this controller illustrating how parietal circuits can transform local curl near gaze into global heading estimates via recurrent dynamics and competition between sensory evidence and priors—without explicit FOE extraction. This reframes retinal curl from a nuisance component to a functional control signal for heading.
Methods
Participants
We tested 12 participants (five self-identified men and seven self-identified women), aged between 24 and 59 years (mean = 30, SD = 9), all with normal or corrected-to-normal vision. Except for one, all participants were naïve to the aims of the study and volunteered to take part. The study forms part of an ongoing research program approved by the Ethics Committee of the University of Barcelona and conducted in accordance with the principles of the Declaration of Helsinki.
Displays and conditions
Participant motion was simulated as a translation parallel to the ground plane at a sustained walking speed of approximately 1 m/s. This motion incorporated characteristic bounce and swing components derived from a single gait profile. This profile was recorded once by an independent individual wearing an HMD tracker while walking along the predefined experimental paths in a virtual environment, and then was applied to all participants to ensure stimulus consistency. Each trial lasted between 11 and 12 s.
The ground plane consisted of a 50 × 50 m surface mapped with a naturalistic texture (see Fig. 1A) generated from simplex noise patterns whose spatial power spectrum followed a 1/f² distribution. The temporal frequencies that simulated self-motion generates over these patterns are consistent with the statistics of natural videos as described by Dong and Atick (1995), also following a 1/f-type temporal power spectrum (exponent of 2). The textures were created in real time using OpenGL shaders within a custom Python program, designed for computational efficiency and allowing online manipulation of the texture in specific experimental conditions (see Flow manipulation conditions below).
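The experimental textures were produced in real time by OpenGL shaders; purely as an offline illustration, a texture with a 1/f² spatial power spectrum can also be synthesized by spectrally shaping white noise. Everything in this sketch (function name, size, seed, the FFT route itself) is our own assumption, not the shader implementation used in the experiment:

```python
import numpy as np

def make_1_over_f2_texture(size=512, exponent=2.0, seed=0):
    """Grayscale texture whose spatial power spectrum falls off as
    1/f**exponent, obtained by filtering white noise in the Fourier
    domain (power 1/f^2 corresponds to amplitude 1/f)."""
    rng = np.random.default_rng(seed)
    white = rng.standard_normal((size, size))
    fx = np.fft.fftfreq(size)
    fy = np.fft.fftfreq(size)
    f = np.sqrt(fx[None, :] ** 2 + fy[:, None] ** 2)
    f[0, 0] = 1.0                         # avoid division by zero at DC
    spectrum = np.fft.fft2(white) / f ** (exponent / 2.0)
    tex = np.real(np.fft.ifft2(spectrum))
    return (tex - tex.min()) / (tex.max() - tex.min())  # scale to [0, 1]

texture = make_1_over_f2_texture()
```

Dividing the amplitude spectrum by f (not f²) is what yields the 1/f² power spectrum, since power is the squared amplitude.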

Ground texture, trajectories and retinal curl distributions across conditions.
(A) Snapshot of the ground texture based on simplex noise. Yellow lines indicate optic flow vectors computed using the Farnebäck algorithm. A clear rotational component (curl) is visible, consistent with the observer looking at a point located to the left of the simulated path (yellow dot). (B) Schematic of the three trajectories. Position (0, 0) represents the starting point of simulated locomotion. The five colored dots mark the fixation points in world coordinates at the beginning of each trial (20 m ahead of the observer). (C–E) Distributions of mean retinal curl across trials for left-, center-, and right-gaze conditions, respectively. Dark filled bars indicate the unaltered curl condition, while lighter bars and outlined steps represent the cancelled curl condition. The vertical grey line denotes zero curl.
The experiment was run on an Intel i7-based workstation (i7-9700F, Intel, Santa Clara, CA, USA) equipped with an NVIDIA GeForce RTX 2060 SUPER GPU. Images were rendered at 120 Hz with a resolution of 1920 × 1080 pixels and displayed monocularly via a PROPixx projector (VPIxx Technologies, Saint-Bruno, QC, Canada) onto a back-projection screen (2.03 m × 1.16 m), viewed from a distance of 1.0 m, resulting in a visual field of approximately 91°.
To verify fixation, eye movements were recorded with a Pupil Labs Core (Berlin, Germany) eye tracker operating at 200 Hz. Trials were discarded if the median distance between gaze position and the fixation point exceeded 3°.
Path conditions
Participant trajectories could be either straight (length of about 6.5 m) or curved to the left or right (see Fig. 1B). The curved paths resulted in a final heading of ±45° relative to the initial heading (0°) in world coordinates. The three path types (straight, left, and right) were interleaved on a trial-by-trial basis.
Eccentricity conditions
In all trials, a fixation point (yellow dot in Figure 1A) was presented on the ground and remained fixed in world coordinates. Relative to the participant’s initial position (x = 0, z = 0) and heading (0°), the fixation point could appear at one of five lateral positions or eccentricities, x = {−4, −2, 0, 2, 4} m, and was initially located 20 m ahead (see colored dots in Fig. 1B). As simulated self-motion progressed, the fixation point appeared to approach the observer, necessitating a gradual increase in gaze angle. Participants were not explicitly instructed to use specific eye or head movements to maintain fixation; instead, they were permitted to engage the head-eye system naturally to track the target. Under perfect fixation, for the largest eccentricity the expected head/eye rotation rate increased from about 0.4°/s to 0.8°/s over the course of a trial (see Figure S2 in the SI). The eccentricity condition was randomized across trials.
Flow (curl) manipulation conditions
When fixating a stationary eccentric point while translating straight ahead, the retinal image contains an expected rotational component, referred to as curl, around the fovea. This effect is illustrated by the flow lines in Fig. 1A, which shows a snapshot of the retinal image consistent with walking straight ahead while fixating a point located to the left (positive curl). Figures 1C–E display the mean curl distributions for fixation points at varying eccentricities: leftward positions (−4 m, −2 m) produced positive curl, the center (0 m) resulted in near-zero curl, and rightward positions (2 m, 4 m) produced negative curl.
To compute the curl shown in Figs. 1C–E, we simulated the experimental trials and generated the corresponding videos (downsampled to 800 × 452 pixels at 30 Hz). Optic flow was then computed using the Farnebäck algorithm (calcOpticalFlowFarneback, OpenCV), which gives an estimate of image motion for every pixel (x, y) at each frame t. The curl was extracted for each frame sequence from the 2D image flow:

curl(x, y, t) = ∂v/∂x − ∂u/∂y,    (1)

where u and v are the horizontal and vertical flow components respectively (in pixels per frame). We computed the mean image curl ω̄t by averaging curl(x, y, t) over all pixels of each frame.
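As an illustration, the mean image curl can be estimated from a dense flow field (in practice obtained with OpenCV's calcOpticalFlowFarneback) by finite differences. The sketch below, using NumPy's gradient and a synthetic rotation field as a sanity check, follows Equation 1; the function name and the check are ours:

```python
import numpy as np

def mean_image_curl(u, v):
    """Mean curl of a dense flow field (Equation 1 averaged over the
    image): curl = dv/dx - du/dy, with x along columns (axis 1) and
    y along rows (axis 0), in units of 1/frame."""
    dv_dx = np.gradient(v, axis=1)
    du_dy = np.gradient(u, axis=0)
    return float(np.mean(dv_dx - du_dy))

# Sanity check on a synthetic rotation about the image centre:
# u = -omega*(y - cy), v = omega*(x - cx) has curl 2*omega everywhere.
h, w = 100, 100
y, x = np.mgrid[0:h, 0:w]
omega = 0.01
u = -omega * (y - h / 2.0)
v = omega * (x - w / 2.0)
estimated = mean_image_curl(u, v)  # ≈ 2 * omega = 0.02
```

Because the synthetic field is linear in x and y, the finite-difference gradient recovers the analytic curl exactly, including at the image borders.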
In some conditions, this expected curl was counteracted (cancelled curl condition) by adding an equal and opposite rotational component centered on the fixation point. Figure 1C,E also show the corresponding curl histograms (lighter colors), which are centered at zero—matching the distribution obtained when fixating straight ahead in the direction of motion (mean zero curl, Figure 1D). In an additional set of trials, the imposed counter-rotation was doubled to produce an overcancelled condition, in which the curl was reversed beyond neutralization (histograms not shown in Figure 1).
Procedure
Participants continuously reported their perceived direction of self-motion while maintaining fixation on the yellow dot. Responses were given with a rotary encoder, operated like a steering wheel, by aligning its orientation with the perceived heading direction. The device had an angular resolution of 0.3°. At the beginning of each session, participants completed a five-point calibration procedure for the eye tracker. Each session comprised 45 trials (3 trajectory types × 5 fixation positions × 3 flow manipulation conditions), and each participant completed a total of 10 sessions.
Computation of perceived path
To reconstruct the participant’s estimated path from the angular response, we treated the reported heading angle θt (radians) as specifying the instantaneous lateral slope of the perceived trajectory at frame t. For each time step, we computed the incremental simulated forward displacement

Δzt = v Δt,    (2)

and converted the response angle into a lateral increment via

Δxt = Δzt tan(θt).    (3)

The estimated lateral position was then obtained by forward integration starting from the initial position x0:

xt = x0 + Στ≤t Δxτ.    (4)

This yields the sample-wise reconstructed lateral trajectory xt consistent with the participant’s angular responses.
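This forward integration can be sketched as follows (NumPy; the function name and the constant-report usage example are ours):

```python
import numpy as np

def reconstruct_lateral_path(theta, v, dt, x0=0.0):
    """Integrate reported heading angles theta (radians) into a
    lateral trajectory: dz_t = v*dt, dx_t = dz_t*tan(theta_t),
    x_t = x0 + cumulative sum of the lateral increments."""
    theta = np.asarray(theta, dtype=float)
    dz = v * dt * np.ones_like(theta)   # incremental forward displacement
    dx = dz * np.tan(theta)             # incremental lateral displacement
    return x0 + np.cumsum(dx), np.cumsum(dz)

# A constant 5-deg rightward report at 1 m/s for 10 s should integrate
# to a straight oblique line with x_final = z_final * tan(5 deg).
theta = np.full(1000, np.deg2rad(5.0))
x, z = reconstruct_lateral_path(theta, v=1.0, dt=0.01)
```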
Real-Time Control Model (The controller)
We interpret the reported path as the output of a curl-driven feedback mechanism inspired by point attractor models used previously in steering control (Fajen and Warren, 2007; Wilkie and Wann, 2003a). Let θt denote the heading direction (direction of forward velocity in world coordinates) and ψt the gaze direction. We consider their difference

φt = θt − ψt

as an instantaneous heading error. The mean optic–flow curl ω̄t provides a direct measurement of this error,

ω̄t = −Kc φt,

and the controller turns the heading at a rate proportional to the curl:

dθ/dt = Kp ω̄t = −Kp Kc φt,    (5)

so that positive curl induces a leftward turn (increasing θ) and negative curl induces a rightward turn. If gaze is held fixed (e.g. steady fixation), dψ/dt = 0 and the error evolves as

dφ/dt = −Kp Kc φ.

This is a simple first-order linear differential equation, and the heading error decays exponentially with a correction time constant 1/Kp (taking Kc = 1):

φ(t) = φ0 exp(−Kp Kc t).

Thus, heading is continuously steered toward gaze until curl vanishes. In our reconstruction we integrate this controlled heading forward in time to generate the perceived trajectory implied by the observed curl (see video S3 in the SI).
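The exponential decay of the heading error under fixed gaze can be checked numerically. This sketch Euler-integrates the error dynamics with illustrative values of Kp and the initial error (all values are ours, not fitted):

```python
import numpy as np

def simulate_heading_error(phi0, Kp, Kc=1.0, dt=0.001, T=5.0):
    """Euler-integrate the error dynamics dphi/dt = -Kp*Kc*phi that
    hold while gaze is fixed: the curl signal is proportional to the
    heading error and the yaw command is proportional to the curl."""
    n = int(T / dt)
    phi = np.empty(n + 1)
    phi[0] = phi0
    for t in range(n):
        phi[t + 1] = phi[t] - Kp * Kc * phi[t] * dt
    return phi

phi = simulate_heading_error(phi0=np.deg2rad(10.0), Kp=2.0)
analytic = np.deg2rad(10.0) * np.exp(-2.0 * 5.0)   # phi0 * exp(-Kp*T)
```

With Kp = 2 the error has a time constant of 0.5 s, so after 5 s the numerical and closed-form solutions agree to within the Euler discretization error.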
Trajectory fits
To test if perceived trajectories can be accounted for by the curl signal, we used the controller defined in Equation 5, which we integrated forward with an explicit Euler step to obtain the estimated heading θ. With sampling interval Δt, constant forward speed (v), and initial state (θ0, x0, z0) taken from the first sample of each trial, the discrete update is

θt+1 = θt + Kp ω̄t Δt,
x̂t+1 = x̂t + v Δt sin(θt+1),
ẑt+1 = ẑt + v Δt cos(θt+1).

This generates a predicted 2D path

P̂ = {(x̂t, ẑt)},

whose deviation from the data is scored by the dynamic time warping (DTW) alignment cost D = DTW(P̂, P), where

P = {(xt, zt)}

is the observed path.
We fix Kc (Equation 5) to 1, absorbing the unknown curl-to-angle scale into Kp.
We used two fitting approaches: (1) independent or separate fits, in which parameters were fitted independently for each combination of heading, gaze eccentricity, and retinal flow manipulation, and (2) joint fits, in which a single set of parameters was shared across all heading and gaze eccentricity conditions but allowed to differ between flow manipulations. Each approach was tested with two parameters (Kp and the forward velocity v). The motivation for including forward velocity in the fit was to compensate for variations in the timing of perceived heading responses. The separate fits therefore used 30 parameters (Kp and v for each of the 15 heading × eccentricity conditions), whereas the joint approach used only 2 parameters for all heading and eccentricity conditions.
In order to compare the different models, we employed information criteria adapted for distance-based model comparison. For each model, we computed a surrogate log-likelihood based on the normalized alignment cost:

ℓ = −(n/2) ln(D/n),

where D represents the total DTW distance and n the number of observations, with D/n representing the average alignment cost per observation. This normalization ensures appropriate scaling for information criteria computation. We then computed the Akaike Information Criterion as:

AIC = 2k − 2ℓ,

where k represents the number of parameters.
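For illustration, the alignment cost and model comparison can be sketched as below. The DTW here is a minimal textbook implementation (the analysis may well have used a library routine), and the distances, observation counts, and parameter counts in the usage example are made-up numbers, not the study's results:

```python
import numpy as np

def dtw_distance(path_a, path_b):
    """Total dynamic-time-warping alignment cost between two 2D paths
    (arrays of shape (n, 2)), with Euclidean point-to-point costs."""
    a, b = np.asarray(path_a, float), np.asarray(path_b, float)
    n, m = len(a), len(b)
    acc = np.full((n + 1, m + 1), np.inf)
    acc[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = np.linalg.norm(a[i - 1] - b[j - 1])
            acc[i, j] = cost + min(acc[i - 1, j], acc[i, j - 1],
                                   acc[i - 1, j - 1])
    return acc[n, m]

def aic_from_dtw(D_total, n_obs, k_params):
    """Surrogate log-likelihood from the normalized alignment cost,
    then AIC = 2k - 2*loglik."""
    loglik = -0.5 * n_obs * np.log(D_total / n_obs)
    return 2.0 * k_params - 2.0 * loglik

# With these made-up numbers, a joint model's lower parameter count
# outweighs its slightly larger alignment cost.
aic_separate = aic_from_dtw(D_total=4.0, n_obs=500, k_params=30)
aic_joint = aic_from_dtw(D_total=4.2, n_obs=500, k_params=2)
```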
Results
All the data and code for the analysis are available at osf link. First, we checked the quality of fixation. The median trial-based deviation from the fixation dot ranged from 1.02° to 1.55° across subjects. On average, the gaze-to-fixation-dot distance exceeded 3° in only 0.46% of trials, which were removed from further analysis.
Figure 2 shows the instant perceived heading in the different gaze and path conditions in which retinal flow was not altered. The thin traces show this pattern for individual observers, and the thick coloured traces show the group means. Figure 2 illustrates a clear and systematic lateral bias in perceived heading as a function of the initial gaze direction. When simulated motion was straight but observers fixated an eccentric point on the ground (central vertical panels, blue and green lines), their reported instantaneous heading consistently shifted away from the fixation point: leftward fixation led to rightward heading reports, and rightward fixation produced the opposite pattern. The perceived final lateral displacement was significantly different from 0 (straight ahead, t(11)=5.172, p<0.001) and the magnitude of the bias did not differ between sides (F(1,35)<1, p=0.4). This effect was present both when gaze was displaced by 2 m (top row) and 4 m (bottom row), and it scaled with the lateral distance of the fixation point from the trajectory path (the difference in final lateral distance between the 2 m and 4 m conditions was marginally significant, F(1,35)=3.49, p=0.07). As we shall see below, the magnitude of the bias is well accounted for by the mean curl present in the stimulus.

Perceived heading
Reported instant directions are plotted when fixating eccentric points on the ground. Rows: fixation 2 m (top) and 4 m (bottom) to the side. Columns: different physical path conditions (centre column = straight path). Colours: initial gaze direction (left / centre / right). Note that on curved paths gaze might change the direction sign relative to heading (i.e. left curve while initially fixating on the left). Thick coloured lines denote the mean across observers. Thin coloured lines denote individual observers. Dark grey: physical paths. Axes show lateral position (x-axis) versus depth position (y-axis).
Note that when both simulated motion and gaze are straight ahead (orange traces), there is essentially no bias: the path is perceived as straight. This is consistent with curl staying near zero throughout the trial, so the controller receives no systematic steering signal (see Figure 1D). In this case, heading and gaze remain aligned and the perceived trajectory stays close to the physical straight path.
Although the bias is more easily interpretable in the straight-ahead condition, systematic effects due to gaze direction are present in the curved paths as well. Perceived curvature was systematically underestimated when gaze was directed in the direction of the physical curve: for leftward curves this underestimation is most evident when gaze is to the left (blue traces), and for rightward curves when gaze is to the right (green traces). This follows directly from the curl dynamics: as the observer-simulated motion proceeds along the curved path, the instantaneous mean curl decreases, because the heading and gaze directions become progressively aligned (i.e. ϕ(t) → 0). Under the proportional controller we have defined above, the yaw command scales with this error, ω(t) = Kpϕ(t), hence reduced curl implies reduced corrective turning. The model, therefore, predicts a progressive flattening of the perceived path as curl dies out over time. This is exactly the pattern observed empirically, and it is symmetric for left and right curves. By contrast, when gaze is directed opposite to the direction of curvature, ϕ(t) does not collapse toward zero, curl remains sustained, and therefore no underestimation of curvature is expected or observed. This pattern, too, is symmetric for left and right curves.
Effects of flow manipulation
Figure 3 shows the reported instant heading for the different flow manipulations. We also show the unaltered condition (yellow-green) again for the sake of comparison. Importantly, when curl is cancelled (cyan), these biases essentially disappear – the perceived path remains much closer to the physical trajectory across gaze directions. Note that the central row (straight-ahead simulated motion) provides the clearest demonstration: the unaltered curl condition shows the previously reported lateral bias, but when curl is cancelled (counteracted) the perceived trajectory becomes near-veridical. This confirms that the mean curl signal itself – not gaze eccentricity per se – is the functional driver of the bias. In the over-cancelled condition (purple), the pattern reverses, with biases in the opposite direction. To maintain the full factorial design (Eccentricity × Flow Manipulation × Heading), we included trials where gaze was centered and straight ahead. Although no natural curl is generated at this zero-eccentricity position, we purposely introduced curl at rotation speeds corresponding to the ‘cancelled’ and ‘over-cancelled’ conditions (simulating random leftward or rightward gaze). As shown in the central panel of Figure 3, this introduced flow induced the predicted biases despite the absence of gaze eccentricity, confirming that the retinal flow pattern, rather than the physical orientation of the eyes, is the primary driver of the perceived heading shift.

Average perceived heading for each gaze condition (columns) and each physical path curvature (rows), separately for the three retinal–flow manipulation conditions: unaltered curl (yellow-green), cancelled curl (cyan), and over-cancelled curl (purple).
Shaded envelopes indicate between-observer variability (95%-CI). The right axis applies to the last column and illustrates the mean 2D displacement between observed and physical paths. This measurement indicates the mean displacement that is required for the observed path to align with the physical one. The displacement is shown for the different flow manipulations (color-coded). The error bars indicate between-observer variability (95%-CI).
The last column in Figure 3 shows the average 2D displacement (in meters) required for the observed paths to align with the physical ones. This measure gives an idea of the similarity between the reported paths and the physically presented trajectories in the experiment. In general, the difference between perceived and physical paths is smaller when the path is straight (central panel of the last column in Figure 3). Consistent with the reported paths in the different flow manipulation conditions, the displacement is smaller in the cancelled curl condition (color-coded) for all headings (different rows). This resulted in a significant quadratic effect of Flow manipulation, β = 0.09, SE = 0.023, t(526) = 3.997, p < .001, indicating that mean deviation was lower in the cancelled condition compared with the unaltered and overcancelled conditions.
Fitting the controller
To assess how well the curl-based controller accounts for reported perceived headings, we fitted the model as described in the Methods. The fit incorporates the mean image curl computed via Equation 1 from the experimental videos, assuming perfect gaze stabilization on the fixation point—a reasonable assumption, given the high quality of fixation recorded. Fits were performed for both individual participants (see Figures S4 to S15 in the SI) and the aggregate data (see Figure 4 and Figure S3 in the SI). As shown in Figure 4, the model with separate parameters captures the average perceived trajectories with remarkable accuracy. The observed instant headings (thin lines) are overlaid with the separate fits per condition (thick red lines). In these separate fits, the mean 2D deviation per step was minimal, ranging from 0.066 m in the cancelled flow condition to 0.080 m in the unaltered condition (see table S1 in the SI). Statistical analysis via ANOVA on the individual fits confirmed that while separate fits yielded a significantly smaller loss than the joint approach (F (1, 55) = 206.27, p < 0.001), flow manipulation itself had only a marginal effect on fit quality (F (2, 55) = 2.69, p = 0.077). Interestingly, the cancelled condition showed a slightly smaller loss than the other two (Quadratic contrast t(55) = 2.356, p = 0.022).
Although the joint fit (using only two parameters for all heading and eccentricity conditions) naturally results in a larger 2D deviation, it effectively captures the core qualitative trends across all conditions: a) the model successfully tracks the near-straight perceived trajectories in the central row when curl is cancelled; b) the joint model accurately reproduces the systematic underestimation of curvature when gaze eccentricity aligns with the direction of the curve; and c) even with constrained parameters, the model accounts for the reversal of bias directions in the over-cancelled condition (purple lines) across different gaze eccentricities. Despite the higher deviation inherent in sharing parameters across conditions, the joint model is statistically superior as indicated by its lower AIC measures (see Table S1). This suggests that a unified curl-based mechanism provides a parsimonious and robust explanation for instant heading perception across diverse gaze and flow conditions.

Separate fits of the controller.
Average perceived heading across participants for each gaze condition (columns) and each physical path curvature (rows), separately for the three retinal–flow manipulation conditions (thinner solid lines): unaltered curl (yellow-green), cancelled curl (cyan), and over-cancelled curl (purple). The red thicker solid lines denote the best fit of the controller. The numbers in each panel indicate the average lateral deviation per step between the fit and the observed heading. Note that for the centered gaze and straight path, only the unaltered flow condition was fitted.
Neural Network Model of Gaze-Contingent Heading Bias
We present a neural model that provides a neurophysiologically plausible implementation of the controller. The model demonstrates that the bias emerges from the interaction between gaze-modulated visual flow processing and a weak straight-ahead prior in the parietal heading representation, implemented through standard cortical connectivity. After presenting the model, we show different dynamic aspects of the model and reproduce the bias for the straight-ahead trajectories at the different gaze eccentricities. The Python source code of the model is available at osf link.
Local Motion Encoding
Visual motion is encoded by direction-selective units analogous to primate MT cortex, with 8 directional preferences:

θd = d (π/4), d = 0, 1, …, 7.

The response of each direction channel at image-plane position pi = (xi, yi) is given by rectified cosine tuning ([·]+ denotes half-wave rectification; Simoncelli and Heeger, 1998):

rd(pi) = ‖vi‖ [cos(∠vi − θd)]+,    (10)
where vi is the local optical flow vector. In our implementation, optic flow vi is computed, as before, using the Farnebäck algorithm as a substitute for early visual processing (V1/MT complex). This provides the motion signals that the visual system would extract through V1 to MT processing. The model itself begins with the input to the 8-sector MT-like directional encoding (Equation 10).
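A sketch of the 8-channel rectified cosine encoding (the exact tuning form is our reading of Equation 10; names are ours):

```python
import numpy as np

N_DIRS = 8
PREF = np.arange(N_DIRS) * (2.0 * np.pi / N_DIRS)  # preferred directions

def mt_responses(v):
    """Rectified cosine tuning: each direction channel responds with
    the local speed times the positive part of the cosine between the
    flow vector v and the channel's preferred direction."""
    speed = np.linalg.norm(v)
    if speed == 0.0:
        return np.zeros(N_DIRS)
    angle = np.arctan2(v[1], v[0])
    return speed * np.maximum(0.0, np.cos(angle - PREF))

r = mt_responses(np.array([1.0, 0.0]))   # purely rightward flow
```

For a purely rightward flow vector, only channels whose preferred direction lies within 90° of rightward respond; the opponent (leftward) channel is silenced by the rectification.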
Curl Computation (MSTd)
To quantify the rotational (curl) component of optic flow around the current gaze position, we project local flow vectors onto the tangential direction of the circle centered at gaze. For each sampled location pi, we first compute its position relative to the gaze point g:

ri = pi − g.

This vector points from the gaze location to the sample point. The direction orthogonal to this vector corresponds to the local tangential direction of rotation around gaze. We obtain this unit tangential vector by rotating ri by 90° and normalizing:

ti = (−ri,y, ri,x) / ‖ri‖.

Finally, the contribution of the optic flow at that point to local rotational motion is given by the projection of the flow vector vi onto this tangential direction:

ci = vi · ti.

Positive values indicate counterclockwise motion around gaze, and negative values indicate clockwise motion. The mean curl is computed as follows:

ω̄ = (1/|S|) Σi∈S ci / ‖ri‖,

where S represents the samples satisfying ‖ri‖ > rmin after trimming outliers (dividing by ‖ri‖ converts the tangential projection into an angular velocity, which is why samples too close to gaze are discarded). In our implementation, Ntotal = 400 samples are drawn within a gaze-centered region.
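The tangential-projection estimate can be sketched as follows (the function name, the synthetic rigid-rotation check, and the specific rmin value are illustrative assumptions):

```python
import numpy as np

def mean_curl_around_gaze(points, flows, gaze, r_min=2.0):
    """Mean rotation around the gaze point: project each flow vector
    onto the unit tangential direction (r rotated by 90 deg) and
    divide by the radius to obtain an angular velocity. Samples
    closer than r_min to gaze are discarded."""
    r = points - gaze                        # vectors gaze -> sample
    radii = np.linalg.norm(r, axis=1)
    keep = radii > r_min
    r, radii, flows = r[keep], radii[keep], flows[keep]
    tang = np.stack([-r[:, 1], r[:, 0]], axis=1) / radii[:, None]
    c = np.sum(flows * tang, axis=1)         # tangential projections
    return float(np.mean(c / radii))

# A rigid rotation about gaze at omega rad/frame should be recovered
# exactly, wherever the 400 random samples fall.
rng = np.random.default_rng(1)
gaze = np.array([50.0, 50.0])
pts = rng.uniform(0.0, 100.0, size=(400, 2))
omega = 0.02
rel = pts - gaze
flows = omega * np.stack([-rel[:, 1], rel[:, 0]], axis=1)
estimate = mean_curl_around_gaze(pts, flows, gaze)
```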
Ring Attractor Dynamics: Parietal Heading Representation
Heading direction θ is represented by a ring attractor mechanism consistent with previous studies (Zhang, 1996), with neural activity x(ϕ, t) at preferred heading ϕ evolving as:

τ ∂x(ϕ, t)/∂t = −x(ϕ, t) + [ ∫ W(ϕ, ϕ′) x(ϕ′, t) dϕ′ + Iext(ϕ, t) ]+.

We implement these dynamics by using a discrete ring of N = 181 units with preferred headings ϕj uniformly spaced between [−ϕmax, ϕmax]. The activity dynamics then follow:

τ dxj/dt = −xj + [ (1/N) Σk W(ϕj, ϕk) xk + Iext(ϕj, t) ]+.

The recurrent connectivity follows a standard Mexican hat profile (Ben-Yishai et al., 1995):

W(ϕj, ϕk) = Ae exp(−d(ϕj, ϕk)²/(2σe²)) − Ai exp(−d(ϕj, ϕk)²/(2σi²)),

where d(ϕj, ϕk) is the circular distance between preferred headings. Ae and Ai were set to 1.9 and 1.0 respectively.
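The Mexican hat connectivity can be sketched as below; Ae and Ai follow the values above, while the widths σe and σi are illustrative assumptions, not the fitted values:

```python
import numpy as np

N = 181
PHI_MAX = np.deg2rad(90.0)
phi = np.linspace(-PHI_MAX, PHI_MAX, N)   # preferred headings (rad)

def mexican_hat(phi, Ae=1.9, Ai=1.0, sigma_e=0.15, sigma_i=0.45):
    """Difference-of-Gaussians recurrent weights over the circular
    distance between preferred headings: local excitation (Ae) minus
    broader surround inhibition (Ai)."""
    d = np.abs(phi[:, None] - phi[None, :])
    d = np.minimum(d, 2.0 * np.pi - d)     # circular distance
    return (Ae * np.exp(-d**2 / (2.0 * sigma_e**2))
            - Ai * np.exp(-d**2 / (2.0 * sigma_i**2)))

W = mexican_hat(phi)
```

Nearby units excite each other (W positive on the diagonal, here Ae − Ai = 0.9), while units about half a radian apart inhibit each other, which is what stabilizes a single activity bump.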
External Inputs
The external input Iext(ϕ, t) consists of two components:

Iext(ϕ, t) = Igaze(ϕ, t) + Iprior(ϕ).

The gaze-modulated inhibitory input is:

Igaze(ϕ, t) = −Kp |ω̄(t)| exp(−(ϕ − ϕg)²/(2σκ²)),

where ϕg is the gaze direction in heading coordinates, Kp is the inhibitory gain scaling the curl signal ω̄(t), and σκ controls the width of the gaze-centered modulation, determining how broadly the inhibition spreads around the gaze direction ϕg.

The straight-ahead prior is:

Iprior(ϕ) = I0 exp(−ϕ²/(2σp²)),

where I0 is the prior strength and σp sets the width of the straight-ahead prior, with smaller values producing a sharper preference for ϕ = 0.
Bias generation as Sensory–Prior Competition
Within this framework, the systematic bias opposite to gaze emerges from the interaction between the gaze-centered inhibitory drive and a weak straight-ahead prior within a stabilizing recurrent Mexican-hat network. A strong prior (I0 ≳ 0.25) keeps the heading estimate near straight-ahead, while a weak prior I0 ≈ 0.03 allows sensory evidence to shift the attractor state. The symmetric recurrent dynamics maintain a coherent and stable activity bump. For leftward gaze (ϕg < 0) and forward motion, the curl-driven inhibition suppresses the portion of the ring around ϕg, pushing the activity bump to the right of straight ahead; for rightward gaze the displacement is mirror-symmetric.
The heading direction is decoded using a population vector readout:

θ̂(t) = atan2( Σj xj sin ϕj , Σj xj cos ϕj ),

where the sums run over the N ring units (N = 181). Unlike some heading perception models (Beintema et al., 2004; Lappe and Rauschecker, 1993), we do not apply sigmoid nonlinearities to neural activities prior to decoding but use a linear readout. The gaze-contingent bias emerges even with linear decoding, suggesting it is a fundamental property of the network dynamics.
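Putting the pieces together, the following minimal simulation shows the bump settling opposite to gaze. All parameter values here are illustrative, and the rectified-linear, input-driven regime of this sketch is a simplification of the full model:

```python
import numpy as np

N = 181
PHI_MAX = np.deg2rad(90.0)
phi = np.linspace(-PHI_MAX, PHI_MAX, N)

# Mexican-hat recurrent weights (illustrative widths).
d = np.abs(phi[:, None] - phi[None, :])
d = np.minimum(d, 2.0 * np.pi - d)
W = 1.9 * np.exp(-d**2 / (2 * 0.15**2)) - 1.0 * np.exp(-d**2 / (2 * 0.45**2))

def external_input(gaze_phi, curl, I0=0.03, sigma_p=0.20,
                   Kp=0.8, sigma_k=0.08):
    """Weak straight-ahead prior plus curl-scaled inhibition
    centered on the gaze direction."""
    prior = I0 * np.exp(-phi**2 / (2 * sigma_p**2))
    inhibition = -Kp * abs(curl) * np.exp(-(phi - gaze_phi)**2
                                          / (2 * sigma_k**2))
    return prior + inhibition

def run_ring(gaze_phi, curl, tau=0.05, dt=0.005, T=2.0):
    """Relax the ring to steady state and decode the heading with a
    population-vector readout."""
    x = np.zeros(N)
    I = external_input(gaze_phi, curl)
    for _ in range(int(T / dt)):
        drive = np.maximum(0.0, W @ x / N + I)  # rectified total drive
        x += dt / tau * (-x + drive)
    return np.arctan2(np.sum(x * np.sin(phi)), np.sum(x * np.cos(phi)))

# Leftward gaze with positive curl: inhibition carves out the left
# flank of the prior-driven bump, so the decoded heading is rightward.
heading = run_ring(gaze_phi=np.deg2rad(-20.0), curl=1.0)
```

With zero curl the input is symmetric and the decoded heading stays at straight ahead; reversing gaze and curl mirrors the bias, matching the pattern in the behavioral data.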
Relation to the controller (phenomenological model)
The neural implementation provides a mechanistic basis for the phenomenological relationship between gaze direction and heading bias captured by the feedback controller: the gaze-centered inhibition plays the role of the controller's curl-derived error signal, while the straight-ahead prior acts as its set point.
Neural simulations
To demonstrate the neurophysiological feasibility of the curl-based controller, we conducted a series of neural network simulations using a custom Python implementation of the recurrent ring network, with subsequent 3D trajectory reconstruction. The model read the video sequences of the experimental straight-path conditions, again assuming perfect gaze stabilization on the fixation point across all trials. As described above, we computed the optic flow for each frame before proceeding with each step of the neural model, from the local motion computation to the linear decoding of the ring activity.
We explored the parameter space to identify the drivers of the heading bias, varying the prior strength (I0 ∈ [0.03, 0.24]), the prior width (σp ∈ [0.18, 0.25]), the spatial extent of gaze inhibition (σκ ∈ {0.02, 0.04, 0.08, 0.12}), and the inhibitory gain (Kp ∈ {0.4, 0.8}). Table S2 in the SI shows the range for the different parameters used in the simulations.
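The sweep itself is a plain grid over the four parameters. A schematic version, where `run_trial` is a hypothetical stand-in for the full flow-to-decoding pipeline (not reproduced here) and the I0 and σp ranges are represented by their endpoints:

```python
from itertools import product

# Parameter grid following the values quoted in the text
I0_vals      = [0.03, 0.24]
sigma_p_vals = [0.18, 0.25]
sigma_k_vals = [0.02, 0.04, 0.08, 0.12]
K_p_vals     = [0.4, 0.8]

def run_trial(I0, sigma_p, sigma_k, K_p):
    """Placeholder for the full pipeline (optic flow -> ring network -> decoded bias)."""
    return 0.0

results = {params: run_trial(*params)
           for params in product(I0_vals, sigma_p_vals, sigma_k_vals, K_p_vals)}
# 2 x 2 x 4 x 2 = 32 parameter combinations
```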
Simulation results
The network dynamics provide a clear mechanistic explanation for the systematic heading bias observed during eccentric gaze. Figure 5 shows the network components for a trial in which the gaze is directed 4 m to the left during forward translation, including the recurrent connectivity following a standard Mexican-hat profile, characterized by local excitation and broader surround inhibition. The activity bump does not settle at the objective straight-ahead (0∘) but stabilizes at a positive (rightward) heading offset, as illustrated in Figure 5. This displacement is driven by the competition shown in the mechanism panel (Figure 5): the gaze-centered inhibitory drive (red dashed line) suppresses the leftward portion of the ring, effectively ‘pushing’ the neural activity bump (black line) away from the gaze location and to the right of the straight-ahead prior (blue dotted line). The phase portrait (Figure 5) confirms this as a stable state: the drift dynamics dθ/dt cross zero with a negative slope at the biased heading, so small perturbations decay back to the shifted fixed point.
Neural network model of gaze-contingent heading bias.
The parameters used to produce these panels are: I0 = 0.03, Kp = 0.4, σp = 0.18, σκ = 0.12. (A) Ring-attractor connectivity showing synaptic weight (W) as a function of relative preferred heading (Δϕ) in pixels, featuring a central excitatory peak and an asymmetric inhibitory surround. While we present results using asymmetric connectivity, no significant differences were observed between asymmetric and symmetric configurations. (B) Heatmap of neural activity across neuron preferred headings (y-axis) over the frame index (x-axis), with a dashed white line indicating the decoded population heading estimate for the trial in which gaze was directed 4 m to the left. (C) Mechanism of sensory–prior competition plotting normalized amplitude against ring position in pixels, illustrating the spatial alignment of the straight-ahead prior (blue dotted line), gaze-centered inhibition (red dashed line), and the resulting neural activity bump (black solid line). (D) Phase portrait showing the change in heading (dθ/dt) versus the heading estimate (θ) in pixels, with a horizontal line at zero marking the convergence of the trajectory toward a stable fixed point. The color bar denotes frame number.
From the neural model heading estimates, we reconstructed the perceived paths in the same way as for the behavioral data. These results suggest that the interplay between sensory-driven inhibition and a stabilizing straight-ahead prior within a standard Mexican-hat recurrent architecture is sufficient to account for the gaze-contingent biases observed in human heading perception.
Discussion
Retinal Curl as a Functional Control Variable Revealed by a Bias
Our findings provide empirical evidence that retinal curl plays a functional role in heading perception. Manipulating the retinal flow to which participants were exposed affects perceived heading in predictable ways. This main finding suggests that curl can be used as a control signal rather than being a “nuisance” component of the optic flow that must be filtered out. While previous work has emphasized the decomposition of flow into translational and rotational components to recover heading (Beintema et al., 2004; Heeger and Jepson, 1992; Longuet-Higgins and Prazdny, 1980), we show that the visual system exploits the curl generated by gaze stabilization to perceive heading. This is consistent with recent findings in real-world walking, where retinal curl magnitude in the fovea specifies body trajectory relative to gaze (Matthis et al., 2022). In this sense, curl alone, combined with extra-retinal information (e.g. rotation of the eye, head, or both relative to the body), provides heading information.
Importantly, the functional nature of the curl signal is demonstrated by our curl-cancellation condition. When the naturally occurring curl was counteracted, the systematic heading bias disappeared. This confirms that the bias is not an artifact of gaze eccentricity itself, but a direct consequence of the underlying flow geometry due to sustained gaze stabilization. In addition, the flow manipulation reveals a direct link between the amount of curl and perceived heading: not only did the bias opposite to fixation disappear when the curl was cancelled, but the bias also re-appeared towards fixation (opposite to the bias in the unaltered-flow condition) when the curl was over-cancelled (see paths in the middle row in Figure 3). This manipulation led to counter-intuitive findings, such as the reported path being closer to the physical/experimental one in the cancelled condition (last column in Figure 3).
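To make the three flow conditions concrete, here is a toy NumPy illustration (not the experiment's real-time pipeline): a linear flow field combining radial expansion with a solid-body rotation, whose rotational part is estimated via the mean curl and then added back at gain 0 (intact), −1 (cancelled), or −2 (over-cancelled).

```python
import numpy as np

n = 65
y, x = np.mgrid[-1:1:n*1j, -1:1:n*1j]          # image coordinates
omega = 0.5                                     # rotation rate (arbitrary units)
u = x - omega * y                               # horizontal flow: expansion + curl
v = y + omega * x                               # vertical flow

def mean_curl(u, v, spacing):
    """Average curl dv/dx - du/dy; equals 2*omega for a solid-body rotation."""
    return (np.gradient(v, spacing, axis=1) - np.gradient(u, spacing, axis=0)).mean()

spacing = 2 / (n - 1)
w = mean_curl(u, v, spacing) / 2                # recovered rotation rate
for k, label in [(0, "intact"), (1, "cancelled"), (2, "over-cancelled")]:
    u_k, v_k = u + k * w * y, v - k * w * x     # remove k times the rotational part
    # k = 1 leaves pure expansion (zero curl); k = 2 reverses the rotation's sign
```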
Perceiving a curved path when exposed to a physically straight path is not new. Several studies have reported heading errors consistent with this misperception in the past (e.g. Royden et al., 1994, 1992; van den Berg and Brenner, 1994). A mathematical analysis based on image point positions was put forward to explain the perceived curvature when exposed to straight paths (Royden, 1994). However, there are large differences between the bias we report and theirs. First, the biases reported in the previously cited studies appear in simulated-rotation conditions and only when the simulated rotation rate is larger than 1º/s; when the simulated rotation rate was smaller than 1º/s, heading was accurate and no different from conditions with actual eye/head rotation (Warren and Hannon, 1990). Neither of these conditions applies in our study: our retinal rotation was contingent on actual gaze rotation, and the rotation rate was always less than 1º/s (see Figure S2 of the SI). Moreover, the stimulus duration in all these studies was shorter than 2–3 seconds, while ours was longer than 10 seconds, and our biases take some time to build up; looking at the time course of the heading responses in our experiment, the biases show up after 3 m or 5 s (see Figure 2). Finally, most previous studies examined heading based on a binary response after the last frame, while we used a continuous response. We believe that these differences lead to the most important difference between our heading bias and the previously reported ones: the bias direction. While our bias is in the direction opposite to fixation, the biases in previous studies were consistent with the direction of fixation caused by the simulated rotation.
Since it was sustained gaze stabilization that allowed us to reveal this bias, our findings are more consistent with a time-varying-signal account of the use of optic flow (Burlingham and Heeger, 2020) than with the mediation of instantaneous flow (Cutting et al., 1992; Li and Cheng, 2011) to recover heading. We encourage the reader to experience this bias firsthand: when walking in an open space while maintaining sustained gaze on a lateral landmark, one may notice a drift in one's trajectory.
Active Steering vs. Heading Recovery
An important distinction in navigation research lies between the passive perception of heading (estimating “where am I going?”) and the active control of steering (adjusting “where do I want to go?”) (Goodridge et al., 2023; Powell et al., 2024; Tuhkanen et al., 2021; Wilkie and Wann, 2003a, 2003b, 2002). Traditional models posit that accurate heading recovery is a prerequisite for steering or controlling navigation, and our experimental task did require heading reports; nevertheless, our results support the view that explicit heading estimation is not strictly necessary for control (Wilkie and Wann, 2003a). Our curl-based controller reinforces this distinction: it can successfully maintain a trajectory by simply nulling the error signal derived from retinal curl, without ever needing to calculate the instantaneous focus of expansion or resolve the ambiguity of the heading point. We reproduced the results of Wilkie and Wann (2003a, experiment 3) of active steering towards six targets at different eccentricities (see Figure S17 and Table S3 in the SI). We introduced the heading/gaze discrepancy at these eccentricities, and by nulling the curl the controller could find solutions very close to the empirical steering paths reported in Wilkie and Wann (2003a). This suggests that the “bias” observed in our perceived headings may reflect the operation of a control law optimized for action rather than a failure of a perceptual system designed for passive estimation.
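The controller logic can be sketched as a toy proportional law (our assumption for this illustration) in which the steering rate nulls a curl-proxy error, here the angle between heading and the gaze line to the fixated target (following the geometry described in Matthis et al., 2022). The plant, gains, and speed are illustrative, not the fitted controller of this paper.

```python
import numpy as np

def simulate(gain=2.0, dt=0.05, T=6.0, speed=1.4):
    """Steer a point walker by nulling a curl-proxy error signal (toy model)."""
    target = np.array([10.0, 2.0])               # fixated landmark (m)
    pos = np.zeros(2)
    heading = 0.0                                # initial walking direction: +x
    traj = [pos.copy()]
    for _ in range(int(T / dt)):
        to_target = target - pos
        gaze = np.arctan2(to_target[1], to_target[0])   # direction of the gaze line
        curl_error = heading - gaze              # proxy for foveal curl sign/magnitude
        heading -= gain * curl_error * dt        # steer so the curl signal is nulled
        pos = pos + speed * dt * np.array([np.cos(heading), np.sin(heading)])
        traj.append(pos.copy())
    return np.array(traj)

traj = simulate()
```

Nulling the curl proxy is equivalent to continuously aligning heading with the gaze line, so the walker homes in on the fixated target without ever computing a focus of expansion.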
This does not necessarily mean that the information used by the participants in Wilkie and Wann (2003a) is the curl signal. They argued that steering can be achieved through a mechanism that combines different sources of information in addition to retinal flow, such as visual direction or perceived target location (Rushton et al., 1998). This is also a possibility in our experiment: heading responses could have been based on the perceived location of our experimental fixation point. However, the different flow-manipulation conditions (cancelled and over-cancelled) were able to completely change the perceived instantaneous heading. While visual direction is undoubtedly a cue, our results indicate that retinal flow dynamics can override it. In our experiments, even when the fixation point remained eccentric (providing a stronger visual-direction cue), cancelling the retinal curl eliminated the bias or induced one in the opposite direction (over-cancelling condition). This runs counter to the possibility of participants using the visual direction of the fixation point. The large field of view in our experiment (>90º) could also have encouraged participants to exploit retinal flow to a larger extent. This suggests that behaviors previously attributed to visual-direction strategies may instead be driven by the minimization of retinal curl, offering a parsimonious account of steering control that unifies fixation and flow processing (Matthis et al., 2022).
Re-evaluating the Focus of Expansion (FOE)
Overall, our results suggest, at least in simulated walking, that when rotation in the retinal flow is real and active, the system treats curl as a signal to be regulated rather than noise to be subtracted. Historically, heading perception theory has been dominated by the extraction and use of the Focus of Expansion (FOE) (Gibson, 1950; Warren et al., 1988; Warren and Hannon, 1990). However, the relevance of the FOE in natural, active vision has been increasingly questioned (Matthis et al., 2022; Muller et al., 2023). In the presence of eye movements, the retinal FOE is rarely aligned with the heading, creating a complex computational problem, known as the rotational problem (Cutting et al., 1992; Koenderink and Van Doorn, 1981; Longuet-Higgins and Prazdny, 1980; Royden et al., 1994).
Although our results support the view that explicit FOE extraction may not be required for controlling navigation or locomotion, this does not mean that the FOE is rendered useless. Instead, the visual system may rely on different flow-field structures, and the FOE, or divergence, can be one of them (Matthis et al., 2022). For example, in more complex scenes, a pseudo-FOE can lead to heading biases (Layton and Fajen, 2016) due to relative motion between objects in the background. The FOE can also be more relevant in high-speed locomotion, in which momentary changes of gaze direction due to head or eye rotations may be less frequent than in walking, leading to a less disrupted head-based flow (Muller et al., 2023). This shift away from FOE-based heading is further supported by recent computational models (Layton and Browning, 2014), in which the continuum of observed cells (from radial to circular) in MSTd simultaneously codes curvature for trajectory estimation and heading across the neural population, with curvature encoded through the spirality of the most active cell and heading through the visuotopic location of its receptive field center.
Neural Implementation of Curl-Based Steering
We propose a relatively simple neural model that reproduces the bias when exposed to straight-ahead video scenes, providing a biologically plausible bridge between the phenomenological model, the controller, and a neural implementation. This circuit model demonstrates that gaze-contingent heading biases can emerge from fundamental principles of sensory integration in cortical networks: standard center–surround (Mexican-hat) recurrent connectivity is sufficient to generate the observed bias, with the inherent push–pull dynamics transforming localized inhibition into a global shift of the activity bump. The key insight is that a weak straight-ahead prior, combined with standard cortical connectivity and gaze-centered inhibition, is sufficient to explain the observed psychophysical biases. The model integrates direction-selective motion processing, analogous to area MSTd, where neurons are known to be sensitive to spiral and curl patterns (Duffy and Wurtz, 1991; Graziano et al., 1994), with a parietal ring-attractor network. Spiral tuning also appears in VIP (Schaafsma and Duysens, 1996) and area 7a (Read and Siegel, 1997), to which MSTd projects (Born and Bradley, 2005).
Conclusion
We conclude that the rotational component of optic flow (curl), generated during gaze stabilization, is an actively used signal to control heading. It acts as a navigational cue rather than noise, as evidenced by the elimination of steering biases when curl is experimentally cancelled. Our findings challenge the necessity of explicitly extracting the Focus of Expansion for online control of locomotion. The observed behaviors are supported by a neural model based on established properties of motion processing areas. The interaction between sensory flow inputs and internal priors within a recurrent network suffices to explain the gaze-contingent biases observed in our experimental data.
Data availability
All the data and code for the analysis are available at https://osf.io/b37rg/overview?view_only=35b748a234b7459a9e9df1b41a2d4a44
Additional files
Additional information
Funding
MEC | Agencia Estatal de Investigación (AEI) (MICIU/AEI/10.13039/501100011)
Kontessa I Zorpala
References
- Self-motion-induced eye movements: Effects on visual acuity and navigation. Nature Reviews Neuroscience 6:966–976. https://doi.org/10.1038/nrn1804
- Estimating heading during real and simulated eye movements. Vision Research 36:431–443. https://doi.org/10.1016/0042-6989(95)00122-0
- Heading detection using motion templates and eye velocity gain fields. Vision Research 38:2155–2179. https://doi.org/10.1016/S0042-6989(97)00428-8
- Circular Receptive Field Structures for Flow Analysis and Heading Detection. In: Vaina LM, Beardsley SA, Rushton SK (Eds.).
- Theory of orientation tuning in visual cortex. Proceedings of the National Academy of Sciences 92:3844–3848. https://doi.org/10.1073/pnas.92.9.3844
- The mechanism of interaction between visual flow and eye velocity signals for heading perception. Neuron 26:747–752. https://doi.org/10.1016/S0896-6273(00)81210-6
- Structure and Function of Visual Area MT. Annual Review of Neuroscience 28:157–189. https://doi.org/10.1146/annurev.neuro.26.041002.131052
- Heading representations in primates are compressed by saccades. Nature Communications 8:920. https://doi.org/10.1038/s41467-017-01021-5
- Mechanisms of Self-Motion Perception. Annual Review of Neuroscience 31:389–410. https://doi.org/10.1146/annurev.neuro.29.051605.112953
- Heading perception depends on time-varying evolution of optic flow. Proceedings of the National Academy of Sciences 117:33161–33169. https://doi.org/10.1073/pnas.2022984117
- Efficient encoding of natural optic flow. Network: Computation in Neural Systems 19:183–212. https://doi.org/10.1080/09548980802368764
- Perception with an eye for motion. Cambridge: MIT Press.
- Wayfinding on foot from information in retinal, not optical, flow. Journal of Experimental Psychology: General 121:41. https://doi.org/10.1037/0096-3445.121.1.41
- Statistics of natural time-varying images. Network: Computation in Neural Systems 6:345–358. https://doi.org/10.1088/0954-898X_6_3_003
- Sensitivity of MST neurons to optic flow stimuli. I. A continuum of response selectivity to large-field stimuli. Journal of Neurophysiology 65:1329–1345. https://doi.org/10.1152/jn.1991.65.6.1329
- Behavioral dynamics of intercepting a moving target. Experimental Brain Research 180:303–319. https://doi.org/10.1007/s00221-007-0859-6
- The perception of the visual world. Houghton Mifflin.
- Computing and visualizing dynamic time warping alignments in R: The dtw package. Journal of Statistical Software 31:1–24. https://doi.org/10.18637/jss.v031.i07
- Fixation could simplify, not complicate, the interpretation of retinal flow. Vision Research 41:815–834. https://doi.org/10.1016/s0042-6989(00)00300-x
- Error accumulation when steering toward curves. Journal of Experimental Psychology: Human Perception and Performance 49:821. https://doi.org/10.1037/xhp0001101
- Tuning of MST neurons to spiral motions. Journal of Neuroscience 14:54–67. https://doi.org/10.1523/JNEUROSCI.14-01-00054.1994
- Subspace methods for recovering rigid motion I: Algorithm and implementation. International Journal of Computer Vision 7:95–117. https://doi.org/10.1007/BF00128130
- Visual selectivity for heading in the macaque ventral intraparietal area. Journal of Neurophysiology 112:2470–2480. https://doi.org/10.1152/jn.00410.2014
- Exterospecific component of the motion parallax field. Journal of the Optical Society of America 71:953–957. https://doi.org/10.1364/JOSA.71.000953
- A model of the combination of optic flow and extraretinal eye movement signals in primate extrastriate visual cortex: Neural model of self-motion from optic flow and extraretinal cues. Neural Networks 11:397–414. https://doi.org/10.1016/s0893-6080(98)00013-6
- A neural network for the processing of optic flow from ego-motion in man and higher mammals. Neural Computation 5:374–391. https://doi.org/10.1162/neco.1993.5.3.374
- A unified model of heading and path perception in primate MSTd. PLoS Computational Biology 10:e1003476. https://doi.org/10.1371/journal.pcbi.1003476
- Sources of bias in the perception of heading in the presence of moving objects: Object-based and border-based discrepancies. Journal of Vision 16:9. https://doi.org/10.1167/16.1.9
- Perceiving path from optic flow. Journal of Vision 11:22. https://doi.org/10.1167/11.1.22
- Retinal flow is sufficient for steering during observer rotation. Psychological Science 13:485–491. https://doi.org/10.1111/1467-9280.00486
- Perception of heading during rotation: Sufficiency of dense motion parallax and reference objects. Vision Research 40:3873–3894. https://doi.org/10.1016/s0042-6989(00)00196-6
- The interpretation of a moving retinal image. Proceedings of the Royal Society of London. Series B. Biological Sciences 208:385–397. https://doi.org/10.1098/rspb.1980.0057
- Retinal optic flow during natural locomotion. PLOS Computational Biology 18:e1009575. https://doi.org/10.1371/journal.pcbi.1009575
- Gaze and the control of foot placement when walking in natural terrain. Current Biology 28:1224–1233. https://doi.org/10.1016/j.cub.2018.03.008
- Retinal motion statistics during natural locomotion. eLife 12:e82410. https://doi.org/10.7554/eLife.82410
- A model of self-motion estimation within primate extrastriate visual cortex. Vision Research 34:2917–2938. https://doi.org/10.1016/0042-6989(94)90060-4
- Coordination of gaze and action during high-speed steering and obstacle avoidance. PLoS ONE 19:e0289855. https://doi.org/10.1371/journal.pone.0289855
- Modulation of responses to optic flow in area 7a by retinotopic and oculomotor cues in monkey. Cerebral Cortex 7:647–661. https://doi.org/10.1093/cercor/7.7.647
- Analysis of misperceived observer motion during simulated eye rotations. Vision Research 34:3215–3222. https://doi.org/10.1016/0042-6989(94)90085-X
- The perception of heading during eye movements. Nature 360:583–585. https://doi.org/10.1038/360583a0
- Estimating heading during eye movements. Vision Research 34:3197–3214. https://doi.org/10.1016/0042-6989(94)90084-1
- Guidance of locomotion on foot uses perceived target location rather than optic flow. Current Biology 8:1191–1194. https://doi.org/10.1016/S0960-9822(07)00492-7
- Neurons in the ventral intraparietal area of awake macaque monkey closely resemble neurons in the dorsal part of the medial superior temporal area in their responses to optic flow patterns. Journal of Neurophysiology 76:4056–4068. https://doi.org/10.1152/jn.1996.76.6.4056
- A model of neuronal responses in visual area MT. Vision Research 38:743–761. https://doi.org/10.1016/S0042-6989(97)00183-1
- Visual anticipation of the future path: Predictive gaze and steering. Journal of Vision 21:25. https://doi.org/10.1167/jov.21.8.25
- Why two eyes are better than one for judgements of heading. Nature 371:700–702. https://doi.org/10.1038/371700a0
- Eye movements and optical flow. Journal of the Optical Society of America A 7:160–169. https://doi.org/10.1364/JOSAA.7.000160
- Direction of self-motion is perceived from optical flow. Nature 336:162–163. https://doi.org/10.1038/336162a0
- Perception of translational heading from optical flow. Journal of Experimental Psychology: Human Perception and Performance 14:646. https://doi.org/10.1037/0096-1523.14.4.646
- Controlling steering and judging heading: Retinal flow, visual direction, and extraretinal information. Journal of Experimental Psychology: Human Perception & Performance 29:363–378. https://doi.org/10.1037/0096-1523.29.2.363
- Eye-movements aid the control of locomotion. Journal of Vision 3:3. https://doi.org/10.1167/3.11.3
- Driving as night falls: The contribution of retinal flow and visual direction to the control of steering. Current Biology 12:2014–2017. https://doi.org/10.1016/S0960-9822(02)01337-4
- Representation of spatial orientation by the intrinsic dynamics of the head-direction cell ensemble: A theory. Journal of Neuroscience 16:2112–2126. https://doi.org/10.1523/JNEUROSCI.16-06-02112.1996
Article and author information
Author information
Version history
- Preprint posted:
- Sent for peer review:
- Reviewed Preprint version 1:
Cite all versions
You can cite all versions using the DOI https://doi.org/10.7554/eLife.110770. This DOI represents all versions, and will always resolve to the latest one.
Copyright
© 2026, Kontessa I Zorpala & Joan López-Moliner
This article is distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use and redistribution provided that the original author and source are credited.