Locomotion: How vision shapes the paths we choose

A mathematical model can predict the path walkers take through a rugged landscape, including the tendency of people to avoid paths that are too steep, even if it means going farther.
  1. Arthur D Kuo, corresponding author
  1. Faculty of Kinesiology and Department of Biomedical Engineering, University of Calgary, Canada

When walking, our brain needs to make decisions, such as where to step and which path to follow. Visual inputs are central to this process, as they allow us to continuously estimate our motion and orientation, as well as regulate our direction and speed (Warren et al., 2001).

This continuous regulation can be explained using mathematical models that rely on vector calculus (Matthis et al., 2022). However, walking also involves discrete choices – such as selecting which available foothold to use, or turning left or right – that are guided by visual features of the path ahead, such as the rockiness and steepness of the terrain (Matthis et al., 2018). While it may seem obvious that people avoid rugged paths when easier alternatives exist, there has been no mathematical model to explain this behavior. Now, in eLife, Mary Hayhoe and colleagues – including Karl Muller as first author – report the results of a study that sheds new light on how these decisions are made (Muller et al., 2024).

The researchers analyzed data from previous studies in which they had tracked the eye and body movements of nine individuals walking across natural terrains (Matthis et al., 2022; Bonnen et al., 2021). Participants wore eye-tracking cameras that captured where they were looking within the surrounding physical landscape. Muller et al. used these data to create a three-dimensional computational reconstruction of the terrains, which they aligned with the body and eye movements of the participants. This allowed them to generate egocentric visual depth images (estimates of how the retinas of individuals detect the depth of the terrain), and to determine where participants placed their feet.

The researchers – who are based at the University of Texas at Austin, Indiana University and Northeastern University – then used these visual depth images and foothold locations to train a neural network to predict where to step in the reconstructed terrain. The trained model predicted where the human participants placed their feet better than chance, suggesting that decisions about where to step are influenced by visual depth cues. This agrees with prior work showing that reduced depth cues cause walkers to shift their gaze to closer footholds, perhaps so they can devote more attention to resolving depth ambiguities (Bonnen et al., 2021).
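The study itself trained a deep network on real depth images and measured footholds. Purely to illustrate the underlying idea – predicting foothold suitability from depth statistics – the sketch below uses synthetic "depth patches", hand-picked features and a logistic-regression stand-in for the network; none of these choices come from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for egocentric depth patches: each 8x8 patch is a
# small depth map, and flatter patches (low depth spread) make better
# footholds. The roughness scale is hidden from the classifier.
n = 500
rough = rng.uniform(0.05, 1.0, size=n)                 # hidden per-patch roughness
patches = rng.normal(scale=rough[:, None, None], size=(n, 8, 8))
labels = (rough < 0.5).astype(float)                   # 1 = suitable foothold

# Two simple visual features a network might learn: overall depth spread,
# and mean local depth change between neighboring pixels.
feats = np.stack([patches.std(axis=(1, 2)),
                  np.abs(np.diff(patches, axis=2)).mean(axis=(1, 2))], axis=1)
feats = (feats - feats.mean(0)) / feats.std(0)         # normalize features

# A logistic-regression "network" trained by gradient descent.
w, b = np.zeros(2), 0.0
for _ in range(500):
    p = 1.0 / (1.0 + np.exp(-(feats @ w + b)))         # predicted probability
    grad = p - labels                                  # logistic-loss gradient
    w -= 0.1 * feats.T @ grad / n
    b -= 0.1 * grad.mean()

acc = ((p > 0.5) == labels).mean()
print(f"training accuracy: {acc:.2f}")                 # well above the 0.5 chance level
```

Even this toy version recovers foothold labels far better than chance, which is the sense in which the paper's (much richer) model was evaluated.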

Next, Muller et al. used their model to investigate whether walkers chose flatter, more winding paths over direct, steeper alternatives that require more energy to cross due to the uneven terrain (Darici and Kuo, 2023; Kowalsky et al., 2021). The team compared two metrics: tortuosity, the ratio between the length of a path and the direct distance between its start and end; and a dimensionless measure of ruggedness, equal to the average angle of the up and down steps within a path. Both metrics were computed for the actual paths the walkers took, and for simulated, alternative routes they might have taken through the same terrain.
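Both metrics are straightforward to compute from a sequence of three-dimensional foothold coordinates. The sketch below is illustrative only – the function names and toy path are invented here, and the ruggedness formula (mean absolute step angle) is a paraphrase of the paper's definition, not the authors' code:

```python
import numpy as np

def tortuosity(path):
    """Ratio of total path length to the straight-line start-to-end distance."""
    steps = np.diff(path, axis=0)                     # successive step vectors
    path_length = np.linalg.norm(steps, axis=1).sum()
    direct = np.linalg.norm(path[-1] - path[0])
    return path_length / direct

def ruggedness(path):
    """Mean absolute slope angle (radians) of the steps along a path."""
    steps = np.diff(path, axis=0)
    horiz = np.linalg.norm(steps[:, :2], axis=1)      # horizontal displacement
    vert = np.abs(steps[:, 2])                        # up/down displacement
    return np.mean(np.arctan2(vert, horiz))

# A toy path: x, y, z coordinates (metres) of successive footholds.
path = np.array([[0.0, 0.0, 0.0],
                 [1.0, 0.5, 0.2],
                 [2.0, 0.0, 0.0],
                 [3.0, 0.5, 0.3],
                 [4.0, 0.0, 0.0]])

print(tortuosity(path))   # > 1: the path is longer than the straight line
print(ruggedness(path))   # mean step angle, in radians
```

A perfectly straight, level path would score a tortuosity of 1 and a ruggedness of 0; detours raise the first number, while steep up-and-down steps raise the second.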

Muller et al. found that the tortuosity of the paths chosen by the human walkers correlated with the ruggedness of the most direct simulated path through the terrain: the steeper and more rugged the direct path, the more likely walkers were to detour and take a longer, flatter route. This suggests a trade-off whereby walkers judge paths with high tortuosity to be less energetically costly than straighter but more rugged ones. Muller et al. also discovered that this trade-off was stronger for shorter individuals, possibly because steep steps are more difficult for them, though this effect warrants further study (as discussed in the public reviews of the article).

Studying locomotion in natural settings is challenging, as terrains and visual features are harder to manipulate or measure compared to controlled laboratory environments. The methods created by Muller et al. open up new avenues for understanding decision-making during locomotion. However, there is much about the trade-off proposed in the study that remains unknown.

For example, controlled experiments are needed to confirm whether the ruggedness of the simulated direct paths matches that experienced in real life. In addition, the human eye likely perceives other aspects of the terrain beyond depth, and it remains to be seen whether other visual cues help evaluate different structural properties of the landscape, such as friction and the hardness of footholds. Related issues in natural locomotion, such as energy expenditure, also merit further investigation. For instance, the energetic costs of tortuous versus rugged paths still need quantification, as do temporal costs, which are suspected to govern ecological behavior (Carlisle and Kuo, 2023).

The approach taken by Muller et al. could also be used to study people in human-designed environments, such as pedestrians crossing busy roads (do people jaywalk rather than use pedestrian crossings to save energy, time, or other costs?). Whether navigating natural landscapes or cityscapes, humans constantly create their own paths according to a variety of physical and perceptual criteria.

References

Article and author information

Author details

  1. Arthur D Kuo

    Arthur D Kuo is in the Faculty of Kinesiology and Department of Biomedical Engineering, University of Calgary, Calgary, Canada

    For correspondence
    arthurdkuo@gmail.com
    Competing interests
    No competing interests declared
ORCID: 0000-0001-5233-9709


Copyright

© 2024, Kuo

This article is distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use and redistribution provided that the original author and source are credited.


Cite this article

Arthur D Kuo (2024) Locomotion: How vision shapes the paths we choose. eLife 13:e104965. https://doi.org/10.7554/eLife.104965