Multisensory cues for walking in virtual reality: Humans combine conflicting visual and self-motion information to reproduce distances
Wed-A6-Talk VI-03
Presented by: Karl Kopiske
When humans walk, it is important for them to have some measure of the distance they have travelled. Typically, many cues from different modalities are available, as humans perceive both the environment around them (for example, through vision and haptics) and their own walking. Here, we investigate the contribution of visual cues and non-visual self-motion cues to distance reproduction during treadmill walking through a virtual environment, separately manipulating the speed of the treadmill belt and the speed of the virtual environment. Using mobile eye tracking, we also investigate how participants sampled visual information through gaze. We show that, as predicted, both modalities affected how participants (N = 28) reproduced a distance. Participants weighted self-motion cues more strongly than visual cues, consistent with the relative reliabilities of the two cues, but with some inter-individual variability. Participants who looked more towards the parts of the visual scene that contained cues to speed and distance also tended to weight visual information more strongly, although participants generally directed their gaze towards visually informative areas of the scene less than expected. As measured by motion capture, participants adjusted their gait patterns to the treadmill speed but not to the walked distance. In sum, we show in a naturalistic virtual environment how humans use different sensory modalities when reproducing distances, and how the use of these cues differs between participants and depends on information sampling.
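Note on the weighting claim: the statement that cue weights corresponded to cue reliabilities can be read in terms of the standard reliability-weighted (maximum-likelihood) cue-combination account, assuming independent Gaussian noise on each cue. The abstract does not specify a model, so the following is only an illustrative sketch with generic symbols, where d-hat_self and d-hat_vis denote hypothetical distance estimates from self-motion and vision with variances sigma^2_self and sigma^2_vis:

\[
\hat{d} = w_{\mathrm{self}}\,\hat{d}_{\mathrm{self}} + w_{\mathrm{vis}}\,\hat{d}_{\mathrm{vis}},
\qquad
w_{\mathrm{self}} = \frac{1/\sigma^2_{\mathrm{self}}}{1/\sigma^2_{\mathrm{self}} + 1/\sigma^2_{\mathrm{vis}}},
\qquad
w_{\mathrm{vis}} = 1 - w_{\mathrm{self}},
\]
\[
\sigma^2_{\hat{d}} = \left(\frac{1}{\sigma^2_{\mathrm{self}}} + \frac{1}{\sigma^2_{\mathrm{vis}}}\right)^{-1}.
\]

Under this account, a larger weight on self-motion cues would follow from those cues having the lower variance (higher reliability) in this setting.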
Keywords: locomotion, perception and action, eye movements, distance perception, virtual reality