15:00 - 16:30
Wed-A7-Talk VII-
Room: A7
Chair/s:
Maximilian Achim Friehs
Mouse movements on-screen are an alternative to gaze in VR
Wed-A7-Talk VII-03
Presented by: Erwan David
Erwan David, Melissa Vo
Scene Grammar Lab, Department of Psychology, Goethe University Frankfurt, 60323 Frankfurt am Main, Germany
What if we could save on instruments, material and time by studying gaze during high-order cognitive tasks via mouse movements on screen? We ran visual search tasks in interactive 3D-modelled indoor rooms. Search targets were placed either outside or inside containers (e.g., fridge, cupboard) in 3D real-world scenes. In two phases, participants first searched for outside-only targets, then for a mix of inside and outside targets. One group of participants accomplished the task in VR; another group went through it online on their personal computer, controlling the camera view with their mouse and keyboard. We sampled eye-in-space data in one case and mouse movements in the other. The novelty of our approach is processing mouse data the way gaze data are usually processed: we used a gaze-parsing algorithm based on velocity and relative angles to identify pseudo-fixations and pseudo-saccades. Interestingly, the pseudo-gaze mimicked the dynamics of real saccades and fixations (e.g., main sequence, distribution shapes). Behaviorally, we observed comparable results regarding visual search phases (initiation, scanning, and verification) and interactions with the environment. Effects related to searching for inside vs. outside objects were equally apparent in mouse and gaze data. We believe that the same high-order gaze dynamics, such as exploring and analyzing the scenes, translate to mouse movements. Our aim is not for experimenters to forgo eye tracking in favor of mouse tracking; rather, we wish to promote more in-depth analysis of another source of data - possibly as complex as gaze - that can easily be implemented in on-screen (online) protocols.
Keywords: data analysis, mouse, on-screen, gaze, virtual reality
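The abstract describes parsing mouse trajectories with a gaze-style algorithm based on velocity and relative angles. The authors' exact algorithm is not given here, so the following is a minimal sketch of the simpler, widely used velocity-threshold (I-VT-style) variant applied to an on-screen mouse trace; the function name, the 50 deg/s threshold, and the synthetic data layout are all illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def parse_velocity(x, y, t, vel_thresh=50.0):
    """Split a 2D trace (positions in deg, timestamps in s) into
    pseudo-fixations and pseudo-saccades with a velocity threshold
    (I-VT-style sketch; threshold value is an assumption)."""
    # Point-to-point velocity in deg/s between consecutive samples.
    vel = np.hypot(np.diff(x), np.diff(y)) / np.diff(t)
    # Label each sample; the first sample inherits "fixation" by default.
    is_sacc = np.concatenate([[False], vel > vel_thresh])
    # Merge runs of identically labelled samples into events.
    events, start = [], 0
    for i in range(1, len(is_sacc)):
        if is_sacc[i] != is_sacc[start]:
            events.append(("saccade" if is_sacc[start] else "fixation",
                           t[start], t[i]))
            start = i
    events.append(("saccade" if is_sacc[start] else "fixation",
                   t[start], t[-1]))
    return events

# Synthetic trace: hold still, sweep ~111 deg/s, hold still again.
t = np.arange(100) * 0.01
x = np.concatenate([np.zeros(50), np.linspace(0, 10, 10), np.full(40, 10.0)])
y = np.zeros(100)
events = parse_velocity(x, y, t)
```

On this synthetic trace the parser yields a fixation, a saccade, and a second fixation, which is the event stream the abstract's downstream measures (main sequence, search phases) would be computed from. The published algorithm additionally uses relative angles between movement vectors, which this velocity-only sketch omits.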