How pointing gestures inform visual search
Wed-Main hall - Z2b-Poster 3-8803
Presented by: Oliver Herbort
People often point to refer to distal objects. In such situations, an observer of the gesture needs to search the vicinity of the pointed-at position for a matching object. We tested the hypothesis that observers encode the uncertainty associated with the perception of a pointing gesture to constrain their search for the target to a metrically defined region. Consequently, we predicted that the density of the search display would affect response times but not the size of the fixated area. In a series of three VR experiments, participants were asked to search for a target shape at which a virtual person pointed. In Experiment 1, the target was always present and faithfully indicated by the pointer, and the predictions were mostly confirmed. In Experiments 2 and 3, unbeknownst to the participants, the target was sometimes absent, but participants were allowed to cancel the search (Experiment 2) or to indicate that the pointer was not pointing at the target (Experiment 3). In the target-absent trials of these experiments, the size of the searched area was smaller when the search display was denser. The experiments reveal three important aspects of how pointing informs visual search. First, pointing guides attention to specific locations on a search display. Second, pointing constrains visual search to a subset of the search display. Third, the size of this subset does not directly correspond to any uncertainty that may have been associated with the pointing gesture.
Keywords: Pointing, Gesture, Visual Search, Uncertainty, Communication