How does the fridge’s bzzz influence your search for milk? The effect of anchor object sound on visual search
Mon—Casino_1.811—Poster1—2111
Presented by: Yuri Markov
We search for items in complex environments every day. Previous studies have shown that search behavior in real-world scenes follows scene grammar rules, i.e., statistical regularities learned from the typical placement of objects in a scene (Võ, 2019). For example, a sponge is often found near a sink. Large, stationary objects, referred to as anchors in the scene grammar framework, tend to predict the identity and location of smaller objects. However, our environment contains not only visual but also auditory information. A previous study found no effect of the target's sound on visual search in real-world scenes (Seidel et al., 2018). In the present study, we investigated whether the sound of anchor objects can help observers locate target objects. We presented an anchor-object sound for 5 seconds, which was either consistent (the sound of an anchor object present in the scene) or inconsistent (the sound of an anchor object absent from the scene). After the sound, a preview of the scene was presented for 1 second, followed by the target name for 1.5 seconds. The scene containing the target was then shown, but only a small area around the mouse cursor was visible. Participants were instructed to move the mouse to find the target and click on it. Response times (RTs) for correct responses were faster in the consistent than in the inconsistent condition. We propose that the sounds of anchor objects help participants navigate complex, realistic environments, making it easier to locate the target.
Keywords: visual search, scene grammar, multisensory integration