Cross-modal semantic influences on visual search efficiency: Semantically congruent auditory primes guide visual attention
Tue—HZ_11—Talks6—6402
Presented by: Timea Folyi
When searching for visual targets in complex and cluttered scenes, sounds can be highly effective in conveying target-related information due to the unique characteristics of auditory perception. Accordingly, several lines of evidence have demonstrated that task-irrelevant sounds can benefit visual search when they are meaningfully related to the search target, yielding a reaction-time benefit. The present study aimed to investigate the processes underlying this benefit. In three experiments varying stimulus complexity and the number of potential targets, participants were presented with auditory primes that were semantically congruent, neutral, or incongruent with the visual search target. Importantly, we varied the set size of the search displays, which allowed us to test the influence of priming on visual search slopes. Priming effects in a visual search context might be explained by post-search processes (e.g., facilitation of target encoding; McNamara, 2013), which would predict a set-size-independent priming effect. Alternatively, auditory priming might serve as a source of guidance for visual attention toward the primed target (e.g., Wolfe, 2021), leading to higher search efficiency with congruent priming. Visual search was more efficient when the task-irrelevant sound was congruent rather than incongruent with the target, as indicated by flatter search slopes. These results support the view that semantically congruent auditory primes can guide visual attention; the potential mechanisms influencing attentional guidance are discussed in terms of multiple-target search. The findings contribute to our understanding of how auditory cues can rapidly and involuntarily influence visual search behavior.
Keywords: attention, visual search, priming, cross-modal