Submission 663
Predictive Gestures: An EEG Study of Multimodal Affirmation and Negation
SymposiumTalk-04
Presented by: Samuel Sonntag
Human face-to-face communication encompasses far more than spoken words; it involves a rich combination of multimodal signals such as speech, gestures, and facial expressions. Despite the ubiquity of multimodality in communication, the precise role that gestures play in shaping understanding remains to be fully determined. Motivated by experimental evidence suggesting that multimodal information can facilitate comprehension under increased processing demands, we conducted an initial study using an experimental paradigm adapted from basic conflict tasks in cognitive psychology. We manipulated the compatibility of gestural (head shakes/nods and thumbs up/down) and verbal (yes/no) information and investigated whether negation, a linguistic universal known to elicit increased processing demands, would specifically benefit from the presence of multimodal information. Interestingly, across several behavioral experiments we did not find convincing evidence to support this multimodality-for-compensation account. On the contrary, affirmative information appeared to benefit more from the presence of multimodality than negative information. In follow-up studies, we aim to further delineate the integration of multimodal information, with particular emphasis on the function and role of gestures in speech processing and their contribution to predictive processing. The results of these studies, including electrophysiological evidence, will be discussed.