Causal Inference in Visuotactile Temporal Judgements
Mon—HZ_13—Talks2—1603
Presented by: Bora Celebi
We integrate or segregate signals received from different senses according to their spatial and temporal relations. When signals are close in time and location, they are more likely to be integrated. The Bayesian causal inference framework provides a robust theoretical model for understanding these processes: it posits that the brain integrates sensory signals when they are perceived as arising from a common cause and segregates them otherwise. Thus far, the model has been applied predominantly to localization judgements. Here, we studied it in the context of temporal judgements. In two experiments, we investigated (1) the role of spatial congruency in visuotactile simultaneity judgements, and (2) how tactile signals influence visual duration judgements when the two are temporally synchronous or not. Participants were immersed in a VR environment in which tactile stimuli were delivered to their real hands, while the corresponding visual stimuli were presented on a virtual representation of their hands, enabled through hand tracking. The temporal binding window widened with spatial congruency, possibly reflecting an enhanced integration process. Further, perceived visual durations were shifted towards the tactile interval when the two signals were temporally synchronized. The Bayesian causal inference framework successfully modeled the effects of spatial congruency and temporal proximity on these multisensory temporal judgements.
Keywords: multisensory integration, time perception, visuotactile, causal inference
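The causal inference computation the abstract refers to can be illustrated with a minimal sketch of the standard Bayesian causal inference model for two cues (here labeled visual and tactile): the observer weighs the likelihood that both noisy measurements arose from one common cause against the likelihood of two independent causes, then combines the fused and segregated estimates by model averaging. All parameter values below (noise levels, prior width, prior probability of a common cause) are illustrative assumptions, not the fitted values from this study.

```python
import math

def gauss(x, mu, sigma):
    """Gaussian probability density."""
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

def bci_estimate(x_v, x_t, sigma_v, sigma_t, mu_p=0.0, sigma_p=20.0, p_common=0.5):
    """Bayesian causal inference over two noisy cues (e.g. visual and tactile).

    Returns (posterior probability of a common cause, model-averaged estimate
    of the visual quantity). Parameters are illustrative, not fitted values.
    """
    var_v, var_t, var_p = sigma_v**2, sigma_t**2, sigma_p**2

    # Likelihood of both measurements under a common cause (C = 1),
    # marginalizing over the latent stimulus s ~ N(mu_p, sigma_p).
    denom = var_v * var_t + var_v * var_p + var_t * var_p
    like_c1 = math.exp(-0.5 * ((x_v - x_t) ** 2 * var_p
                               + (x_v - mu_p) ** 2 * var_t
                               + (x_t - mu_p) ** 2 * var_v) / denom) \
        / (2 * math.pi * math.sqrt(denom))

    # Likelihood under two independent causes (C = 2).
    like_c2 = (gauss(x_v, mu_p, math.sqrt(var_v + var_p))
               * gauss(x_t, mu_p, math.sqrt(var_t + var_p)))

    # Posterior probability that the cues share a common cause.
    post_c1 = like_c1 * p_common / (like_c1 * p_common + like_c2 * (1 - p_common))

    # Reliability-weighted fusion (if C = 1) vs. vision-only estimate (if C = 2),
    # combined by model averaging.
    s_fused = ((x_v / var_v + x_t / var_t + mu_p / var_p)
               / (1 / var_v + 1 / var_t + 1 / var_p))
    s_visual = (x_v / var_v + mu_p / var_p) / (1 / var_v + 1 / var_p)
    return post_c1, post_c1 * s_fused + (1 - post_c1) * s_visual
```

Under this sketch, nearby signals yield a high common-cause posterior and a percept pulled toward the other modality, while discrepant signals yield a low posterior and near-segregated estimates, which is the qualitative pattern the abstract reports for synchronous versus asynchronous visuotactile intervals.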