16:30 - 18:00
Parallel sessions 3
Room: HSZ - N5
Chair/s: Elisabeth Hein
In order to perceive and meaningfully interact with the world around us, our sensory systems need to interpret the incoming information. This interpretation process is well illustrated by illusions. With some illusions we perceive very different things in one and the same input, as in the famous Necker cube or “The dress”, which can be seen as blue and black or as white and gold. Other illusions make us perceive colors where there are none, as in the watercolor illusion, or cause-and-effect relationships and animacy from simple moving dots. Illusions are therefore a wonderful tool for understanding more about how perception works. In this symposium, we will look at this question using a variety of experimental methods and very different illusions in order to learn more about aspects of perception ranging from auditory motion perception to robotic vision. In particular, in the first talk Meike Kriegeskorte and colleagues will use auditory apparent motion to investigate which factors influence how object correspondence is established, i.e., how object identity is perceived despite changes in location over time. In the second talk Shalila Freitag and colleagues will discuss EEG correlates of perceptual (un-)certainty and the role of stimulus predictability when participants observe stimuli with varying degrees of ambiguity/visibility (Necker lattices and smiley faces). In the third talk Ben Sommer and colleagues will investigate perceived causality in a paradigm in which a disc can be perceived either as launching another disc or as passing across it. In particular, they use visual adaptation to examine the influence of a launch or pass context on an ambiguous display. In the fourth talk Vebjørn Ekroll will use examples of magic tricks built around the illusion of absence, which work better than one would expect based on the method of the trick and on how perception works. In the last talk Aravind Rao Battaje and colleagues will present work on whether robotic perceptual models can predict population-level and individual human responses to visual illusions, using the examples of the fill-in color aftereffect and silencing by motion.
Submission 362
Feature-Dependent Perception of Auditory Apparent Motion
SymposiumTalk-01
Presented by: Meike Kriegeskorte
Meike Kriegeskorte, Bettina Rolke, Elisabeth Hein
University of Tübingen, Germany
A crucial ability of our cognition is the perception of objects and their motion. We can perceive objects as moving by connecting them across space and time, even when they are not continuously present, such as when they are occluded. Apparent motion is an illusion in which, given the right spatiotemporal distance, stationary flashes are perceived as moving from one location to another. This phenomenon exists not only visually but also auditorily. Examining the factors that influence apparent motion helps us to understand how objects are connected across space and time. In this study we used the Ternus display, an ambiguous apparent motion display that can be perceived either as two stimuli moving uniformly to the right (group motion) or as one stimulus moving across the stationary center stimulus (element motion), depending on how the objects are connected over time. In the visual modality, the percept is influenced by the inter-stimulus interval (ISI) and by the stimulus features. Previous research has shown that the Ternus effect also occurs auditorily and depends on the ISI, suggesting that correspondence mechanisms may work similarly across modalities. To test this idea further, we investigated whether the auditory Ternus effect depends on stimulus features by creating frequency-based or timbre-based biases. These biases were compatible either with the element motion or with the group motion percept. Our results showed an influence of this feature bias in addition to an ISI effect, suggesting that the visual and the auditory modalities use similar mechanisms to connect objects across space and time.