Individual Differences in Internal Models Determine Scene Perception
Wed-P12-Poster III-102
Presented by: Gongting Wang
How does our brain effortlessly and reliably make sense of the visual world despite its complexity? Predictive processing theories posit that visual inputs are compared to internal models of the world. On this view, natural perception is efficient because of our accurate and detailed internal models of the environment. Despite the importance of such internal models, it remains unclear what exactly they contain and how they differ across people. In this study, we used drawing to assess participant-specific, unconstrained descriptions of internal models. Specifically, we asked participants to draw typical versions of different natural scenes, such as a kitchen or living room. At the group level, the object composition within the drawings was well described by the occurrence frequency of objects in a large set of natural scene photographs, as well as by the objects’ conceptual distance to the scene category in a distributional semantics model. However, individual drawings varied substantially between people. To test how these individual differences relate to differences in scene perception, we probed categorization of 3D-rendered scenes. These renders were carefully constructed to match the drawings while controlling for differences in drawing style and ability. In the subsequent categorization tasks, participants were asked to report the category of briefly presented scene renders. Categorization performance was better for renders designed to be similar to participants’ own drawings (and thus their internal models) than for renders based on other people’s drawings. This finding suggests that individual differences in internal models explain individual differences in scene perception.
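As a minimal illustration of the conceptual-distance measure mentioned above: in a distributional semantics model, an object’s distance to a scene category can be computed as the cosine distance between their word vectors. The abstract does not specify the embedding model, so the vectors, words, and values below are hypothetical placeholders, not the study’s actual pipeline.

```python
import numpy as np

# Toy word vectors -- purely hypothetical values for illustration.
# A real analysis would use pretrained embeddings (e.g., GloVe or word2vec).
EMBEDDINGS = {
    "kitchen": np.array([0.9, 0.1, 0.3]),
    "stove":   np.array([0.8, 0.2, 0.4]),
    "sofa":    np.array([0.1, 0.9, 0.2]),
}

def conceptual_distance(obj: str, scene: str) -> float:
    """Cosine distance between an object word and a scene-category word."""
    a, b = EMBEDDINGS[obj], EMBEDDINGS[scene]
    cos_sim = float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))
    return 1.0 - cos_sim

# Objects closer in meaning to the scene category yield smaller distances.
print(conceptual_distance("stove", "kitchen"))  # small distance (~0.02 here)
print(conceptual_distance("sofa", "kitchen"))   # larger distance (~0.73 here)
```

Under this kind of measure, an object composition that matches a scene’s typical contents (e.g., a stove in a kitchen) scores as conceptually close, which is the sense in which the drawings were “well described” by conceptual distance.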
Keywords: individual differences, internal models, visual perception, scene recognition, categorization, prediction