Integrative processing in artificial and biological vision predicts the perceived beauty of natural scenes.
Mon-H4-Talk 3-3201
Presented by: Sanjeev Nara
Previous research strongly suggests that whether we assign beauty to natural images is already decided during perceptual analysis. However, it remains largely unclear which perceptual computations give rise to the perception of beauty. Theories of processing fluency suggest that the ease with which a stimulus is processed determines its perceived beauty. Here, we tested whether perceived beauty is related to the amount of spatial integration across an image, a perceptual computation that reduces processing demands by aggregating image elements into more efficient representations of whole images. We reasoned that increasing amounts of integration reduce processing demands in the visual system and thereby increase perceived beauty. We quantified integrative processing in an artificial deep neural network model of vision: we compared activations between parts of an image and the whole image, quantifying the degree of integration as the deviation between activations for the whole image and for its constituent parts. This quantification of integration predicted the beauty ratings assigned to images across four studies that featured different images and task demands. Complementary fMRI recordings revealed that integration in human visual cortex predicts perceived beauty in a way similar to integration in artificial neural networks. Together, our results establish integration as a computational principle that eases perceptual analysis and thereby predisposes the perception of beauty.
Keywords: Integration, Neuroaesthetics, DNNs, fMRI
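The following is a minimal sketch of how a part-versus-whole integration measure of this kind could be computed in practice. The choice of network (VGG16), layer, part scheme (image quadrants), and distance metric (1 minus correlation) are illustrative assumptions for exposition, not the authors' exact pipeline.

# Sketch: quantify "integration" as the deviation between a DNN's activations
# for a whole image and the combined activations for its parts.
# Network, layer, part scheme, and distance metric are illustrative assumptions.
import torch
import torchvision.models as models
import torchvision.transforms as T
from PIL import Image

device = "cuda" if torch.cuda.is_available() else "cpu"
model = models.vgg16(weights=models.VGG16_Weights.DEFAULT).features.to(device).eval()

preprocess = T.Compose([
    T.Resize((224, 224)),
    T.ToTensor(),
    T.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225]),
])

def activations(img: Image.Image) -> torch.Tensor:
    """Flattened activations of the final convolutional block."""
    x = preprocess(img).unsqueeze(0).to(device)
    with torch.no_grad():
        return model(x).flatten()

def quadrants(img: Image.Image):
    """Split an image into four non-overlapping parts."""
    w, h = img.size
    boxes = [(0, 0, w // 2, h // 2), (w // 2, 0, w, h // 2),
             (0, h // 2, w // 2, h), (w // 2, h // 2, w, h)]
    return [img.crop(b) for b in boxes]

def integration_score(img: Image.Image) -> float:
    """1 - correlation between whole-image activations and the mean of
    part activations: larger deviation = more integrative processing."""
    whole = activations(img)
    parts = torch.stack([activations(p) for p in quadrants(img)]).mean(dim=0)
    r = torch.corrcoef(torch.stack([whole, parts]))[0, 1]
    return float(1.0 - r)

if __name__ == "__main__":
    score = integration_score(Image.open("scene.jpg").convert("RGB"))
    print(f"integration score: {score:.3f}")

Under this reading, an image whose whole-image representation is well predicted by its parts yields a low score (little integration), whereas strong part-whole deviation yields a high score, which the abstract relates to higher beauty ratings.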