Submission 482
The Temporal Dynamics of Art-Style Perception
SymposiumTalk-01
Presented by: Philipp Flieger
Looking at a photograph or van Gogh’s rendition of a night sky, we have no problem telling style apart from content. Yet how the brain represents style and content has not been comprehensively investigated. Here, we created stylized renditions of 20 natural objects and scenes in seven artistic styles (3D-render, cubism, expressionism, drawing, pixel art, renaissance, watercolor) using Stable Diffusion XL. In an EEG experiment, participants (N = 25) viewed the stylized images and the original photographs used to create them while performing an orthogonal target detection task. Representational similarity analysis revealed robust neural representations of both art style and content. As expected, we observed higher pairwise decoding accuracies between visually distinct art styles (e.g., 3D-render vs. expressionism) than between more similar ones. Moreover, neural representations of art style peaked earlier (~100 ms post-stimulus) than representations of content (~150–200 ms post-stimulus), suggesting that rapidly emerging perceptual codes differentiate art styles. By assessing confusions between styles and contents, we quantified whether neural responses were dominated by style or content. Specifically, we iteratively regressed hierarchical deep neural network features out of the neural data and assessed whether the residual variance from the layer-wise regressions captured more style-specific or content-specific information. Our results suggest that whether style or content dominates the neural response depends on hierarchical features that vary with the abstractness of the style. In summary, we present evidence for distinct representations of art style and content, with style representations emerging early and with representational formats that differ between more figurative and more abstract styles.