16:30 - 18:00
Wed—HZ_10—Talks9—97
Room: HZ_10
Chair: Seung-Goo Kim
Representational Gradients of Emotion-relevant Musical Information in the Human Cerebral Cortex
Wed—HZ_10—Talks9—9705
Presented by: Seung-Goo Kim
Seung-Goo Kim 1*, Tobias Overath 2, Daniela Sammler 1
1 Research Group Neurocognition of Music and Language, Max Planck Institute for Empirical Aesthetics; 2 Department of Psychology and Neuroscience, Duke University
Music often evokes strong emotions. Yet, how musical auditory representations are abstracted in the brain and how these different representations contribute to the emergence of felt emotions remains poorly understood. Recent work suggests that pre-trained audio convolutional neural networks (CNNs) can capture information in real-world music that is relevant to felt emotions and neural activity in the medial prefrontal cortex (Kim et al., 2023). Here, we explored (i) whether increasingly abstract representations of music in different layers of the CNN are encoded along a well-established functional gradient—from unimodal sensory to transmodal associative regions (Margulies et al., 2016), and (ii) how layer-specific CNN embeddings predict human behavioral ratings of musical emotions.

We analyzed the fMRI dataset of Sachs et al. (2020), in which 37 participants listened to one ‘happy’ and two ‘sad’ instrumental musical pieces. We found a marked correspondence between the CNN layer-specific representational gradient of musical information and the gradient of intrinsic functional connectivity of the cortex (Margulies et al., 2016), suggesting that the transformation of the auditory signal along the cortical hierarchy, beyond the auditory systems, may involve an abstraction mechanism similar to the one the CNN implements (cf. Giordano et al., 2023). We also found distinct encoding patterns of the ‘Emotionality’ and ‘Enjoyment’ ratings across CNN layers, suggesting that basic and aesthetic emotional experiences may depend on different abstraction levels of the audio signal represented along the cortical gradient. Overall, the current analyses may open new ways to better understand the multi-layered mechanisms underlying musical emotions (Juslin, 2013).
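The layer-wise encoding approach described above can be sketched, in minimal form, as fitting a separate regularized linear model per CNN layer and comparing how well each layer's embeddings predict a behavioral rating. This is an illustrative assumption-laden sketch (closed-form ridge regression, in-sample fit, synthetic shapes), not the authors' actual pipeline:

```python
import numpy as np

def layerwise_ridge_r2(layer_embeddings, ratings, alpha=1.0):
    """For each CNN layer, fit ridge regression from that layer's
    embeddings to behavioral ratings and return in-sample R^2 per layer.

    layer_embeddings : list of arrays, each (n_stimuli, n_features)
    ratings          : array (n_stimuli,), e.g. 'Emotionality' scores
    alpha            : ridge penalty (hypothetical default)
    """
    scores = []
    yc = ratings - ratings.mean()
    for X in layer_embeddings:
        Xc = X - X.mean(axis=0)
        # Closed-form ridge: w = (X^T X + alpha I)^{-1} X^T y
        w = np.linalg.solve(Xc.T @ Xc + alpha * np.eye(Xc.shape[1]),
                            Xc.T @ yc)
        pred = Xc @ w
        ss_res = np.sum((yc - pred) ** 2)
        ss_tot = np.sum(yc ** 2)
        scores.append(1.0 - ss_res / ss_tot)
    return np.array(scores)
```

Comparing the resulting per-layer scores across rating types would mimic the paper's contrast of ‘Emotionality’ versus ‘Enjoyment’ encoding profiles; a real analysis would of course use cross-validated prediction rather than in-sample fit.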


Keywords: Computational Neuroscience, Emotions, Functional MRI, Music, Machine Learning, Modeling, Perception