11:00 - 12:30
Parallel sessions 8
Room: HSZ - N5
Chairs:
Fritz Günther, Markus Kiefer
Embodied and grounded cognition approaches have remained enduring focal points in cognitive psychology. Although subtle differences between them are sometimes postulated, both approaches converge on the assumption that cognition is essentially based on a reinstatement of processes of perception, action and introspection. How exactly the symbols that are central to our higher cognition and communication, such as linguistic forms and abstracted mental representations, obtain their meaning from sensorimotor experience remains one of the open challenges in this line of research. Experimental-behavioural studies have produced important insights here. At the same time, we are reaching the theoretical and empirical limits of this approach:
On the one hand, research on embodied cognition often lacks the formalisation, quantification, and precision required to make theoretically substantive advances – a gap to be filled with computational modelling. Here, recent work has brought forward large-scale data-driven representation models built from different data sources, such as language and vision. These allow us to operationalize exactly to what extent information from different modalities of experience shapes our semantic representations, and to investigate their specific influences on cognitive processes.
On the other hand, theories of embodied cognition ultimately result in claims about processes in specific cognitive systems (shared between higher cognition and sensorimotor or introspective processing), which are hard to evaluate with purely behavioural approaches and instead require neuroscientific methods. These include electrophysiological methods with high temporal precision as well as neuroimaging methods with high spatial resolution; together, these techniques allow us to precisely map the neural processes that underpin higher cognition.
In this symposium, we bring together recent advances integrating computational and neuroscientific approaches to embodiment research: computational models yield precise predictions at the system level, which in turn can be tested with neuroscientific methods. The presentations in this symposium highlight the advantages of this interplay for various fields of embodied cognition, such as language, memory and semantics.
Submission 667
Exploring Large Semantic Spaces of Neural Word Representations
SymposiumTalk-02
Presented by: Ilaria Appel
Ilaria Appel 1, 2, Emma Angela Montecchiari 2, Marco Zanon 2, Davide Crepaldi 2, 3
1 Humboldt-University, Berlin, Germany
2 International School for Advanced Studies (SISSA), Italy
3 University of Pavia, Italy
The nature of our conceptual representations – how they are grounded in our experience of the world – remains unclear. Comparing neural semantic maps with vector space models promises to offer insight; yet, large-scale studies are lacking.

We addressed this issue by building large neural spaces for 1,080 abstract and concrete words. We collected EEG data from 40 participants in a Rapid Serial Visual Presentation (RSVP) paradigm and computed all pairwise distances (N=582,660) for each electrode (N=128) and time point. We compared these neural spaces with both language- and image-based models (Word2Vec vs. ViSpa) to examine how visual and linguistic information integrates across time and space.
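The comparison of neural and model-based semantic spaces described above follows the general logic of representational similarity analysis. The following is a minimal sketch of that logic with simulated data – all variable names, sizes, and distance metrics here are illustrative assumptions, not the authors' actual pipeline: build a pairwise-distance vector from neural patterns at one electrode/time point, build another from word embeddings, and rank-correlate the two.

```python
# Minimal representational-similarity sketch with SIMULATED data
# (hypothetical choices throughout; not the authors' actual pipeline).
import numpy as np
from scipy.spatial.distance import pdist
from scipy.stats import spearmanr

rng = np.random.default_rng(0)
n_words, n_features, n_dims = 12, 20, 50

# Simulated single-time-point EEG patterns (words x features) and
# word embeddings (words x dimensions), e.g. from a language or image model.
eeg_patterns = rng.normal(size=(n_words, n_features))
embeddings = rng.normal(size=(n_words, n_dims))

# Condensed pairwise-distance vectors: n_words*(n_words-1)/2 entries each
# (for 1,080 words this would give the 582,660 pairs reported in the study).
neural_dists = pdist(eeg_patterns, metric="correlation")
model_dists = pdist(embeddings, metric="cosine")

# Rank-correlate the two spaces; repeating this per electrode and time point
# yields a spatiotemporal map of model-brain correspondence.
rho, p = spearmanr(neural_dists, model_dists)
print(f"rho = {rho:.3f}")
```

Repeating the final correlation step over electrodes and time points is what makes it possible to ask where and when a given model (language- vs. image-based) best explains the neural space.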

Initial results from five Regions of Interest (ROIs) along the ventral stream show that image-based models account best for the neural maps. Notably, semantic coding seems to arise already in the occipital pole, early after word presentation (~100 ms). This coding extends to the left Anterior Temporal Pole/Inferior Frontal ROI at later time points (~350 ms), particularly for abstract words. Overall, these results suggest that image-based semantic models effectively capture the structure of the neural conceptual system in the ventral stream, pointing to a perceptually grounded representation of semantics that emerges at very early processing stages. More generally, these results fit within the framework of embodied and modal cognition, highlighting the central role of perceptual information in shaping our conceptual system.