Submission 509
Language-Driven Spatial Memory: Evidence from Distributional Semantics
Posterwall-37
Presented by: Oksana Tsaregorodtseva
Research suggests that memory performance depends on spatial distances, either between memorized objects and salient landmarks or between the participant and the object, indicating that people flexibly rely on available spatial cues (Nett et al., 2025). Another line of research demonstrates that people draw on distributional semantic knowledge about space, as captured by Distributional Semantic Models (DSMs), when evaluating distances between cities (Gatti et al., 2024) or spatial relations between objects (Louwerse & Jeuniaux, 2010). Recent evidence also suggests that spatial information derived from words lacking explicit spatial meaning can influence motor behavior and spatial intuitions (Tsaregorodtseva, Rinaldi, & Marelli, 2025). The present study combines these lines of research to test whether participants use language-derived spatial information when retrieving object locations from memory. The stimuli are objects whose linguistic labels are not overtly spatial but exhibit distributional associations with spatial words (e.g., hammer and its association along the Ahead–Behind, Close–Far, North–South, and East–West axes). We employ the Garden Game (Nett et al., 2025), a laptop-based virtual-reality task for investigating spatial memory. In each trial, participants navigate a virtual garden, memorize two labeled objects, and later indicate their positions on either an allocentric or an egocentric map. We predict better memory performance (faster response times and higher accuracy) when linguistic information aligns with spatial cues in the environment, that is, when a word linguistically associated with North or Ahead corresponds to the object's position from the allocentric or egocentric perspective, respectively. Such a result would demonstrate that language-derived spatial information contributes to spatial memory retrieval.
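The distributional association mentioned above (e.g., hammer and the Ahead–Behind axis) is typically quantified by comparing a word's vector with the vectors of the two spatial poles in a DSM. A minimal sketch of such a bias score, using toy hand-made vectors rather than a real trained model (the vectors, dimensionality, and words here are purely illustrative assumptions, not the study's actual materials):

```python
import math

def cosine(u, v):
    # Standard cosine similarity between two equal-length vectors
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)

def spatial_bias(word_vec, pole_a_vec, pole_b_vec):
    # Positive score: the word is distributionally closer to pole A
    # (e.g., "ahead"); negative: closer to pole B (e.g., "behind").
    return cosine(word_vec, pole_a_vec) - cosine(word_vec, pole_b_vec)

# Toy 4-dimensional vectors (illustrative only; real DSMs use
# hundreds of dimensions learned from large corpora)
vectors = {
    "hammer": [0.9, 0.1, 0.4, 0.2],
    "ahead":  [0.8, 0.2, 0.3, 0.1],
    "behind": [0.1, 0.9, 0.2, 0.7],
}

bias = spatial_bias(vectors["hammer"], vectors["ahead"], vectors["behind"])
print(f"hammer Ahead-Behind bias: {bias:+.3f}")  # positive -> leans "ahead"
```

With real embeddings, the same difference-of-cosines logic would rank stimulus words along each spatial axis before assigning them to positions in the virtual garden.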