11:00 - 12:30
Parallel sessions 8
Room: HSZ - N3
Chair/s:
Barbara Kaup, David Dignath
This symposium examines the interplay between linguistic and non-linguistic cognition. While some cognitive functions appear to depend on language, others seem largely independent of it, and many more integrate both aspects. In psychology, however, the distinction between linguistic and non-linguistic cognition is rarely made explicit, and there is currently no consensus on how language may shape, enable, or constrain thought.

The symposium brings together perspectives from cognitive research, developmental psychology and animal cognition to address three questions:

(1) How are language and thought related?
(2) Which cognitive functions are inherently linguistic, and which are not?
(3) To what extent can language modulate domains traditionally considered non-linguistic?

Part 1 of the double symposium brings together comparative and ontogenetic perspectives, focusing on animal cognition and human development. (see detailed description there)

Part 2 adopts a cognitive psychology perspective. First, Carolin Dudschig examines common mechanisms in linguistic and non-linguistic processing by means of electrophysiological investigations. Rasha Abdel-Rahman's contribution addresses the question of whether language influences the formation of visual representations. Senne Braem investigates how semantic knowledge guides learning of new tasks. Tally Miller tests the influence of verbal labels on the categorization of musical stimuli. Finally, Günther Knoblich discusses the role of linguistic and non-linguistic cognition in joint action.
At the end of the double symposium, philosopher Hong Yu Wong will integrate these diverse perspectives in a concluding discussion, aiming to clarify when, whether, and how cognition harnesses the faculty of language.
Submission 710
Harnessing Semantic Networks for Efficient Task Learning
SymposiumTalk-03
Presented by: Senne Braem
Senne Braem, Mina Habibi
Department of Experimental Psychology, Ghent University, Belgium
Humans are remarkably efficient at learning new tasks, in large part because they integrate previously learned knowledge. However, most research on cognitive control is conducted in experiments using abstract, non-tangible stimuli that do not draw on participants' existing semantic knowledge. Here, I will present a series of experiments where we demonstrate how existing, semantically rich distinctions allow for more robust learning of novel task information. Specifically, through both behavioral analyses and the fitting of neural network models, we show how pre-existing semantic structures may be particularly helpful for creating more separated task representations that are more resistant to catastrophic forgetting and can be repurposed for other tasks. Next, I will show how spatial arrangement tasks allow us to tap into pre-existing semantic structures, and how individual differences in sensitivity to semantic dimensions predict the learning and embedding of entirely new tasks. Finally, I will demonstrate how this paradigm can be leveraged to test contemporary theories on the development of cognitive control in prepubescent children. We believe our work contributes to a more integrative understanding of how cognitive control functions are learned and develop through interactions with semantic cognition during novel task learning.