Word-Driven Cerebro-Acoustic Coherence in a Natural Language Comprehension Task
Wed—Casino_1.811—Poster3—8901
Presented by: Jannika Hollmann
We investigated the effects of word lexicality and word duration on neural tracking, measured via cerebro-acoustic coherence. We reanalyzed a publicly available dataset (Brennan, 2023) in which participants listened to an English audiobook recording of the first chapter of Alice in Wonderland while brain activity was recorded via EEG. We examined cerebro-acoustic coherence separately for all spoken words, content words (open class), and function words (closed class). To account for the effects of word duration on neural tracking, we used the highest density regions (HDR; 30%, 60%, 90%) of the word-duration distributions for all words, content words, and function words. Each HDR was transformed into a bandpass filter, which was applied after EEG preprocessing. Cerebro-acoustic coherence was then calculated separately for all words, content words, and function words, and for each respective filter. We found significant cerebro-acoustic coherence for content words in both the high-theta and low-theta bands, whereas for function words it was significant exclusively in the high-theta band. The theta rhythm, previously assumed to track primarily syllables, therefore appears to track words as well, suggesting that neural tracking may reflect attention to more than one unit of speech.
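To illustrate the general idea (not the authors' exact pipeline), the sketch below shows how a word-duration HDR could be mapped onto a frequency band, used to bandpass-filter an EEG channel and the speech amplitude envelope, and followed by magnitude-squared coherence as a proxy for cerebro-acoustic coherence. All signals, the sampling rate, and the HDR bounds are hypothetical placeholders; the actual study used the preprocessed Brennan (2023) EEG data.

```python
import numpy as np
from scipy.signal import butter, filtfilt, hilbert, coherence

fs = 500.0                      # assumed common sampling rate (Hz) for EEG and envelope
rng = np.random.default_rng(0)

# Toy stand-ins for preprocessed data (placeholders, not the Brennan dataset)
n_samples = int(60 * fs)                        # one minute of signal
speech = rng.standard_normal(n_samples)         # placeholder audio waveform
eeg = rng.standard_normal(n_samples)            # placeholder single EEG channel

# Speech amplitude envelope via the Hilbert transform
envelope = np.abs(hilbert(speech))

# Hypothetical HDR of word durations (seconds) mapped to a frequency band (Hz):
# e.g., an HDR spanning 0.15-0.60 s corresponds to roughly 1.7-6.7 Hz (1 / duration).
hdr_low_s, hdr_high_s = 0.15, 0.60
f_lo, f_hi = 1.0 / hdr_high_s, 1.0 / hdr_low_s

# Band-pass both signals to the word-rate band derived from the HDR
b, a = butter(4, [f_lo, f_hi], btype="bandpass", fs=fs)
eeg_band = filtfilt(b, a, eeg)
env_band = filtfilt(b, a, envelope)

# Magnitude-squared coherence between band-limited EEG and speech envelope
freqs, cxy = coherence(eeg_band, env_band, fs=fs, nperseg=int(4 * fs))

band_mask = (freqs >= f_lo) & (freqs <= f_hi)
print(f"Mean coherence in {f_lo:.1f}-{f_hi:.1f} Hz band: {cxy[band_mask].mean():.3f}")
```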
Brennan, J. R. (2023). EEG datasets for naturalistic listening to "Alice in Wonderland" (Version 2) [Data set]. University of Michigan - Deep Blue Data. https://doi.org/10.7302/Z29C6VNH
Keywords: neural tracking, speech perception, word type, coherence, EEG