Submission 480
Neural Time Frames: How Electrophysiological Rhythms Shape Speech Processing
SymposiumTalk-01
Presented by: Jule Nabrotzky
During speech processing, the brain must handle incoming streams of information across multiple time scales, from low rates (e.g., phrases, ~1 Hz) through intermediate rates (e.g., syllables, ~4 Hz) to very high rates (e.g., phonemes, ~30 Hz). The brain’s electrophysiological activity may support the tracking of both acoustic and abstract information at these time scales by aligning its endogenous oscillations to the rate of the incoming input. Such a mechanism could aid language processing by generating predictions about the typical duration of linguistic units at each time scale. In this talk, I will present studies from our group that provide evidence for this theory across different time scales. At the level of phrases, acoustically cued prosodic phrase length supports the processing of upcoming sentences: oscillatory activity at the cued frequency is sustained even after acoustic stimulation has ended. Furthermore, studies from our lab suggest that rhythmicity at the syllable rate may facilitate coordination in communicative interaction by accelerating turn-taking. In these ways, the synchronization of neural oscillations may constitute an adaptive mechanism for efficiently processing spoken language.