Binding Music: Integration of two-tone chords into event files
Mon-HS2-Talk II-04
Presented by: Katrin Köllnberger
The ability to perceive an object as one coherent representation relies on binding processes between its features. Empirically, such binding processes can be measured via partial repetition costs, a performance pattern of faster reaction times when all features of a given object repeat or all of them switch, compared to when only one feature repeats (or switches). Feature binding has been demonstrated for a large number of features in the visual and auditory domains. The purpose of the present two experiments was to investigate whether such binding effects can also be found in the domain of music. More precisely, we aimed to examine whether the tones of a two-tone chord are temporarily integrated into a musical event file. In the first experiment, we applied a pitch classification task. The auditory stimulus consisted of two simultaneous tones (one of two possible upper tones differing in pitch, and one of two possible lower tones differing in pitch). Participants responded with a left or right keypress to the pitch of the upper tone. The two-tone chord was always consonant. The lower tone was task-irrelevant but likewise varied between a lower and a higher pitch. Analyses of reaction times and error rates revealed partial repetition costs indicating binding: performance was better when both tones repeated or both alternated than in partial repetitions (only the upper or only the lower tone repeated). The results thus show that two consonant tones are integrated into one event file. In a second experiment, we found that this also holds true for dissonant harmonies.
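As a point of reference (the abstract does not specify the exact index), partial repetition costs in such 2 x 2 designs are commonly quantified as the interaction contrast between repetition of the relevant (upper) tone and repetition of the irrelevant (lower) tone across consecutive trials, for example:

\[
\mathrm{PRC} = \bigl( RT_{\text{upper rep, lower switch}} + RT_{\text{upper switch, lower rep}} \bigr) - \bigl( RT_{\text{upper rep, lower rep}} + RT_{\text{upper switch, lower switch}} \bigr)
\]

Under this convention, positive values indicate that the two tones were bound into a common event file.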
Keywords: feature binding, partial repetition costs, music perception, binding and retrieval, action control