Is Perceiving Speech Perceiving Gestures? The Role of the Motor System in Phoneme Perception

Viktoria Groiss

Language involves not only speaking, but also perceiving what is spoken.

Traditionally, these two processes have been attributed to different areas of the brain. However, the sounds we produce when speaking and the sounds we process when listening are inextricably linked, if only because we hear our own voice while we speak. Today, the two processes are therefore no longer thought to rely on strictly separate brain areas, but rather on potentially overlapping networks.

In the study “Is Perceiving Speech Perceiving Gestures?”, we investigated whether our articulatory organs (lips, tongue, etc.) and the associated brain areas play a role not only in the production but also in the perception of individual speech sounds (“phonemes”). For this purpose, we asked participants to hold a wooden stick in their mouth and press it upwards with their tongue, so that the tongue position roughly corresponded to the articulation of “ü”. The participants were then presented with sounds lying between two typical speech sounds (e.g. between “i” and “ü”) and pressed a button to indicate which of the two they perceived. From these data, we were able to calculate a “phoneme boundary”, i.e. the point at which perception shifts from one speech sound to the other.
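The summary does not specify how the phoneme boundary was computed from the button presses; a common approach is to fit a logistic psychometric function to the categorization responses and take the point where the probability of either response crosses 50%. The following is a minimal sketch under that assumption; the function names, stimulus steps and response proportions are purely illustrative, not data from the study.

```python
# Illustrative sketch (not the study's actual analysis pipeline):
# estimate a phoneme boundary by fitting a logistic psychometric
# function to categorization responses along an i-ü continuum.

import numpy as np
from scipy.optimize import curve_fit

def logistic(x, boundary, slope):
    """Probability of an 'ü' response at continuum position x."""
    return 1.0 / (1.0 + np.exp(-slope * (x - boundary)))

# Hypothetical data: 7 stimulus steps from a clear "i" (0) to a clear "ü" (6),
# with the proportion of "ü" responses observed at each step.
steps = np.arange(7)
prop_ue = np.array([0.00, 0.05, 0.15, 0.45, 0.80, 0.95, 1.00])

# Fit the curve; the 'boundary' parameter is the position where the
# probability of an "ü" response crosses 50% -- the phoneme boundary.
(boundary, slope), _ = curve_fit(logistic, steps, prop_ue, p0=[3.0, 1.0])

print(f"Estimated phoneme boundary: step {boundary:.2f} on the i-ü continuum")
```

Comparing boundaries estimated this way across conditions (with the wooden stick, without it, or with another kind of distraction) is then a matter of standard statistical testing on the fitted boundary values.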

Comparing these phoneme boundaries with those measured without any distraction, or with a different kind of distraction, revealed no significant difference. However, participants' reaction times were longer when they had to hold the wooden stick in their mouths. It therefore seems unlikely that the articulatory areas of the brain play a noticeable role in speech perception. It is conceivable, however, that engaging them interferes with the perception process enough to prolong processing time.