
Languages' transmogrifier: The role of motor cortex in processing observed iconic co-speech gestures

Abstract

Introduction

Brain regions for producing actions are also involved in processing observed co-speech gestures (Skipper et al., 2009). The question remains as to what role those regions play. We suggest that motor cortex first simulates the production of observed hand and arm movements. Because gestures typically begin before their lexical affiliates, simulation would allow the prediction of semantically associated motor programs and their attendant words. This would conserve metabolic resources because presented words would not need to be processed as fully when predictions are confirmed. Thus, we hypothesize that observed gestures that iconically represent actions will 1) activate regions involved in the production of hand and arm movements, 2) activate regions involved in the production of the actions portrayed by those gestures, and 3) result in a decrease in brain activity when the associated words occur. For example, a kicking gesture accompanying “I kicked the ball” should activate hand and arm, then foot and leg, motor regions prior to the word “kicked”, followed by a reduction of activity in language regions when “kicked” is presented.

Methods

Participants (N=22) underwent 256-channel electroencephalography while watching clips of an actress, filmed from neck to waist, producing sentences. Co-speech gestures in these clips portrayed actions done with the hands and arms (Hand-as-Hands-Gestures), the feet and legs (Hand-as-Feet-Gestures), or neither (Filler-Gestures). Audio-only clips were created by removing the video track from the gesture clips and pairing the audio with still images of the actress with her hands at her sides. Data were sampled at 250 Hz, band-pass filtered at 1-40 Hz, and source localized using a BEM forward model and an sLORETA inverse model. For each participant, activity was averaged over the time windows corresponding to the hypotheses: the window from the end of the gesture preparation phase to the onset of the associated auditory information (Pre-Target-Word-Gesture) and the window spanning the target word itself (Target-Words). Group paired t-tests (ps < .05, corrected) were visualized on a template brain (Tadel et al., 2011).
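The analyses were carried out in Brainstorm (Tadel et al., 2011); purely as an illustration, a comparable pipeline can be sketched in Python with MNE-Python, covering the band-pass filter, a BEM forward model, an sLORETA inverse solution, time-window averaging, and group paired t-tests. File names, event codes, epoch limits, and the FDR correction below are assumptions made for the sketch, not the study's actual parameters.

```python
# Illustrative sketch (not the study's Brainstorm pipeline): band-pass
# filtering, a BEM forward model on the fsaverage template, an sLORETA
# inverse solution, time-window averaging, and a corrected group paired
# t-test. File names, events, and window limits are hypothetical.
import os
import mne
from mne.minimum_norm import make_inverse_operator, apply_inverse
from mne.stats import fdr_correction
from scipy.stats import ttest_rel

# Template anatomy for source localization (downloads fsaverage if needed).
fs_dir = mne.datasets.fetch_fsaverage()
subjects_dir = os.path.dirname(fs_dir)


def subject_window_means(raw_fname, events, event_id, window):
    """Window-averaged sLORETA source amplitudes for one participant/condition."""
    raw = mne.io.read_raw_fif(raw_fname, preload=True)  # file format assumed
    raw.resample(250)                                    # 250 Hz sampling
    raw.filter(l_freq=1.0, h_freq=40.0)                  # 1-40 Hz band-pass

    epochs = mne.Epochs(raw, events, event_id, tmin=-0.2, tmax=2.0,
                        baseline=(None, 0), preload=True)

    # Three-layer BEM forward model on the fsaverage template.
    src = mne.setup_source_space('fsaverage', spacing='oct6',
                                 subjects_dir=subjects_dir, add_dist=False)
    bem = mne.make_bem_solution(
        mne.make_bem_model('fsaverage', ico=4,
                           conductivity=(0.3, 0.006, 0.3),
                           subjects_dir=subjects_dir))
    fwd = mne.make_forward_solution(epochs.info, trans='fsaverage',
                                    src=src, bem=bem, eeg=True, meg=False)

    # sLORETA inverse solution applied to the evoked response.
    noise_cov = mne.compute_covariance(epochs, tmax=0.0)
    inv = make_inverse_operator(epochs.info, fwd, noise_cov)
    stc = apply_inverse(epochs.average(), inv, lambda2=1.0 / 9.0,
                        method='sLORETA')

    # Average over the hypothesis-driven window, e.g. Pre-Target-Word-Gesture
    # or Target-Words, given in seconds as (tmin, tmax).
    return stc.copy().crop(*window).data.mean(axis=1)


def group_paired_test(cond_a, cond_b, alpha=0.05):
    """Paired t-test across participants; cond_* are (n_subjects, n_sources)."""
    t_vals, p_vals = ttest_rel(cond_a, cond_b, axis=0)
    reject, p_corr = fdr_correction(p_vals, alpha=alpha)  # correction assumed
    return t_vals, p_corr, reject
```

The abstract does not specify the multiple-comparisons correction, so the FDR step above is only a stand-in; the statistical maps themselves were visualized on a template brain in Brainstorm.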

Results

All co-speech gesture clips produced greater activity than audio-only clips in more dorsal and medial (PMd) and mid-ventral and lateral (PMv) aspects of the precentral gyrus and sulcus, the central sulcus, and anterior superior temporal cortex (STa). Hand-as-Hands-Gestures produced greater activity than Hand-as-Feet-Gestures in the PMv during the Pre-Target-Word-Gesture period. Conversely, Hand-as-Feet-Gestures produced greater activity than Hand-as-Hands-Gestures in the PMd (Figure 1). These results held when activity for Filler-Gestures was subtracted out. In contrast, during Target-Words, audio-only clips produced more activity than gesture clips in, among other areas, the STa, posterior superior temporal cortex, and pars opercularis and triangularis.
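The Filler-Gesture control amounts to a paired contrast of difference scores; purely as an illustration (assuming per-participant, window-averaged source amplitudes such as those returned by the sketch in the Methods section), that comparison could be written as:

```python
# Hypothetical paired contrast of difference scores:
# (Hand-as-Hands - Filler) vs (Hand-as-Feet - Filler), per source vertex.
# Inputs are arrays of shape (n_subjects, n_sources); the FDR step is an
# assumed stand-in for the abstract's unspecified correction.
from scipy.stats import ttest_rel
from mne.stats import fdr_correction

def filler_corrected_contrast(hands, feet, filler, alpha=0.05):
    t_vals, p_vals = ttest_rel(hands - filler, feet - filler, axis=0)
    reject, p_corr = fdr_correction(p_vals, alpha=alpha)
    return t_vals, p_corr, reject
```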

Conclusions

Results support the motor simulation and prediction account of the role of motor cortex in processing observed gestures. Specifically, activity patterns suggest that participants veridically simulated the observed movements. This, in turn, appears to activate motor programs associated with the semantic affiliates of those movements, as demonstrated by motor somatotopy for gestures with hand (PMv) and foot (PMd) meanings. Results support a predictive mechanism in that motor activity preceded the words corresponding to gestures and was followed by a reduction in activity when those words occurred. The latter reduction may work as follows: motor cortex simulates kicking from a gesture portraying kicking and pre-activates the word “kicking” such that, when the “k” sound arrives in auditory cortex, the prediction is confirmed and “icking” need not be processed (to the same extent). Thus, through these motor mechanisms, we do, in some sense, actually hear co-speech gestures during real-world communication, and the result is a significant freeing of metabolic resources.

References

Skipper, J.I., Goldin-Meadow, S., Nusbaum, H.C., & Small, S.L. (2009), ‘Gestures orchestrate brain networks for language understanding’, Current Biology, 19(8), pp. 661–667.

Tadel, F., Baillet, S., Mosher, J.C., Pantazis, D., & Leahy, R.M. (2011), ‘Brainstorm: A User-Friendly Application for MEG/EEG Analysis’, Computational Intelligence and Neuroscience, 2011, Article ID 879716, pp. 1–13.