Gesture and cognitive load in simultaneous interpreting: A pilot study
Yuetao Ren & Jianhua Wang
Abstract
This paper explores the relationship between gesture and cognitive load in simultaneous interpreting (SI). To this end, we set up a remote interpreting setting for data collection. Thirteen master’s student interpreters performed two SI tasks, one in a video condition and the other in an audio condition. We analyzed their gestural behavior and disfluency patterns, as well as the correlation and temporal relationship between gestures and disfluencies. We found that interpreters gestured more in the task with the higher cognitive load (audio interpreting), although the differences in disfluency rate and gesture rate between the two conditions were not significant. Even though the correlation between gesture and cognitive load was not significant, all the gestures in the study were produced in parallel with or adjacent to processing difficulties. We conclude that gestures could be an embodied manifestation of the cognitive processes of SI and of the ‘exported load’. Furthermore, the function of each gesture type varies under cognitive load: silent gestures (beats and metaphorics) may reflect the interpreter’s use of strategies, while the production of semantically related gestures (deictics and iconics) may be influenced by cognitive load. The results contribute to the understanding of SI as an embodied, multimodal cognitive activity.
Keywords
Simultaneous interpreting, gesture, cognitive load, disfluency, multimodality