The role of semantically related gestures in the language comprehension of simultaneous interpreters in noise
(2024) In Language, Cognition and Neuroscience 39(5). pp. 584–608
- Abstract
- Manual co-speech gestures can facilitate language comprehension, especially in adverse listening conditions. However, we do not know whether gestures influence simultaneous interpreters’ language comprehension in adverse listening conditions, and if so, whether this influence is modulated by interpreting experience, or by active simultaneous interpreting (SI). We exposed 24 interpreters and 24 bilinguals without interpreting experience to utterances with semantically related gestures, semantically unrelated gestures, or without gestures while engaging in comprehension (interpreters and bilinguals) or in SI (interpreters only). Tasks were administered in clear and noisy speech. Accuracy and reaction time were measured, and participants’ gaze was tracked. During comprehension, semantically related gestures facilitated both groups’ processing in noise. Facilitation was not modulated by interpreting experience. However, when interpreting noisy speech, interpreters did not benefit from gestures. This suggests that the comprehension component, and specifically crossmodal information processing, in SI differs from that of other language comprehension.
Please use this url to cite or link to this publication:
https://lup.lub.lu.se/record/bbc95243-9c1c-4f3a-ad89-21df03c91e84
- author
- Arbona, Eléonore; Seeber, Kilian G. and Gullberg, Marianne (LU)
- organization
- publishing date
- 2024-04-29
- type
- Contribution to journal
- publication status
- published
- subject
- keywords
- gesture, multimodality, simultaneous interpreting, bilingualism, second language comprehension, eye-tracking, integrated-systems hypothesis, noise
- in
- Language, Cognition and Neuroscience
- volume
- 39
- issue
- 5
- pages
- 25 pages
- publisher
- Taylor & Francis
- external identifiers
- scopus:85192154877
- ISSN
- 2327-3798
- DOI
- 10.1080/23273798.2024.2346924
- project
- Embodied bilingualism (a Wallenberg Scholar project)
- language
- English
- LU publication?
- yes
- id
- bbc95243-9c1c-4f3a-ad89-21df03c91e84
- date added to LUP
- 2024-04-16 22:18:49
- date last changed
- 2024-05-22 15:08:03
@article{bbc95243-9c1c-4f3a-ad89-21df03c91e84,
  abstract  = {{Manual co-speech gestures can facilitate language comprehension, especially in adverse listening conditions. However, we do not know whether gestures influence simultaneous interpreters’ language comprehension in adverse listening conditions, and if so, whether this influence is modulated by interpreting experience, or by active simultaneous interpreting (SI). We exposed 24 interpreters and 24 bilinguals without interpreting experience to utterances with semantically related gestures, semantically unrelated gestures, or without gestures while engaging in comprehension (interpreters and bilinguals) or in SI (interpreters only). Tasks were administered in clear and noisy speech. Accuracy and reaction time were measured, and participants’ gaze was tracked. During comprehension, semantically related gestures facilitated both groups’ processing in noise. Facilitation was not modulated by interpreting experience. However, when interpreting noisy speech, interpreters did not benefit from gestures. This suggests that the comprehension component, and specifically crossmodal information processing, in SI differs from that of other language comprehension.}},
  author    = {{Arbona, Eléonore and Seeber, Kilian G. and Gullberg, Marianne}},
  issn      = {{2327-3798}},
  keywords  = {{gesture; multimodality; simultaneous interpreting; bilingualism; second language comprehension; eye-tracking; integrated-systems hypothesis; noise}},
  language  = {{eng}},
  month     = {{04}},
  number    = {{5}},
  pages     = {{584--608}},
  publisher = {{Taylor \& Francis}},
  journal   = {{Language, Cognition and Neuroscience}},
  title     = {{The role of semantically related gestures in the language comprehension of simultaneous interpreters in noise}},
  url       = {{http://dx.doi.org/10.1080/23273798.2024.2346924}},
  doi       = {{10.1080/23273798.2024.2346924}},
  volume    = {{39}},
  year      = {{2024}},
}