Motion capture-based animated characters for the study of speech-gesture integration
(2020) In Behavior Research Methods 52(3), pp. 1339–1354
- Abstract
- Digitally animated characters are promising tools in research studying how we integrate information from speech and visual sources such as gestures because they allow specific gesture features to be manipulated in isolation. We present an approach combining motion capture and 3D-animated characters that allows us to manipulate natural individual gesture strokes for experimental purposes, for example to temporally shift and present gestures in ecologically valid sequences. We exemplify how such stimuli can be used in an experiment investigating implicit detection of speech–gesture (a)synchrony, and discuss the general applicability of the workflow for research in this domain.
Please use this url to cite or link to this publication:
https://lup.lub.lu.se/record/f83b1030-ee90-46be-b81e-383767779fa0
- author
- Nirme, Jens; Haake, Magnus; Gulz, Agneta and Gullberg, Marianne
- publishing date
- 2020-06
- type
- Contribution to journal
- publication status
- published
- subject
- keywords
- Crossmodal information processing, Gesture, Speech-gesture integration, Motion Capture, multimodal information processing, Method development
- in
- Behavior Research Methods
- volume
- 52
- issue
- 3
- pages
- 16 pages
- publisher
- Springer
- external identifiers
- scopus:85076623353
- pmid:31823225
- ISSN
- 1554-3528
- DOI
- 10.3758/s13428-019-01319-w
- language
- English
- LU publication?
- yes
- id
- f83b1030-ee90-46be-b81e-383767779fa0
- date added to LUP
- 2019-10-21 10:35:47
- date last changed
- 2023-11-19 17:00:52
@article{f83b1030-ee90-46be-b81e-383767779fa0,
  abstract  = {{Digitally animated characters are promising tools in research studying how we integrate information from speech and visual sources such as gestures because they allow specific gesture features to be manipulated in isolation. We present an approach combining motion capture and 3D-animated characters that allows us to manipulate natural individual gesture strokes for experimental purposes, for example to temporally shift and present gestures in ecologically valid sequences. We exemplify how such stimuli can be used in an experiment investigating implicit detection of speech–gesture (a)synchrony, and discuss the general applicability of the workflow for research in this domain.}},
  author    = {{Nirme, Jens and Haake, Magnus and Gulz, Agneta and Gullberg, Marianne}},
  issn      = {{1554-3528}},
  keywords  = {{Crossmodal information processing; Gesture; Speech-gesture integration; Motion Capture; multimodal information processing; Method development}},
  language  = {{eng}},
  number    = {{3}},
  pages     = {{1339--1354}},
  publisher = {{Springer}},
  series    = {{Behavior Research Methods}},
  title     = {{Motion capture-based animated characters for the study of speech-gesture integration}},
  url       = {{http://dx.doi.org/10.3758/s13428-019-01319-w}},
  doi       = {{10.3758/s13428-019-01319-w}},
  volume    = {{52}},
  year      = {{2020}},
}