Virtual reality facial emotion recognition in social environments: An eye-tracking study
(2021) In Internet Interventions 25.
- abstract
Background: Virtual reality (VR) enables the administration of realistic and dynamic stimuli within a social context for the assessment and training of emotion recognition. We tested a novel VR emotion recognition task by comparing emotion recognition across a VR, video and photo task, investigating covariates of recognition and exploring visual attention in VR. Methods: Healthy individuals (n = 100) completed three emotion recognition tasks: a photo, video and VR task. During the VR task, emotions of virtual characters (avatars) in a VR street environment were rated, and eye-tracking was recorded in VR. Results: Recognition accuracy in VR (overall 75%) was comparable to the photo and video tasks. However, there were some differences: disgust and happiness had lower accuracy rates in VR, and better accuracy was achieved for surprise and anger in VR compared to the video task. Participants spent more time identifying disgust, fear and sadness than surprise and happiness. In general, attention was directed longer to the eye and nose areas than to the mouth. Discussion: Immersive VR tasks can be used for training and assessment of emotion recognition. VR enables easily controllable avatars within environments relevant for daily life. Validated emotional expressions and tasks will be of relevance for clinical applications.
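The abstract reports per-emotion recognition accuracy and eye-tracking dwell times over facial regions. As a purely illustrative sketch (not the authors' analysis code; the trial structure, field names and area-of-interest labels below are assumptions), the kind of tabulation involved might look like this in Python:

# Hypothetical sketch: per-emotion recognition accuracy and area-of-interest (AOI)
# dwell-time proportions from trial-level eye-tracking output. All values illustrative.
from collections import defaultdict

# Each trial: which emotion was shown, whether it was labelled correctly,
# and total fixation time (ms) per AOI (eyes / nose / mouth).
trials = [
    {"emotion": "anger",   "correct": True,  "dwell": {"eyes": 820, "nose": 310, "mouth": 140}},
    {"emotion": "disgust", "correct": False, "dwell": {"eyes": 640, "nose": 420, "mouth": 260}},
    {"emotion": "anger",   "correct": True,  "dwell": {"eyes": 700, "nose": 350, "mouth": 190}},
]

# Per-emotion recognition accuracy: proportion of correctly labelled trials.
hits, counts = defaultdict(int), defaultdict(int)
for t in trials:
    counts[t["emotion"]] += 1
    hits[t["emotion"]] += t["correct"]
accuracy = {emo: hits[emo] / counts[emo] for emo in counts}

# AOI dwell-time proportions pooled over all trials.
dwell_totals = defaultdict(float)
for t in trials:
    for aoi, ms in t["dwell"].items():
        dwell_totals[aoi] += ms
grand_total = sum(dwell_totals.values())
dwell_share = {aoi: ms / grand_total for aoi, ms in dwell_totals.items()}

print(accuracy)     # e.g. {'anger': 1.0, 'disgust': 0.0}
print(dwell_share)  # e.g. {'eyes': 0.56, 'nose': 0.28, 'mouth': 0.15} (rounded)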
- author
- Geraets, C. N.W. LU ; Klein Tuente, S. LU ; Lestestuiver, B. P. ; van Beilen, M. ; Nijman, S. A. ; Marsman, J. B.C. and Veling, W.
- publishing date
- 2021-09
- type
- Contribution to journal
- publication status
- published
- subject
- keywords
- Affect, Avatars, Emotion, Emotion recognition, Eye-tracking, Virtual reality
- in
- Internet Interventions
- volume
- 25
- article number
- 100432
- publisher
- Elsevier
- external identifiers
- scopus:85110623695
- ISSN
- 2214-7829
- DOI
- 10.1016/j.invent.2021.100432
- language
- English
- LU publication?
- no
- additional info
- Publisher Copyright: © 2021 The Authors
- id
- b6a9b534-12c7-4259-9e8e-e1ec741af1d1
- date added to LUP
- 2024-10-21 10:45:23
- date last changed
- 2025-04-04 14:37:32
@article{b6a9b534-12c7-4259-9e8e-e1ec741af1d1,
  abstract  = {{<p>Background: Virtual reality (VR) enables the administration of realistic and dynamic stimuli within a social context for the assessment and training of emotion recognition. We tested a novel VR emotion recognition task by comparing emotion recognition across a VR, video and photo task, investigating covariates of recognition and exploring visual attention in VR. Methods: Healthy individuals (n = 100) completed three emotion recognition tasks: a photo, video and VR task. During the VR task, emotions of virtual characters (avatars) in a VR street environment were rated, and eye-tracking was recorded in VR. Results: Recognition accuracy in VR (overall 75%) was comparable to the photo and video tasks. However, there were some differences: disgust and happiness had lower accuracy rates in VR, and better accuracy was achieved for surprise and anger in VR compared to the video task. Participants spent more time identifying disgust, fear and sadness than surprise and happiness. In general, attention was directed longer to the eye and nose areas than to the mouth. Discussion: Immersive VR tasks can be used for training and assessment of emotion recognition. VR enables easily controllable avatars within environments relevant for daily life. Validated emotional expressions and tasks will be of relevance for clinical applications.</p>}},
  author    = {{Geraets, C. N.W. and Klein Tuente, S. and Lestestuiver, B. P. and van Beilen, M. and Nijman, S. A. and Marsman, J. B.C. and Veling, W.}},
  issn      = {{2214-7829}},
  keywords  = {{Affect; Avatars; Emotion; Emotion recognition; Eye-tracking; Virtual reality}},
  language  = {{eng}},
  publisher = {{Elsevier}},
  series    = {{Internet Interventions}},
  title     = {{Virtual reality facial emotion recognition in social environments: An eye-tracking study}},
  url       = {{http://dx.doi.org/10.1016/j.invent.2021.100432}},
  doi       = {{10.1016/j.invent.2021.100432}},
  volume    = {{25}},
  year      = {{2021}},
}