Integrating Electroencephalography in Virtual Reality Emergency Evacuation Scenarios
(2025) MAMM15 20251
Department of Design Sciences
- Abstract
- This thesis explores how electroencephalography (EEG) can be combined with virtual reality (VR) to study people’s responses during emergency evacuation scenarios. This is demonstrated through a pilot study in which participants are exposed to different types of alarm, visual (flashing lights) and auditory (alarm sounds), while their cognitive load and behavioral responses are observed.
A VR fire evacuation scene was developed using the Unity game engine (version 2022.3.47f1) and deployed with the HTC Vive Pro Eye headset and OpenBCI EEG equipment. A total of 21 participants took part in the pilot study. Each person experienced either a visual or an auditory alarm while their brain activity was recorded.
This study shows that EEG can be used in VR to measure how people react to emergency alarms. This exploratory study suggests that sound may be more effective than light in triggering quick responses. Future research could test combinations of alarms and use larger participant groups to confirm these results.
Please use this url to cite or link to this publication:
http://lup.lub.lu.se/student-papers/record/9196289
- author
- Ghazaryan, Anush and Zhao, Yuxi
- supervisor
- organization
- course
- MAMM15 20251
- year
- 2025
- type
- H2 - Master's Degree (Two Years)
- subject
- keywords
- Virtual Reality, VR, Electroencephalography, EEG, Evacuation
- language
- English
- id
- 9196289
- date added to LUP
- 2025-06-11 13:55:54
- date last changed
- 2025-06-11 13:55:54
@misc{9196289,
  abstract = {{This thesis explores how electroencephalography (EEG) can be combined with virtual reality (VR) to study people’s responses during emergency evacuation scenarios. This is demonstrated through a pilot study in which people are exposed to different types of alarm: visual (flashing lights) and auditory (alarm sounds), while their cognitive load and behavioral response are observed. A VR fire evacuation scene was developed using Unity (version 2022.3.47f1) game engine and deployed with the HTC Vive Pro Eye headset and OpenBCI EEG equipment. A total of 21 participants took part in the pilot study. Each person experienced either a visual or an auditory alarm while their brain activity was recorded. This study shows that EEG can be used in VR to measure how people react to emergency alarms. This exploratory study shows that sound may be more effective than light in triggering quick responses. Future research could test combinations of alarms and use larger participant groups to confirm these results.}},
  author   = {{Ghazaryan, Anush and Zhao, Yuxi}},
  language = {{eng}},
  note     = {{Student Paper}},
  title    = {{Integrating Electroencephalography in Virtual Reality Emergency Evacuation Scenarios}},
  year     = {{2025}},
}