
Lund University Publications


Gaze-based attention network analysis in a virtual reality classroom

Stark, Philipp; Hasenbein, Lisa; Kasneci, Enkelejda and Göllner, Richard (2024) In MethodsX 12.
Abstract

This article provides a step-by-step guideline for measuring and analyzing visual attention in 3D virtual reality (VR) environments based on eye-tracking data. We propose a solution to the challenges of obtaining relevant eye-tracking information in a dynamic 3D virtual environment and calculating interpretable indicators of learning and social behavior. With a method called "gaze-ray casting," we simulated 3D-gaze movements to obtain information about the gazed objects. This information was used to create graphical models of visual attention, establishing attention networks. These networks represented participants' gaze transitions between different entities in the VR environment over time. Measures of centrality, distribution, and interconnectedness of the networks were calculated to describe the network structure. The measures, derived from graph theory, allowed for statistical inference testing and the interpretation of participants' visual attention in 3D VR environments. Our method provides useful insights when analyzing students' learning in a VR classroom, as reported in a corresponding evaluation article with N = 274 participants.
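The "gaze-ray casting" step described above amounts to intersecting a ray (from the headset's gaze origin along the gaze direction) with the objects in the scene and taking the nearest hit. The article implements this inside the Unreal Engine; the following is only a minimal geometric sketch in Python, assuming each entity is approximated by a bounding sphere (`label`, `center`, `radius`) — the names and the sphere simplification are illustrative, not the authors' implementation.

```python
import math

def first_gazed_object(origin, direction, objects):
    """Return the label of the nearest object hit by the gaze ray, or None.

    origin, direction: 3-tuples; direction need not be unit length.
    objects: iterable of (label, center, radius) sphere proxies.
    Only the near intersection in front of the viewer (t >= 0) is considered.
    """
    norm = math.sqrt(sum(d * d for d in direction))
    d = tuple(c / norm for c in direction)  # unit gaze direction
    best_t, best_label = math.inf, None
    for label, center, radius in objects:
        oc = tuple(o - c for o, c in zip(origin, center))  # origin - center
        # ray-sphere intersection: t^2 + b*t + c = 0 (with unit direction)
        b = 2 * sum(di * oi for di, oi in zip(d, oc))
        c = sum(o * o for o in oc) - radius * radius
        disc = b * b - 4 * c
        if disc < 0:
            continue  # ray misses this sphere
        t = (-b - math.sqrt(disc)) / 2  # near intersection
        if 0 <= t < best_t:
            best_t, best_label = t, label
    return best_label
```

Logging the returned label per eye-tracking sample yields the time series of gazed entities from which the attention networks are built.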
•Guidelines on implementing gaze-ray casting in VR using the Unreal Engine and the HTC VIVE Pro Eye.
•Creating gaze-based attention networks and analyzing their network structure.
•Implementation tutorials and the Open Source software code are provided via OSF: https://osf.io/pxjrc/?view_only=1b6da45eb93e4f9eb7a138697b941198.
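The attention-network construction can be sketched as follows: collapse the per-sample sequence of gazed entities into a sequence of distinct fixated entities, count transitions between consecutive entities as weighted directed edges, and compute graph-theoretic summary measures. This is a self-contained illustration, assuming hypothetical entity labels; degree centrality and edge density stand in for the centrality, distribution, and interconnectedness measures named in the abstract, and the full analysis code is in the OSF repository.

```python
from collections import Counter

def attention_network(gaze_sequence):
    """Directed, weighted transition network from a sequence of gazed entities.

    Consecutive duplicates are collapsed first, so edge weights count
    transitions between distinct entities rather than fixation duration.
    """
    collapsed = [gaze_sequence[0]]
    for label in gaze_sequence[1:]:
        if label != collapsed[-1]:
            collapsed.append(label)
    # (source, target) -> number of transitions
    return Counter(zip(collapsed, collapsed[1:]))

def degree_centrality(edges):
    """Distinct neighbours (in or out) per node, normalised by n - 1."""
    nodes = {n for e in edges for n in e}
    neighbours = {n: set() for n in nodes}
    for src, dst in edges:
        neighbours[src].add(dst)
        neighbours[dst].add(src)
    n = len(nodes)
    return {v: len(nb) / (n - 1) for v, nb in neighbours.items()}

def density(edges):
    """Realised fraction of the n*(n-1) possible directed edges."""
    nodes = {n for e in edges for n in e}
    n = len(nodes)
    return len(edges) / (n * (n - 1))
```

For example, the sample sequence `["teacher", "teacher", "board", "peer", "teacher"]` collapses to `teacher → board → peer → teacher`, a three-node network whose density and centralities can then be compared across participants with standard inference tests.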

author
Stark, Philipp; Hasenbein, Lisa; Kasneci, Enkelejda and Göllner, Richard
publishing date
2024
type
Contribution to journal
publication status
published
subject
keywords
Network analysis, Eye tracking, Virtual Reality, Visual attention, Graph theory
in
MethodsX
volume
12
article number
102662
publisher
Elsevier
external identifiers
  • scopus:85189352809
  • pmid:38577409
ISSN
2215-0161
DOI
10.1016/j.mex.2024.102662
language
English
LU publication?
no
additional info
© 2024 The Authors.
id
ebec0488-bf01-4b52-b1ca-76ad459df7a8
date added to LUP
2024-10-15 08:43:50
date last changed
2025-06-12 00:01:06
@article{ebec0488-bf01-4b52-b1ca-76ad459df7a8,
  abstract     = {{<p>This article provides a step-by-step guideline for measuring and analyzing visual attention in 3D virtual reality (VR) environments based on eye-tracking data. We propose a solution to the challenges of obtaining relevant eye-tracking information in a dynamic 3D virtual environment and calculating interpretable indicators of learning and social behavior. With a method called "gaze-ray casting," we simulated 3D-gaze movements to obtain information about the gazed objects. This information was used to create graphical models of visual attention, establishing attention networks. These networks represented participants' gaze transitions between different entities in the VR environment over time. Measures of centrality, distribution, and interconnectedness of the networks were calculated to describe the network structure. The measures, derived from graph theory, allowed for statistical inference testing and the interpretation of participants' visual attention in 3D VR environments. Our method provides useful insights when analyzing students' learning in a VR classroom, as reported in a corresponding evaluation article with N = 274 participants. <br/>•Guidelines on implementing gaze-ray casting in VR using the Unreal Engine and the HTC VIVE Pro Eye.<br/>•Creating gaze-based attention networks and analyzing their network structure.<br/>•Implementation tutorials and the Open Source software code are provided via OSF: https://osf.io/pxjrc/?view_only=1b6da45eb93e4f9eb7a138697b941198. </p>}},
  author       = {{Stark, Philipp and Hasenbein, Lisa and Kasneci, Enkelejda and Göllner, Richard}},
  issn         = {{2215-0161}},
  keywords     = {{Network analysis; Eye tracking; Virtual Reality; Visual attention; Graph theory}},
  language     = {{eng}},
  publisher    = {{Elsevier}},
  series       = {{MethodsX}},
  title        = {{Gaze-based attention network analysis in a virtual reality classroom}},
  url          = {{http://dx.doi.org/10.1016/j.mex.2024.102662}},
  doi          = {{10.1016/j.mex.2024.102662}},
  volume       = {{12}},
  year         = {{2024}},
}