
Lund University Publications


gazeMapper: A tool for automated world-based analysis of gaze data from one or multiple wearable eye trackers

Niehorster, Diederick C.; Hessels, Roy S.; Nyström, Marcus; Benjamins, Jeroen S. and Hooge, Ignace T. C. (2025). In Behavior Research Methods, 57(7).
Abstract
The problem: wearable eye trackers deliver eye-tracking data on a scene video that is acquired by a camera affixed to the participant’s head. Analyzing and interpreting such head-centered data is difficult and laborious manual work. Automated methods to map eye-tracking data to a world-centered reference frame (e.g., screens and tabletops) are available. These methods usually make use of fiducial markers. However, such mapping methods may be difficult to implement, expensive, and eye tracker-specific. The solution: here we present gazeMapper, an open-source tool for automated mapping and processing of eye-tracking data. gazeMapper can: (1) transform head-centered data to planes in the world, (2) synchronize recordings from multiple participants, (3) determine data quality measures, e.g., accuracy and precision. gazeMapper comes with a GUI application (Windows, macOS, and Linux) and supports 11 different wearable eye trackers from AdHawk, Meta, Pupil, SeeTrue, SMI, Tobii, and Viewpointsystem. It is also possible to sidestep the GUI and use gazeMapper as a Python library directly.
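The core operation described above, projecting gaze from the head-centered scene video onto a world-fixed plane via fiducial markers, can be sketched as follows. This is a minimal illustration using OpenCV, not gazeMapper's own code: the marker-corner positions, plane dimensions, and gaze samples are made-up values, and in practice the image-space corners would come from a marker detector (e.g., ArUco) run on every scene-video frame.

  # Minimal sketch (not gazeMapper's code) of world-based mapping of gaze:
  # gaze recorded in the scene-camera image is projected onto a world-fixed
  # plane (e.g., a screen) via a homography estimated from marker corners.
  import numpy as np
  import cv2

  # Pixel positions of four marker corners detected in one scene-video frame
  # (example values; normally produced by a fiducial-marker detector).
  corners_image = np.array([[212, 148], [1065, 131], [1098, 662], [187, 671]],
                           dtype=np.float32)

  # The same corners in the plane's own coordinate system (mm), i.e., where
  # the markers are physically placed on the screen or tabletop.
  corners_plane = np.array([[0, 0], [520, 0], [520, 320], [0, 320]],
                           dtype=np.float32)

  # Homography mapping scene-camera pixels to plane coordinates for this frame.
  H, _ = cv2.findHomography(corners_image, corners_plane)

  # Head-centered gaze samples (pixels in the scene video) for this frame.
  gaze_image = np.array([[[640.0, 400.0]], [[655.2, 403.9]]], dtype=np.float32)

  # World-based gaze: positions on the plane, independent of head movement.
  gaze_plane = cv2.perspectiveTransform(gaze_image, H).reshape(-1, 2)
  print(gaze_plane)  # e.g., [[x_mm, y_mm], ...]

gazeMapper automates this per-frame detection and mapping across entire recordings, and additionally handles synchronization across participants and computation of data quality measures; consult its documentation for the actual API.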
author: Niehorster, Diederick C.; Hessels, Roy S.; Nyström, Marcus; Benjamins, Jeroen S. and Hooge, Ignace T. C.
publishing date: 2025
type: Contribution to journal
publication status: published
keywords: Eye tracking, Wearable eye tracking, Mobile eye tracking, Eye movements, Gaze, Data quality, Head-fixed reference frame, World-fixed reference frame, Plane, Surface, Tool
in: Behavior Research Methods
volume: 57
issue: 7
article number: 188
pages: 18
publisher: Springer
external identifiers: pmid:40461911
ISSN: 1554-3528
DOI: 10.3758/s13428-025-02704-4
language: English
LU publication?: yes
id: 3ed33ef4-4140-4ab4-84e3-0b8df3958b90
date added to LUP: 2025-06-04 20:47:52
date last changed: 2025-06-17 09:23:19
@article{3ed33ef4-4140-4ab4-84e3-0b8df3958b90,
  abstract     = {{The problem: wearable eye trackers deliver eye-tracking data on a scene video that is acquired by a camera affixed to the participant’s head. Analyzing and interpreting such head-centered data is difficult and laborious manual work. Automated methods to map eye-tracking data to a world-centered reference frame (e.g., screens and tabletops) are available. These methods usually make use of fiducial markers. However, such mapping methods may be difficult to implement, expensive, and eye tracker-specific. The solution: here we present gazeMapper, an open-source tool for automated mapping and processing of eye-tracking data. gazeMapper can: (1) Transform head-centered data to planes in the world, (2) synchronize recordings from multiple participants, (3) determine data quality measures, e.g., accuracy and precision. gazeMapper comes with a GUI application (Windows, macOS, and Linux) and supports 11 different wearable eye trackers from AdHawk, Meta, Pupil, SeeTrue, SMI, Tobii, and Viewpointsystem. It is also possible to sidestep the GUI and use gazeMapper as a Python library directly.}},
  author       = {{Niehorster, Diederick C. and Hessels, Roy S. and Nyström, Marcus and Benjamins, Jeroen S. and Hooge, Ignace T. C.}},
  issn         = {{1554-3528}},
  keywords     = {{Eye tracking; Wearable eye tracking; Mobile eye tracking; Eye movements; Gaze; Data quality; Head-fixed reference frame; World-fixed reference frame; Plane; Surface; Tool}},
  language     = {{eng}},
  number       = {{7}},
  publisher    = {{Springer}},
  series       = {{Behavior Research Methods}},
  title        = {{gazeMapper : A tool for automated world-based analysis of gaze data from one or multiple wearable eye trackers}},
  url          = {{http://dx.doi.org/10.3758/s13428-025-02704-4}},
  doi          = {{10.3758/s13428-025-02704-4}},
  volume       = {{57}},
  year         = {{2025}},
}