Practical Saccade Prediction for Head-Mounted Displays: Towards a Comprehensive Model

05/03/2022
by Elena Arabadzhiyska, et al.

Eye-tracking technology is an integral component of new display devices such as virtual and augmented reality headsets. Applications of gaze information range from new interaction techniques exploiting eye patterns to gaze-contingent digital content creation. However, system latency is still a significant issue in many of these applications because it breaks the synchronization between the current and the measured gaze positions. Consequently, it may lead to unwanted visual artifacts and degradation of the user experience. In this work, we focus on foveated rendering applications, where the quality of an image is reduced towards the periphery for computational savings. In foveated rendering, the presence of latency leads to delayed updates to the rendered frame, making the quality degradation visible to the user. To address this issue and to combat system latency, recent work proposes to use saccade landing position prediction to extrapolate the gaze information from delayed eye-tracking samples. While the benefits of such a strategy have already been demonstrated, the existing solutions range from simple and efficient ones, which make several assumptions about saccadic eye movements, to more complex and costly ones that use machine learning techniques. Yet, it is unclear to what extent the prediction can benefit from accounting for additional factors. This paper presents a series of experiments investigating the importance of different factors for saccade prediction in common virtual and augmented reality applications. In particular, we investigate the effects of saccade orientation in 3D space and smooth pursuit eye movements (SPEM) and how their influence compares to the variability across users. We also present a simple yet efficient correction method that adapts existing saccade prediction methods to handle these factors without requiring extensive data collection.
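To make the idea of saccade landing position prediction concrete, the sketch below illustrates the general strategy described in the abstract, not the authors' actual method: it fits a simple parametric displacement model to the first few (delayed) eye-tracking samples of a detected saccade and extrapolates the landing amplitude. The exponential model, the function names, and the sample values are illustrative assumptions.

```python
import numpy as np
from scipy.optimize import curve_fit

def saccade_position(t, amplitude, tau):
    """Simplified parametric saccade displacement model (assumption): an
    exponential approach to the final landing amplitude."""
    return amplitude * (1.0 - np.exp(-t / tau))

def predict_landing(timestamps_ms, displacements_deg):
    """Fit the parametric model to the partial, delayed saccade samples and
    return the extrapolated landing amplitude in degrees."""
    t = np.asarray(timestamps_ms, dtype=float)
    d = np.asarray(displacements_deg, dtype=float)
    t = t - t[0]  # time relative to the detected saccade onset
    # Initial guesses: amplitude ~ last observed displacement, tau ~ 30 ms
    p0 = (max(d[-1], 1e-3), 30.0)
    (amplitude, _tau), _ = curve_fit(saccade_position, t, d, p0=p0, maxfev=2000)
    return amplitude

# Hypothetical samples from the first ~40 ms of a saccade
t_ms = [0, 8, 16, 24, 32, 40]
disp = [0.0, 1.2, 3.1, 5.4, 7.2, 8.5]  # displacement along the saccade direction (deg)
print(f"Predicted landing amplitude: {predict_landing(t_ms, disp):.1f} deg")
```

In a foveated rendering loop, a prediction of this kind could be used to place the high-quality region at the expected landing position before the delayed gaze samples catch up; the paper's contribution concerns which factors (saccade orientation in 3D, SPEM, user variability) such predictors should account for and how to correct for them.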


Related research

Slippage-robust Gaze Tracking for Near-eye Display (10/20/2022)
Would Gaze-Contingent Rendering Improve Depth Perception in Virtual and Augmented Reality? (05/24/2019)
A Statistical Approach to Continuous Self-Calibrating Eye Gaze Tracking for Head-Mounted Virtual Reality Systems (12/20/2016)
Assessing Augmented Reality Selection Techniques for Passengers in Moving Vehicles: A Real-World User Study (07/12/2023)
Photosensor Oculography: Survey and Parametric Analysis of Designs using Model-Based Simulation (07/17/2017)
Optical Gaze Tracking with Spatially-Sparse Single-Pixel Detectors (09/15/2020)
A privacy-preserving approach to streaming eye-tracking data (02/02/2021)
