Neural Networks for Semantic Gaze Analysis in XR Settings

03/18/2021
by Lena Stubbemann, et al.

Virtual-reality (VR) and augmented-reality (AR) technology is increasingly combined with eye-tracking. This combination broadens both fields and opens up new areas of application, in which visual perception and related cognitive processes can be studied in interactive but still well-controlled settings. However, performing a semantic gaze analysis of eye-tracking data from interactive three-dimensional scenes is a resource-intensive task, which has so far been an obstacle to its economical use. In this paper we present a novel approach that minimizes the time and information necessary to annotate volumes of interest (VOIs) by using techniques from object recognition. To do so, we train convolutional neural networks (CNNs) on synthetic data sets derived from virtual models using image augmentation techniques. We evaluate our method in real and virtual environments, showing that it can compete with state-of-the-art approaches without relying on additional markers or preexisting databases, while offering cross-platform use.
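The core idea of generating synthetic training data from virtual models can be illustrated with a minimal, stdlib-only sketch. The paper does not specify its augmentation pipeline; the transformations below (horizontal flip and brightness jitter, applied to a grayscale image represented as nested lists) and the function names `augment` and `make_training_set` are illustrative assumptions, not the authors' implementation.

```python
import random

def augment(image, rng):
    """Apply simple label-preserving augmentations to a grayscale image
    (a list of rows of pixel intensities in [0, 255])."""
    out = [row[:] for row in image]
    # Random horizontal flip: the identity of the rendered VOI is
    # unchanged by mirroring, so the label stays valid.
    if rng.random() < 0.5:
        out = [row[::-1] for row in out]
    # Random brightness jitter, clamped to the valid range, to mimic
    # lighting differences between synthetic renders and real footage.
    delta = rng.randint(-30, 30)
    out = [[max(0, min(255, p + delta)) for p in row] for row in out]
    return out

def make_training_set(render, n_variants, seed=42):
    """Expand one synthetic render of a VOI into n augmented samples,
    which could then serve as CNN training data for that VOI's class."""
    rng = random.Random(seed)
    return [augment(render, rng) for _ in range(n_variants)]
```

In practice one would use an image-augmentation library and many more transformations (rotation, scaling, noise), but the principle is the same: a handful of renders from the virtual model is expanded into a training set large enough for the CNN, so no manual annotation of real footage is needed.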


Related research

11/23/2022 · Assessment of Human Behavior in Virtual Reality by Eye Tracking
Virtual reality (VR) is not a new technology but has been in development...

05/24/2019 · Would Gaze-Contingent Rendering Improve Depth Perception in Virtual and Augmented Reality?
Near distances are overestimated in virtual reality, and far distances a...

03/19/2023 · 3D Gaze Vis: Sharing Eye Tracking Data Visualization for Collaborative Work in VR Environment
Conducting collaborative tasks, e.g., multi-user game, in virtual realit...

05/08/2020 · OpenEDS2020: Open Eyes Dataset
We present the second edition of OpenEDS dataset, OpenEDS2020, a novel d...

05/09/2022 · Identifying Fixation and Saccades in Virtual Reality
Gaze recognition can significantly reduce the amount of eye movement dat...

07/04/2022 · GazBy: Gaze-Based BERT Model to Incorporate Human Attention in Neural Information Retrieval
This paper is interested in investigating whether human gaze signals can...
