MPIIGaze: Real-World Dataset and Deep Appearance-Based Gaze Estimation

11/24/2017
by   Xucong Zhang, et al.

Learning-based methods are believed to work well for unconstrained gaze estimation, i.e. gaze estimation from a monocular RGB camera without assumptions regarding user, environment, or camera. However, current gaze datasets were collected under laboratory conditions, and methods were not evaluated across multiple datasets. Our work makes three contributions towards addressing these limitations. First, we present MPIIGaze, a dataset that contains 213,659 full face images and corresponding ground-truth gaze positions collected from 15 users during everyday laptop use over several months. An experience sampling approach ensured continuous gaze and head poses and realistic variation in eye appearance and illumination. To facilitate cross-dataset evaluations, 37,667 images were manually annotated with eye corners, mouth corners, and pupil centres. Second, we present an extensive evaluation of state-of-the-art gaze estimation methods on three current datasets, including MPIIGaze. We study key challenges including target gaze range, illumination conditions, and facial appearance variation. We show that image resolution and the use of both eyes affect gaze estimation performance, while head pose and pupil centre information are less informative. Finally, we propose GazeNet, the first deep appearance-based gaze estimation method. GazeNet improves the state of the art by 22% (down to 10.8 degrees) for the most challenging cross-dataset evaluation.
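The abstract does not give implementation details, but as a rough, non-authoritative sketch of what appearance-based gaze regression of this kind looks like in code, the following PyTorch snippet pairs a small CNN over a grey-scale eye patch with the 2D head-pose angles and computes the angular error in degrees, the metric quoted above. The layer sizes, the 36x60 input resolution, and the angle-to-vector convention are illustrative assumptions, not the authors' GazeNet implementation.

    # Minimal sketch of an appearance-based gaze estimator: a small CNN over a
    # grey-scale eye patch, with the 2D head-pose angles concatenated before the
    # final regression layer. All names and sizes are illustrative assumptions.
    import torch
    import torch.nn as nn


    class GazeRegressor(nn.Module):
        def __init__(self):
            super().__init__()
            # Convolutional features for a 36x60 grey-scale eye image
            # (a resolution commonly used for normalised eye patches).
            self.features = nn.Sequential(
                nn.Conv2d(1, 20, kernel_size=5), nn.ReLU(), nn.MaxPool2d(2),
                nn.Conv2d(20, 50, kernel_size=5), nn.ReLU(), nn.MaxPool2d(2),
            )
            self.fc = nn.Sequential(nn.Flatten(), nn.Linear(50 * 6 * 12, 500), nn.ReLU())
            # Regression head: eye features + 2D head pose -> 2D gaze (pitch, yaw).
            self.head = nn.Linear(500 + 2, 2)

        def forward(self, eye_image, head_pose):
            x = self.fc(self.features(eye_image))
            return self.head(torch.cat([x, head_pose], dim=1))


    def angular_error_deg(pred, target):
        """Mean angular error in degrees between predicted and ground-truth gaze,
        both given as (pitch, yaw) angles in radians."""
        def to_vec(py):
            # One common convention for converting (pitch, yaw) to a unit 3D vector.
            pitch, yaw = py[:, 0], py[:, 1]
            return torch.stack([
                -torch.cos(pitch) * torch.sin(yaw),
                -torch.sin(pitch),
                -torch.cos(pitch) * torch.cos(yaw),
            ], dim=1)

        cos = nn.functional.cosine_similarity(to_vec(pred), to_vec(target), dim=1)
        return torch.rad2deg(torch.acos(cos.clamp(-1.0, 1.0))).mean()


    if __name__ == "__main__":
        model = GazeRegressor()
        eyes = torch.randn(8, 1, 36, 60)   # batch of normalised eye patches
        poses = torch.randn(8, 2)          # head pose (pitch, yaw) in radians
        gaze = model(eyes, poses)          # predicted (pitch, yaw) in radians
        print(angular_error_deg(gaze, torch.zeros_like(gaze)))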

Related research

Appearance-Based Gaze Estimation in the Wild (04/11/2015)
Appearance-based gaze estimation is believed to work well in real-world ...

It's Written All Over Your Face: Full-Face Appearance-Based Gaze Estimation (11/27/2016)
Eye gaze is an important non-verbal cue for human affect analysis. Recen...

Accurate Gaze Estimation using an Active-gaze Morphable Model (01/30/2023)
Rather than regressing gaze direction directly from images, we show that...

3D Gaze Estimation from 2D Pupil Positions on Monocular Head-Mounted Eye Trackers (01/11/2016)
3D gaze information is important for scene-centric attention analysis bu...

Evaluation of Appearance-Based Methods and Implications for Gaze-Based Applications (01/30/2019)
Appearance-based gaze estimation methods that only require an off-the-sh...

Learning to Find Eye Region Landmarks for Remote Gaze Estimation in Unconstrained Settings (05/12/2018)
Conventional feature-based and model-based gaze estimation methods have ...

Driver Gaze Estimation in the Real World: Overcoming the Eyeglass Challenge (02/06/2020)
A driver's gaze is critical for determining the driver's attention level...
