Appearance-Based 3D Gaze Estimation with Personal Calibration

07/02/2018
by Erik Lindén, et al.

We propose a way to incorporate personal calibration into a deep learning model for video-based gaze estimation. Using our method, we show that by calibrating six parameters per person, accuracy can be improved by a factor of 2.2 to 2.5. The number of personal parameters, three per eye, is similar to the number predicted by geometrical models. When evaluated on the MPIIGaze dataset, our estimator performs better than person-specific estimators. To improve generalization, we predict gaze rays in 3D (origin and direction of gaze). In existing datasets, the 3D gaze is underdetermined, since all gaze targets are in the same plane as the camera. Experiments on synthetic data suggest it would be possible to learn accurate 3D gaze from only annotated gaze targets, without annotated eye positions.
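The underdetermination of 3D gaze mentioned above can be made concrete: when every annotated gaze target lies in the camera plane, shifting the ray origin along the gaze direction reproduces the exact same target, so target supervision alone cannot pin down the eye position. A minimal plain-Python sketch (the camera plane taken as z = 0 and all coordinates are illustrative assumptions, not values from the paper):

```python
# Sketch: two different ray origins on the same viewing line hit the
# same 2D target, so annotated targets alone leave the origin free.

def intersect_plane_z0(origin, direction):
    """Intersect the ray origin + t * direction with the plane z = 0."""
    ox, oy, oz = origin
    dx, dy, dz = direction
    t = -oz / dz                      # ray parameter where z reaches 0
    return (ox + t * dx, oy + t * dy, 0.0)

# Two candidate eye positions along one viewing line (hypothetical values):
d = (0.1, -0.2, -1.0)                 # gaze direction toward the screen
o1 = (0.0, 0.0, 0.6)                  # origin 0.6 units from the plane
o2 = (0.05, -0.1, 0.1)                # o1 moved 0.5 units along d

t1 = intersect_plane_z0(o1, d)
t2 = intersect_plane_z0(o2, d)
# t1 == t2: a loss on gaze targets in this plane cannot tell o1 from o2.
```

This is why a loss on in-plane targets constrains the gaze line but not the origin along it; recovering full 3D rays requires extra geometric structure, which the paper's synthetic-data experiments explore.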


Related research

01/30/2023 · Accurate Gaze Estimation using an Active-gaze Morphable Model
Rather than regressing gaze direction directly from images, we show that...

05/06/2019 · Few-shot Adaptive Gaze Estimation
Inter-personal anatomical differences limit the accuracy of person-indep...

04/24/2019 · Improving Few-Shot User-Specific Gaze Adaptation via Gaze Redirection Synthesis
As an indicator of human attention gaze is a subtle behavioral cue which...

04/03/2023 · Dynamic Accommodation Measurement using Purkinje Images and ML Algorithms
We developed a prototype device for dynamic gaze and accommodation measu...

04/26/2021 · Appearance-based Gaze Estimation With Deep Learning: A Review and Benchmark
Gaze estimation reveals where a person is looking. It is an important cl...

09/27/2021 · Effect Of Personalized Calibration On Gaze Estimation Using Deep-Learning
With the increase in computation power and the development of new state-...
