Open Gaze: Open Source eye tracker for smartphone devices using Deep Learning

by Sushmanth Reddy, et al.

Eye tracking has been a pivotal tool in fields as diverse as vision research, language analysis, and usability assessment. Most prior work, however, has focused on large desktop displays and specialized, costly eye-tracking hardware that does not scale. Remarkably little is known about eye movement patterns on smartphones, despite their widespread adoption and heavy use. In this manuscript, we present an open-source implementation of a smartphone-based gaze tracker that emulates the methodology proposed by a Google paper whose source code remains proprietary. Our focus is on attaining accuracy comparable to that paper's method without any supplementary hardware. By integrating machine learning techniques, we deliver an accurate eye-tracking solution native to smartphones, with precision comparable to state-of-the-art mobile eye trackers that cost two orders of magnitude more. Leveraging the large MIT GazeCapture dataset, available through registration on the dataset's website, we replicate key findings from previous studies on eye movement behavior in oculomotor tasks and on saliency during natural image viewing. We further demonstrate the applicability of smartphone-based gaze tracking to detecting reading comprehension difficulty. Our findings show the potential to scale eye movement research to thousands of participants with explicit consent, fostering advances not only in vision research but also in domains such as accessibility and healthcare.
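The abstract does not detail the model architecture, but smartphone gaze estimators of this kind typically regress an on-screen (x, y) gaze point from camera-derived features such as eye crops and facial landmarks. The following is a minimal, purely illustrative numpy sketch of such a forward pass (conv → ReLU → pooling → landmark fusion → linear head); all layer shapes, parameter names, and the `gaze_forward` function are hypothetical and not taken from the paper.

```python
import numpy as np

def conv2d(img, kernel):
    """Valid-mode 2D cross-correlation for a single-channel image."""
    kh, kw = kernel.shape
    h, w = img.shape
    out = np.zeros((h - kh + 1, w - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(img[i:i + kh, j:j + kw] * kernel)
    return out

def gaze_forward(eye_crop, landmarks, params):
    """Toy gaze regressor: conv -> ReLU -> global average pool ->
    concatenate landmark features -> linear head.
    Returns an (x, y) gaze estimate in normalized screen coordinates."""
    feat = np.maximum(conv2d(eye_crop, params["kernel"]), 0.0)  # conv + ReLU
    pooled = feat.mean()                                        # global average pool
    x = np.concatenate([[pooled], landmarks])                   # fuse image + landmark features
    return params["W"] @ x + params["b"]                        # linear regression head

rng = np.random.default_rng(0)
params = {
    "kernel": rng.standard_normal((3, 3)) * 0.1,
    "W": rng.standard_normal((2, 5)) * 0.1,  # 1 pooled feature + 4 landmark values
    "b": np.zeros(2),
}
eye = rng.random((32, 32))  # stand-in for a 32x32 grayscale eye crop
lm = rng.random(4)          # stand-in for 4 facial-landmark coordinates
gaze = gaze_forward(eye, lm, params)
print(gaze.shape)  # (2,)
```

In practice such a model would be trained end-to-end on a dataset like GazeCapture, with far deeper convolutional towers per eye and per face region; this sketch only illustrates the input/output structure of the task.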


Pupil: An Open Source Platform for Pervasive Eye Tracking and Mobile Gaze-based Interaction

Commercial head-mounted eye trackers provide useful features to customer...

TurkerGaze: Crowdsourcing Saliency with Webcam based Eye Tracking

Traditional eye tracking requires specialized hardware, which means coll...

Automated analysis of eye-tracker-based human-human interaction studies

Mobile eye-tracking systems have been available for about a decade now a...

LEyes: A Lightweight Framework for Deep Learning-Based Eye Tracking using Synthetic Eye Images

Deep learning has bolstered gaze estimation techniques, but real-world d...

MLGaze: Machine Learning-Based Analysis of Gaze Error Patterns in Consumer Eye Tracking Systems

Analyzing the gaze accuracy characteristics of an eye tracker is a criti...

GazeBase: A Large-Scale, Multi-Stimulus, Longitudinal Eye Movement Dataset

This manuscript presents GazeBase, a large-scale longitudinal dataset co...

PESAO: Psychophysical Experimental Setup for Active Observers

Most past and present research in computer vision involves passively obs...
