Can Gaze Beat Touch? A Fitts' Law Evaluation of Gaze, Touch, and Mouse Inputs

08/02/2022
by Vijay Rajanna, et al.

Gaze input has been a promising substitute for mouse input for point-and-select interactions. Individuals with severe motor and speech disabilities primarily rely on gaze input for communication, and gaze also serves as a hands-free input modality in scenarios of situationally-induced impairments and disabilities (SIIDs). Hence, the performance of gaze input has often been compared to mouse input through standardized performance evaluation procedures such as the Fitts' law task. With the proliferation of touch-enabled devices such as smartphones, tablet PCs, and other computing devices with a touch surface, it is also important to compare the performance of gaze input to touch input. In this study, we conducted an ISO 9241-9 Fitts' law evaluation to compare the performance of a multimodal gaze- and foot-based input to touch input in a standard desktop environment, using mouse input as the baseline. From a study involving 12 participants, we found that gaze input has the lowest throughput (2.55 bits/s) and the highest movement time (1.04 s) of the three inputs. In addition, although touch input involves the most physical movement, it achieved the highest throughput (6.67 bits/s), the lowest movement time (0.5 s), and was the most preferred input. While gaze and touch are similarly quick at moving the pointer from the source to the target location, target selection consumes the most time with gaze input. Hence, with a throughput that is over 160% higher than that of gaze, touch is the superior input modality.
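As a rough illustration of how the reported numbers relate, below is a minimal Python sketch of the Shannon formulation of Fitts' law commonly used in ISO 9241-9 evaluations (ID = log2(D/W + 1), throughput TP = ID/MT). The target distance and width here are hypothetical values, not taken from the paper, and a full ISO 9241-9 analysis would compute an effective width from the observed endpoint scatter; the last line simply checks the "over 160%" throughput gap against the reported means.

```python
import math

def index_of_difficulty(distance: float, width: float) -> float:
    """Shannon formulation of Fitts' index of difficulty, in bits:
    ID = log2(D / W + 1)."""
    return math.log2(distance / width + 1)

def throughput(distance: float, width: float, movement_time: float) -> float:
    """Throughput in bits/s: TP = ID / MT.
    Note: ISO 9241-9 uses an *effective* ID derived from endpoint
    scatter (We = 4.133 * SDx); nominal D and W are used here only
    to keep the sketch self-contained."""
    return index_of_difficulty(distance, width) / movement_time

# Hypothetical task geometry (not from the paper): a 256 px target
# distance and a 64 px target width give ID = log2(5), about 2.32 bits.
D, W = 256.0, 64.0

# Mean movement times reported in the abstract.
for name, mt in [("gaze", 1.04), ("touch", 0.50)]:
    print(f"{name}: ID = {index_of_difficulty(D, W):.2f} bits, "
          f"TP = {throughput(D, W, mt):.2f} bits/s at MT = {mt:.2f} s")

# Sanity check on the abstract's claim, using the reported mean
# throughputs (6.67 vs. 2.55 bits/s): touch is ~162% higher than gaze.
print(f"touch vs. gaze: {(6.67 / 2.55 - 1) * 100:.0f}% higher throughput")
```

Because throughput is the ratio of task difficulty to movement time, the same ID at half the movement time doubles throughput, which is why touch's 0.5 s mean time translates directly into its large throughput advantage over gaze's 1.04 s.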


