A Gaze-Assisted Multimodal Approach to Rich and Accessible Human-Computer Interaction

03/13/2018
by   Vijay Rajanna, et al.

Recent advancements in eye tracking technology are driving the adoption of gaze-assisted interaction as a rich and accessible human-computer interaction paradigm. Gaze-assisted interaction serves as a contextual, non-invasive, and explicit control method for users without disabilities; for users with motor or speech impairments, text entry by gaze serves as the primary means of communication. Despite significant advantages, gaze-assisted interaction is still not widely accepted because of its inherent limitations: 1) Midas touch, 2) low accuracy for mouse-like interactions, 3) need for repeated calibration, 4) visual fatigue with prolonged usage, and 5) lower gaze typing speed. This dissertation research proposes a gaze-assisted multimodal interaction paradigm, along with related frameworks and applications, that effectively enables gaze-assisted interactions while addressing many of the current limitations. In this regard, we present four systems that leverage gaze-assisted interaction: 1) a gaze- and foot-operated system for precise point-and-click interactions, 2) a dwell-free, foot-operated gaze typing system, 3) a gaze gesture-based authentication system, and 4) a gaze gesture-based interaction toolkit. In addition, we present the goals to be achieved, the technical approach, and the overall contributions of this dissertation research.
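The core idea behind the first two systems, using gaze only for pointing and a foot press as the explicit commit action, directly sidesteps the Midas touch problem, since looking at a target never triggers it by itself. A minimal sketch of that confirmation logic is shown below; the target names, bounding boxes, and `select_target` helper are illustrative assumptions, not part of the dissertation's actual implementation.

```python
from dataclasses import dataclass

@dataclass
class GazeSample:
    """A single gaze fixation point in screen coordinates."""
    x: float
    y: float

def select_target(gaze, foot_pressed, targets):
    """Return the target under gaze only when the foot switch confirms it.

    Gaze alone never triggers a click (avoiding the Midas touch
    problem); the explicit foot press commits the selection.
    """
    if not foot_pressed:
        return None
    for name, (x0, y0, x1, y1) in targets.items():
        if x0 <= gaze.x <= x1 and y0 <= gaze.y <= y1:
            return name
    return None

# Hypothetical on-screen targets: name -> bounding box (x0, y0, x1, y1)
targets = {
    "ok_button": (100, 100, 200, 140),
    "cancel_button": (220, 100, 320, 140),
}

# Looking at a button without pressing the foot switch does nothing.
print(select_target(GazeSample(150, 120), False, targets))  # None
# The same gaze point plus a foot press selects the button.
print(select_target(GazeSample(150, 120), True, targets))   # ok_button
```

Separating pointing (gaze) from commitment (foot) is what makes the interaction dwell-free: no timer thresholds are needed, which is also what the dissertation credits for faster gaze typing.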

