EyeTAP: A Novel Technique using Voice Inputs to Address the Midas Touch Problem for Gaze-based Interactions

02/19/2020
by Mohsen Parisay, et al.

One of the main challenges of gaze-based interactions is the ability to distinguish normal eye function from a deliberate interaction with the computer system, commonly referred to as the 'Midas touch' problem. In this paper we propose EyeTAP (Eye tracking point-and-select by Targeted Acoustic Pulse), a hands-free interaction method for point-and-select tasks. We evaluated the prototype in two separate user studies, each containing two experiments with 33 participants, and found that EyeTAP is robust even in the presence of ambient noise in the audio input signal, with a tolerance of up to 70 dB; results in faster movement and task completion times; and imposes a lower cognitive workload than voice recognition. In addition, EyeTAP has a lower error rate than the dwell-time method in a ribbon-shaped experiment. These characteristics make it suitable for users whose physical movements are restricted or not possible due to a disability. Furthermore, EyeTAP imposes no specific requirements on user interface design and can therefore be easily integrated into existing systems with minimal modifications. EyeTAP can be regarded as an acceptable alternative for addressing the Midas touch problem.
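The core idea is that gaze does the pointing while a short, distinctive sound in the microphone signal (the acoustic pulse) acts as the click, so no dwell timer or physical button is needed. The sketch below illustrates one plausible way to detect such a pulse: it compares each audio frame's energy against a rolling estimate of the ambient noise floor, which is what lets a detector of this kind tolerate steady background noise. This is a minimal illustration, not the authors' implementation; the energy-threshold heuristic, the constants, and the get_gaze_position/click_at hooks are all assumptions made for the example.

```python
# Minimal sketch of gaze pointing + acoustic-pulse selection.
# NOT the EyeTAP implementation; detection heuristic and hooks are assumed.
import collections
import numpy as np
import sounddevice as sd

SAMPLE_RATE = 16_000   # Hz
FRAME_SIZE = 256       # samples per analysis frame (~16 ms)
NOISE_WINDOW = 64      # frames used to estimate the ambient noise floor
PULSE_FACTOR = 6.0     # a pulse must exceed the noise floor by this ratio

noise_floor = collections.deque(maxlen=NOISE_WINDOW)

def get_gaze_position():
    """Hypothetical hook into the eye tracker; returns (x, y) in pixels."""
    return (0, 0)

def click_at(x, y):
    """Hypothetical hook into the OS input layer to select at (x, y)."""
    print(f"select at ({x}, {y})")

def on_audio(indata, frames, time, status):
    """Treat a frame as a deliberate pulse if its energy spikes well above
    the rolling ambient level; steady background noise only raises the
    floor estimate instead of triggering selections."""
    rms = float(np.sqrt(np.mean(indata[:, 0] ** 2)))
    ambient = np.median(noise_floor) if noise_floor else rms
    if noise_floor and rms > PULSE_FACTOR * max(ambient, 1e-6):
        click_at(*get_gaze_position())   # select whatever gaze points at
    else:
        noise_floor.append(rms)          # quiet frames update the floor

with sd.InputStream(channels=1, samplerate=SAMPLE_RATE,
                    blocksize=FRAME_SIZE, callback=on_audio):
    sd.sleep(10_000)                     # listen for 10 seconds
```

Because selection is driven by a transient energy spike rather than by recognizing speech, a detector along these lines avoids the processing latency of full voice recognition, which is consistent with the faster task completion times reported above.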


Related Research

09/10/2020 · Non-contact Real time Eye Gaze Mapping System Based on Deep Convolutional Neural Network
Human-Computer Interaction (HCI) is a field that studies interactions bet...

07/09/2019 · Image based Eye Gaze Tracking and its Applications
Eye movements play a vital role in perceiving the world. Eye gaze can gi...

03/13/2018 · A Gaze-Assisted Multimodal Approach to Rich and Accessible Human-Computer Interaction
Recent advancements in eye tracking technology are driving the adoption ...

08/02/2022 · Can Gaze Beat Touch? A Fitts' Law Evaluation of Gaze, Touch, and Mouse Inputs
Gaze input has been a promising substitute for mouse input for point and...

03/14/2017 · Tracking Gaze and Visual Focus of Attention of People Involved in Social Interaction
The visual focus of attention (VFOA) has been recognized as a prominent ...

03/28/2023 · TicTacToes: Assessing Toe Movements as an Input Modality
From carrying grocery bags to holding onto handles on the bus, there are...

01/24/2023 · WhisperWand: Simultaneous Voice and Gesture Tracking Interface
This paper presents the design and implementation of WhisperWand, a comp...
