DynamicRead: Exploring Robust Gaze Interaction Methods for Reading on Handheld Mobile Devices under Dynamic Conditions

04/19/2023
by Yaxiong Lei, et al.

Enabling real-time gaze interaction on handheld mobile devices has attracted significant attention in recent years. An increasing number of research projects have focused on sophisticated appearance-based deep learning models to improve the precision of gaze estimation on smartphones. This raises important research questions, including how gaze can be used in a real-time application, and what types of gaze interaction methods are preferable under dynamic conditions, in terms of both user acceptance and reliable performance. To address these questions, we design four types of gaze scrolling techniques: three explicit techniques based on Gaze Gesture, Dwell time, and Pursuit, and one implicit technique based on reading speed, to support touch-free page scrolling in a reading application. We conduct a 20-participant user study under both sitting and walking settings. Our results reveal that the Gaze Gesture and Dwell time-based interfaces are more robust while walking, and that Gaze Gesture achieves consistently good usability scores without causing high cognitive workload.
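The dwell-time technique mentioned above can be illustrated with a minimal sketch: a scroll fires once the estimated gaze point has remained inside a trigger region (e.g. a strip at the bottom of the page) for longer than a dwell threshold. All class and parameter names here are hypothetical illustrations, not the authors' implementation; thresholds and region sizes are placeholder assumptions.

```python
class DwellScroller:
    """Hypothetical sketch of a dwell-time scroll trigger.

    Fires a scroll event when the gaze stays inside a bottom-of-screen
    trigger region longer than the dwell threshold. Values are
    illustrative assumptions, not the paper's parameters.
    """

    def __init__(self, dwell_threshold_s=0.8,
                 region_height_px=200, screen_height_px=2400):
        self.dwell_threshold_s = dwell_threshold_s
        # Gaze landing in the bottom strip triggers a scroll-down.
        self.region_top = screen_height_px - region_height_px
        self._enter_time = None  # when the gaze entered the region

    def update(self, gaze_y_px, now_s):
        """Feed one gaze sample; return True when a scroll should fire."""
        if gaze_y_px >= self.region_top:
            if self._enter_time is None:
                self._enter_time = now_s          # gaze just entered
            elif now_s - self._enter_time >= self.dwell_threshold_s:
                self._enter_time = None           # reset after firing
                return True
        else:
            self._enter_time = None               # gaze left the region
        return False
```

In a real reading app, `update` would be called on every frame of the gaze estimator's output; the dwell threshold trades off responsiveness against accidental triggers (the "Midas touch" problem), which is one reason robustness differs between sitting and walking.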

