FocusFlow: Leveraging Focal Depth for Gaze Interaction in Virtual Reality

08/10/2023
by Chenyang Zhang, et al.

Current gaze input methods for VR headsets predominantly use the gaze ray as a pointing cursor, often neglecting the depth information it carries. This study introduces FocusFlow, a novel gaze interaction technique that adds focal depth as a gaze input dimension, allowing users to actively shift their focus along the depth axis to interact. We develop a detection algorithm that identifies the user's focal depth and, building on it, propose a layer-based UI in which changes in focal depth trigger layer-switch operations, offering an intuitive hands-free selection method. We also design visual cues that guide users to adjust their focal depth accurately and become familiar with the interaction process. Preliminary evaluations demonstrate the system's usability, and several potential applications are discussed. Through FocusFlow, we aim to enrich the input dimensions of gaze interaction, enabling more intuitive and efficient human-computer interaction on headset devices.
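The abstract does not detail the focal-depth detection algorithm, so the following is only a minimal sketch of one plausible approach: estimating focal depth from binocular gaze vergence (the point of closest approach of the two eyes' gaze rays), smoothing the estimate, and switching UI layers with hysteresis. All names and parameters here (vergence_depth, LayerSwitcher, the layer boundaries) are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def vergence_depth(p_l, d_l, p_r, d_r):
    """Estimate focal depth as the distance from the eyes to the point
    of closest approach of the two gaze rays (origins p_l/p_r, unit
    directions d_l/d_r), all given as 3-vectors in head space."""
    w = p_l - p_r
    a = np.dot(d_l, d_l)
    b = np.dot(d_l, d_r)
    c = np.dot(d_r, d_r)
    d = np.dot(d_l, w)
    e = np.dot(d_r, w)
    denom = a * c - b * b
    if abs(denom) < 1e-9:            # rays nearly parallel: focus at "infinity"
        return float("inf")
    # Standard closest-approach parameters for two skew lines
    t_l = (b * e - c * d) / denom
    t_r = (a * e - b * d) / denom
    midpoint = 0.5 * ((p_l + t_l * d_l) + (p_r + t_r * d_r))
    eye_center = 0.5 * (p_l + p_r)
    return float(np.linalg.norm(midpoint - eye_center))

class LayerSwitcher:
    """Map a smoothed focal-depth estimate to a discrete UI layer,
    with hysteresis so noisy vergence estimates do not flicker."""

    def __init__(self, boundaries, hysteresis=0.05, alpha=0.2):
        self.boundaries = boundaries  # depths (m) separating adjacent layers, ascending
        self.hysteresis = hysteresis  # margin (m) required to cross a boundary
        self.alpha = alpha            # exponential smoothing factor
        self.depth = None
        self.layer = 0

    def update(self, raw_depth):
        # Exponential moving average of the raw depth estimate
        self.depth = raw_depth if self.depth is None else (
            self.alpha * raw_depth + (1 - self.alpha) * self.depth)
        # Move one layer only when depth clears the boundary plus margin
        if (self.layer < len(self.boundaries)
                and self.depth > self.boundaries[self.layer] + self.hysteresis):
            self.layer += 1
        elif (self.layer > 0
                and self.depth < self.boundaries[self.layer - 1] - self.hysteresis):
            self.layer -= 1
        return self.layer
```

The hysteresis margin is the key design choice in this sketch: vergence-based depth estimates are noisy, so a raw threshold would cause the active layer to flicker whenever the user's focus hovers near a boundary.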


Related research

04/18/2022
Interaction Design of Dwell Selection Toward Gaze-based AR/VR Interaction
In this paper, we first position the current dwell selection among gaze-...

11/09/2017
Fast camera focus estimation for gaze-based focus control
Many cameras implement auto-focus functionality. However, they typically...

09/07/2020
Back to the Future: Revisiting Mouse and Keyboard Interaction for HMD-based Immersive Analytics
With the rise of natural user interfaces, immersive analytics applicatio...

07/06/2022
Gaze-Vergence-Controlled See-Through Vision in Augmented Reality
Augmented Reality (AR) see-through vision is an interesting research top...

10/24/2022
Content Transfer Across Multiple Screens with Combined Eye-Gaze and Touch Interaction – A Replication Study
In this paper, we describe the results of replicating one of our studies...

04/19/2023
DynamicRead: Exploring Robust Gaze Interaction Methods for Reading on Handheld Mobile Devices under Dynamic Conditions
Enabling gaze interaction in real-time on handheld mobile devices has at...

08/09/2022
The Relative Importance of Depth Cues and Semantic Edges for Indoor Mobility Using Simulated Prosthetic Vision in Immersive Virtual Reality
Visual neuroprostheses (bionic eyes) have the potential to treat degener...
