Exploring the Effect of Visual Cues on Eye Gaze During AR-Guided Picking and Assembly Tasks

08/10/2021
by Arne Seeliger, et al.

In this paper, we present an analysis of eye gaze patterns pertaining to visual cues in augmented reality (AR) on head-mounted displays (HMDs). We conducted an experimental study involving a picking and assembly task guided by different visual cues. We compared these cues along three dimensions (in-view vs. out-of-view, static vs. dynamic, sequential vs. simultaneous) and analyzed quantitative metrics such as gaze distribution, gaze duration, and gaze path distance. Our results indicate that visual cues in AR significantly affect eye gaze patterns, and that the effect varies with the type of visual cue. We discuss these empirical findings with respect to visual attention theory.
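To make these metrics concrete, the minimal Python sketch below shows one common way such eye-tracking measures can be computed from raw gaze samples. It is illustrative only and is not the authors' published analysis code; the assumed inputs (per-sample timestamps, 2D gaze positions, and an area-of-interest mask marking samples that fall on a visual cue) and the function name gaze_metrics are assumptions for illustration.

import numpy as np

# Illustrative sketch, not the authors' analysis pipeline.
# Assumed input: timestamps in seconds, (N, 2) gaze positions, and a boolean
# area-of-interest (AOI) mask that is True where the gaze falls on a visual cue.
def gaze_metrics(timestamps, gaze_xy, on_cue):
    timestamps = np.asarray(timestamps, dtype=float)
    gaze_xy = np.asarray(gaze_xy, dtype=float)
    on_cue = np.asarray(on_cue, dtype=bool)

    # Gaze distribution: share of samples landing on the cue's AOI.
    distribution = on_cue.mean()

    # Gaze duration: total time on the cue, approximated by summing the
    # inter-sample intervals of samples inside the AOI.
    dt = np.diff(timestamps, prepend=timestamps[0])
    duration_s = dt[on_cue].sum()

    # Gaze path distance: cumulative Euclidean distance travelled by the gaze.
    path_distance = np.linalg.norm(np.diff(gaze_xy, axis=0), axis=1).sum()

    return {"distribution": distribution,
            "duration_s": duration_s,
            "path_distance": path_distance}

As a usage example, passing a trial's gaze samples together with an AOI mask for one cue type would yield the share of samples on that cue, the time spent on it, and the total gaze travel over the trial; in practice, fixation filtering and per-cue AOIs would typically precede such a computation.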


Related research

02/03/2023 · See or Hear? Exploring the Effect of Visual and Audio Hints and Gaze-assisted Task Feedback for Visual Search Tasks in Augmented Reality
Augmented reality (AR) is emerging in visual search tasks for increasing...

10/08/2021 · Effect of Visual Cues on Pointing Tasks in Co-located Augmented Reality Collaboration
Visual cues are essential in computer-mediated communication. It is espe...

05/08/2023 · ARDIE: AR, Dialogue, and Eye Gaze Policies for Human-Robot Collaboration
Human-robot collaboration (HRC) has become increasingly relevant in indu...

02/08/2023 · Exploring Affordances for AR in Laparoscopy
This paper explores the possibilities of designing AR interfaces to be u...

08/01/2019 · Visual cues in estimation of part-to-whole comparison
Pie charts were first published in 1801 by William Playfair and have cau...

04/03/2023 · Dynamic Accommodation Measurement using Purkinje Images and ML Algorithms
We developed a prototype device for dynamic gaze and accommodation measu...
