Design and Implementation of a Human-Robot Joint Action Framework using Augmented Reality and Eye Gaze

08/25/2022
by Wesley P. Chan, et al.

When humans work together to complete a joint task, each person builds an internal model of the situation and how it will evolve. Efficient collaboration depends on how these individual models overlap to form a shared mental model among team members, which is equally important for collaborative processes in human-robot teams. Developing and maintaining an accurate shared mental model requires bidirectional communication of individual intent and the ability to interpret the intent of other team members. To enable effective human-robot collaboration, this paper presents the design and implementation of a novel joint action framework for human-robot team collaboration that uses augmented reality (AR) technology and user eye gaze to enable bidirectional communication of intent. We tested the framework in a user study with 37 participants and found that our system improves task efficiency, trust, and task fluency. Using AR and eye gaze to enable bidirectional communication is therefore a promising means of improving the core components that influence collaboration between humans and robots.
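The abstract describes a bidirectional loop: the human's intent is inferred from eye gaze, and the robot's intent is externalized through an AR display. A minimal sketch of such a loop is shown below; all names here (GazeIntentEstimator, ARIntentDisplay, dwell_threshold, joint_action_step) are illustrative assumptions, not the authors' actual API or algorithm.

```python
from collections import Counter

class GazeIntentEstimator:
    """Infers which object the human intends to act on from gaze fixation samples."""
    def __init__(self, dwell_threshold=3):
        # Hypothetical parameter: minimum fixation count before committing to an intent.
        self.dwell_threshold = dwell_threshold
        self.fixations = Counter()

    def observe(self, target_id):
        """Record one gaze fixation on a tracked object."""
        self.fixations[target_id] += 1

    def inferred_intent(self):
        """Return the most-fixated object if it passes the dwell threshold, else None."""
        if not self.fixations:
            return None
        target, count = self.fixations.most_common(1)[0]
        return target if count >= self.dwell_threshold else None

class ARIntentDisplay:
    """Stand-in for the AR channel that externalizes the robot's intent to the human."""
    def __init__(self):
        self.shown = []

    def show_robot_intent(self, target_id):
        # In a real system this would render a highlight or marker in the headset.
        self.shown.append(target_id)

def joint_action_step(estimator, display, robot_plan):
    """One cycle: read human intent from gaze, pick a non-conflicting robot target,
    and broadcast the robot's intent via AR."""
    human_target = estimator.inferred_intent()
    robot_target = next((t for t in robot_plan if t != human_target), None)
    if robot_target is not None:
        display.show_robot_intent(robot_target)
    return human_target, robot_target
```

For example, if the human dwells on a cup while the robot's plan lists the cup first, the robot defers and announces the plate instead; this deconfliction is one simple way the shared mental model described above could reduce interference in a joint task.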


Related research

Negotiation-based Human-Robot Collaboration via Augmented Reality (09/24/2019)
Effective human-robot collaboration (HRC) requires extensive communicati...

ARDIE: AR, Dialogue, and Eye Gaze Policies for Human-Robot Collaboration (05/08/2023)
Human-robot collaboration (HRC) has become increasingly relevant in indu...

Coordination with Humans via Strategy Matching (10/27/2022)
Human and robot partners increasingly need to work together to perform t...

Investigating the Usability of Collaborative Robot control through Hands-Free Operation using Eye gaze and Augmented Reality (06/22/2023)
This paper proposes a novel operation for controlling a mobile robot usi...

Projection Mapping Implementation: Enabling Direct Externalization of Perception Results and Action Intent to Improve Robot Explainability (10/05/2020)
Existing research on non-verbal cues, e.g., eye gaze or arm movement, ma...

HARMONIC: A Multimodal Dataset of Assistive Human-Robot Collaboration (07/30/2018)
We present HARMONIC, a large multi-modal dataset of human interactions i...

Givers & Receivers perceive handover tasks differently: Implications for Human-Robot collaborative system design (08/10/2017)
Human-human joint-action in short-cycle repetitive handover tasks was in...
