HARMONIC: A Multimodal Dataset of Assistive Human-Robot Collaboration

07/30/2018
by Benjamin A. Newman et al.

We present HARMONIC, a large multimodal dataset of human interactions in a shared autonomy setting. The dataset provides human, robot, and environment data streams from twenty-four people engaged in an assistive eating task with a 6-degree-of-freedom (DOF) robot arm. From each participant, we recorded video of both eyes, egocentric video from a head-mounted camera, joystick commands, electromyography from the participant's forearm used to operate the joystick, third-person stereo video, and the joint positions of the 6-DOF robot arm. Also included are several data streams derived directly from these recordings, namely eye-gaze fixations in the egocentric camera frame and body-position skeletons. This dataset could be of interest to researchers studying intention prediction, human mental-state modeling, and shared autonomy. Data streams are provided in a variety of formats, such as video and human-readable CSV or YAML files.
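As a minimal sketch of working with the human-readable streams described above, the snippet below parses a joystick-command log in CSV form. The column names (`timestamp`, `axis_x`, `axis_y`) and the toy data are illustrative assumptions, not the dataset's actual schema; a real file from the dataset would be opened from disk instead of an in-memory string.

```python
# Hypothetical sketch: parsing a per-participant joystick-command CSV stream.
# Column names and values here are assumptions for illustration only,
# not the HARMONIC dataset's actual schema.
import csv
import io

# Toy stand-in for one participant's joystick log (time in seconds, two axes).
sample = io.StringIO(
    "timestamp,axis_x,axis_y\n"
    "0.00,0.12,-0.05\n"
    "0.05,0.15,-0.02\n"
)

# Parse each row into typed values keyed by short names.
rows = [
    {"t": float(r["timestamp"]), "x": float(r["axis_x"]), "y": float(r["axis_y"])}
    for r in csv.DictReader(sample)
]

print(len(rows), rows[0])
```

Because the streams were recorded by independent sensors, a typical first processing step would be aligning rows like these against the other streams (e.g. robot joint positions) by timestamp.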
