HANDS: A Multimodal Dataset for Modeling Towards Human Grasp Intent Inference in Prosthetic Hands

03/08/2021
by   Mo Han, et al.

Upper limb and hand functionality is critical to many activities of daily living, and its loss through amputation can significantly diminish an individual's independence. From this perspective, advanced prosthetic hands of the future are anticipated to benefit from improved shared control between the robotic hand and its human user, but more importantly from an improved capability to infer human intent from multimodal sensor data, giving the robotic hand perception of its operational context. Such multimodal sensor data may come from environment sensors, including vision, as well as from human physiology and behavior sensors, including electromyography (EMG) and inertial measurement units (IMUs). A fusion methodology for environmental state and human intent estimation can combine these sources of evidence to support prosthetic hand motion planning and control. In this paper, we present a dataset of this type, gathered in anticipation of cameras being built into prosthetic hands, where computer vision methods will need to assess this hand-view visual evidence in order to estimate human intent. Specifically, paired images from the human eye-view and the hand-view of various objects placed at different orientations were captured at the initial state of each grasping trial, followed by paired video, EMG, and IMU recordings from the human's arm during a grasp, lift, put-down, and retract trial structure. For each trial, based on eye-view images of the scene showing the hand and object on a table, multiple human annotators were asked to rank, in decreasing order of preference, five grasp types appropriate for the object in its given configuration relative to the hand. The potential utility of paired eye-view and hand-view images is illustrated by training a convolutional neural network to process hand-view images and predict the eye-view grasp-type labels assigned by humans.
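As a rough illustration of the kind of model the abstract describes, the sketch below maps a hand-view RGB image to scores over the five candidate grasp types and converts a human preference ranking into a soft training target. This is a minimal sketch assuming PyTorch; the names (HandViewGraspNet, rank_to_soft_label), the architecture, the image resolution, and the label encoding are illustrative assumptions, not the paper's actual implementation.

```python
# Hypothetical sketch: a small CNN predicting grasp-type preferences from a
# hand-view image, trained against soft labels derived from a human ranking.
import torch
import torch.nn as nn

NUM_GRASP_TYPES = 5  # five candidate grasp types ranked by the annotators

class HandViewGraspNet(nn.Module):
    def __init__(self, num_classes: int = NUM_GRASP_TYPES):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(32, 64, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
            nn.AdaptiveAvgPool2d((4, 4)),
        )
        self.classifier = nn.Linear(64 * 4 * 4, num_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: batch of hand-view images, shape (B, 3, H, W)
        feats = self.features(x).flatten(1)
        return self.classifier(feats)  # unnormalized grasp-type scores

def rank_to_soft_label(ranking, num_classes: int = NUM_GRASP_TYPES) -> torch.Tensor:
    """Turn a ranking (grasp indices, most to least preferred) into a
    probability vector that puts more mass on preferred grasp types."""
    weights = torch.zeros(num_classes)
    for place, grasp_idx in enumerate(ranking):
        weights[grasp_idx] = num_classes - place
    return weights / weights.sum()

# Example training step with stand-in data (not the dataset itself).
model = HandViewGraspNet()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
images = torch.randn(8, 3, 224, 224)  # placeholder hand-view batch
targets = torch.stack([rank_to_soft_label([2, 0, 4, 1, 3]) for _ in range(8)])
logits = model(images)
# Cross-entropy with probabilistic targets (supported in PyTorch >= 1.10).
loss = nn.functional.cross_entropy(logits, targets)
optimizer.zero_grad()
loss.backward()
optimizer.step()
```

A soft target is used here only to show one plausible way of exploiting the full preference ranking rather than just the top choice; a plain top-1 classification setup would work with the same network.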


Related research

04/08/2021
Multimodal Fusion of EMG and Vision for Human Grasp Intent Inference in Prosthetic Hand Control
For lower arm amputees, robotic prosthetic hands offer the promise to re...

03/08/2021
From Hand-Perspective Visual Information to Grasp Type Probabilities: Deep Learning via Ranking Labels
Limb deficiency severely affects the daily lives of amputees and drives ...

03/18/2022
Grasp Pre-shape Selection by Synthetic Training: Eye-in-hand Shared Control on the Hannes Prosthesis
We consider the task of object grasping with a prosthetic hand capable o...

01/13/2021
Towards Creating a Deployable Grasp Type Probability Estimator for a Prosthetic Hand
For lower arm amputees, prosthetic hands promise to restore most of phys...

05/18/2022
i-MYO: A Hybrid Prosthetic Hand Control System based on Eye-tracking, Augmented Reality and Myoelectric signal
Dexterous prosthetic hands have better grasp performance than traditiona...

01/16/2013
Deep Learning for Detecting Robotic Grasps
We consider the problem of detecting robotic grasps in an RGB-D view of ...
