Grasp Pre-shape Selection by Synthetic Training: Eye-in-hand Shared Control on the Hannes Prosthesis

03/18/2022
by Federico Vasile, et al.

We consider the task of object grasping with a prosthetic hand capable of multiple grasp types. In this setting, communicating the intended grasp type often imposes a high cognitive load on the user, which can be reduced by adopting shared-autonomy frameworks. Among these, so-called eye-in-hand systems automatically control the hand aperture and pre-shaping before the grasp, based on visual input from a camera on the wrist. In this work, we present an eye-in-hand, learning-based approach for hand pre-shape classification from RGB sequences. To reduce the need for tedious data collection sessions when training the system, we devise a pipeline for rendering synthetic visual sequences of hand trajectories. We address the peculiarities of the eye-in-hand setting with a model of human arm trajectories and domain randomization over the relevant visual elements. We develop a sensorized setup to acquire real human grasping sequences for benchmarking and show that, on practical use cases, models trained on our synthetic dataset generalize better than models trained on real data. Finally, we integrate our model on the Hannes prosthetic hand and demonstrate its practical effectiveness. Our code and our real and synthetic datasets will be released upon acceptance.
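Domain randomization of the kind described in the abstract can be sketched as drawing a fresh set of scene parameters for each synthetic sequence before rendering it. The following minimal Python illustration is hypothetical: the parameter names and ranges are not taken from the paper, and serve only to show the sampling pattern.

```python
import random

# Hypothetical ranges for visual elements randomized per synthetic sequence.
SCENE_PARAM_RANGES = {
    "light_intensity": (0.4, 1.6),     # relative to a nominal light source
    "camera_jitter_deg": (-5.0, 5.0),  # wrist-camera orientation noise
    "table_texture_id": (0, 19),       # index into a pool of background textures
    "object_scale": (0.9, 1.1),        # uniform scaling of the target object
}

def sample_scene_params(rng=random):
    """Draw one randomized set of scene parameters for a rendered sequence."""
    params = {}
    for name, (lo, hi) in SCENE_PARAM_RANGES.items():
        if isinstance(lo, int):
            params[name] = rng.randint(lo, hi)   # discrete choice, e.g. texture index
        else:
            params[name] = rng.uniform(lo, hi)   # continuous range
    return params

if __name__ == "__main__":
    random.seed(0)
    # One new parameter set per rendered hand-trajectory sequence.
    for _ in range(3):
        print(sample_scene_params())
```

In a full pipeline, each sampled dictionary would be passed to the renderer so that no two synthetic sequences share the same lighting, background, or object appearance, which is what drives the sim-to-real generalization claimed above.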


Related research

- Learning Hand-Eye Coordination for Robotic Grasping with Deep Learning and Large-Scale Data Collection (03/07/2016)
  We describe a learning-based approach to hand-eye coordination for robot...
- Multi-Finger Haptics: Analysis of Human Hand Grasp towards a Tripod Three-Finger Haptic Grasp model (12/30/2022)
  Grasping is an incredible ability of animals using their arms and limbs ...
- Primitive Shape Recognition for Object Grasping (01/04/2022)
  Shape informs how an object should be grasped, both in terms of where an...
- Associating Grasp Configurations with Hierarchical Features in Convolutional Neural Networks (09/13/2016)
  In this work, we provide a solution for posturing the anthropomorphic Ro...
- HANDS: A Multimodal Dataset for Modeling Towards Human Grasp Intent Inference in Prosthetic Hands (03/08/2021)
  Upper limb and hand functionality is critical to many activities of dail...
- i-MYO: A Hybrid Prosthetic Hand Control System based on Eye-tracking, Augmented Reality and Myoelectric signal (05/18/2022)
  Dexterous prosthetic hands have better grasp performance than traditiona...
- From Hand-Perspective Visual Information to Grasp Type Probabilities: Deep Learning via Ranking Labels (03/08/2021)
  Limb deficiency severely affects the daily lives of amputees and drives ...
