From Hand-Perspective Visual Information to Grasp Type Probabilities: Deep Learning via Ranking Labels

03/08/2021
by Mo Han, et al.

Limb deficiency severely affects the daily lives of amputees and drives efforts to provide functional robotic prosthetic hands that compensate for this loss. Convolutional neural network-based computer vision control of prosthetic hands has received increased attention as a reliable way to replace or complement physiological signals, since visual information can be trained to predict the intended hand gesture. Mounting a camera in the palm of a prosthetic hand has proved a promising approach to collecting visual data. However, grasp types labelled from the eye perspective and from the hand perspective may differ, as object shapes are not always symmetric. To represent this difference realistically, we employ a dataset containing synchronized eye-view and hand-view images, where the hand-perspective images are used for training and the eye-view images serve only for manual labelling. Electromyogram (EMG) activity and movement kinematics from the upper arm are also collected for multi-modal information fusion in future work. Moreover, to enable human-in-the-loop control and combine computer vision with physiological signal inputs, we build a novel probabilistic classifier based on the Plackett-Luce model instead of making absolute positive or negative predictions. To predict a probability distribution over grasps, we exploit this statistical model over label rankings, solving the resulting permutation-domain problem via maximum likelihood estimation and using manually ranked lists of grasps as a new form of label. We show that the proposed model is applicable to the most popular and productive convolutional neural network frameworks.
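To make the ranking-label idea concrete: under the Plackett-Luce model, the probability of a ranked list of K grasps factorizes into successive softmax choices over the grasps not yet placed, so maximum likelihood training reduces to a differentiable loss on the network's K logits. Below is a minimal PyTorch sketch of that negative log-likelihood; the framework choice, the function name plackett_luce_nll, and the tensor shapes are our own illustrative assumptions, not details taken from the paper.

```python
import torch

def plackett_luce_nll(scores: torch.Tensor, ranking: torch.Tensor) -> torch.Tensor:
    """Negative log-likelihood of one ranked grasp list under Plackett-Luce.

    scores:  (K,) unnormalized grasp-type scores, e.g. CNN logits.
    ranking: (K,) long tensor, a permutation of 0..K-1 ordering grasp
             types from most to least suitable (the manual ranking label).
    """
    s = scores[ranking]                                  # logits in ranked order
    # Stage-wise normalizer of the PL model: log sum_{j >= i} exp(s_j)
    log_norm = torch.logcumsumexp(s.flip(0), dim=0).flip(0)
    return (log_norm - s).sum()                          # -log P(ranking | scores)


# Hypothetical usage: 6 grasp types, one manually ranked label, CNN logits.
logits = torch.randn(6, requires_grad=True)
rank = torch.tensor([2, 0, 5, 1, 4, 3])                  # grasp 2 best, grasp 3 worst
loss = plackett_luce_nll(logits, rank)
loss.backward()                                          # trainable end to end

# Probability that each grasp is ranked first, i.e. the grasp-type distribution:
probs = torch.softmax(logits, dim=0)
```

A convenient property of this formulation is that, at inference time, a plain softmax over the K logits gives the probability of each grasp type being ranked first, which is exactly the soft distribution a human-in-the-loop controller can fuse with EMG or other physiological inputs.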


research ∙ 03/08/2021
HANDS: A Multimodal Dataset for Modeling Towards Human Grasp Intent Inference in Prosthetic Hands
Upper limb and hand functionality is critical to many activities of dail...

research ∙ 01/13/2021
Towards Creating a Deployable Grasp Type Probability Estimator for a Prosthetic Hand
For lower arm amputees, prosthetic hands promise to restore most of phys...

research ∙ 04/08/2021
Multimodal Fusion of EMG and Vision for Human Grasp Intent Inference in Prosthetic Hand Control
For lower arm amputees, robotic prosthetic hands offer the promise to re...

research ∙ 03/07/2016
Learning Hand-Eye Coordination for Robotic Grasping with Deep Learning and Large-Scale Data Collection
We describe a learning-based approach to hand-eye coordination for robot...

research ∙ 03/18/2022
Grasp Pre-shape Selection by Synthetic Training: Eye-in-hand Shared Control on the Hannes Prosthesis
We consider the task of object grasping with a prosthetic hand capable o...

research ∙ 10/28/2019
Virtual Piano using Computer Vision
In this research, Piano performances have been analyzed only based on vi...
