Grasp-type Recognition Leveraging Object Affordance

08/26/2020
by Naoki Wake, et al.

A key challenge in robot teaching is grasp-type recognition with a single RGB image and a target object name. Here, we propose a simple yet effective pipeline to enhance learning-based recognition by leveraging a prior distribution of grasp types for each object. In the pipeline, a convolutional neural network (CNN) recognizes the grasp type from an RGB image. The recognition result is further corrected using the prior distribution (i.e., affordance), which is associated with the target object name. Experimental results showed that the proposed method outperforms both a CNN-only and an affordance-only method. The results highlight the effectiveness of linguistically-driven object affordance for enhancing grasp-type recognition in robot teaching.
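The correction step described in the abstract can be sketched as a simple probabilistic fusion: the CNN's softmax scores over grasp types are weighted by the object's prior distribution (its affordance) and renormalized. The grasp-type taxonomy, the affordance table, and the multiplicative fusion rule below are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

# Hypothetical grasp-type taxonomy (assumption; the paper defines its own set).
GRASP_TYPES = ["power", "precision", "lateral", "hook"]

# Hypothetical affordance table: a prior distribution over grasp types for
# each object name (illustrative values only).
AFFORDANCE_PRIOR = {
    "mug": np.array([0.50, 0.10, 0.10, 0.30]),
    "pen": np.array([0.05, 0.80, 0.10, 0.05]),
}

def correct_with_affordance(cnn_probs, object_name):
    """Fuse CNN softmax scores with the object's grasp-type prior by
    element-wise multiplication, then renormalize to a distribution."""
    prior = AFFORDANCE_PRIOR[object_name]
    fused = cnn_probs * prior
    return fused / fused.sum()

# Example: the CNN alone slightly favors "precision" for a mug,
# but the affordance prior corrects the decision toward "power".
cnn_out = np.array([0.35, 0.40, 0.15, 0.10])
fused = correct_with_affordance(cnn_out, "mug")
print(GRASP_TYPES[int(np.argmax(fused))])  # prints "power"
```

In this sketch the prior acts as a soft constraint: grasp types that are implausible for the named object are down-weighted rather than hard-excluded, so a confident CNN prediction can still override a weak prior.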

Related research

- Object affordance as a guide for grasp-type recognition (02/27/2021)
  Recognizing human grasping strategies is an important factor in robot te...

- DcnnGrasp: Towards Accurate Grasp Pattern Recognition with Adaptive Regularizer Learning (05/11/2022)
  The task of grasp pattern recognition aims to derive the applicable gras...

- Recognition of Grasp Points for Clothes Manipulation under Unconstrained Conditions (06/20/2017)
  In this work a system for recognizing grasp points in RGB-D images is pr...

- RGB Matters: Learning 7-DoF Grasp Poses on Monocular RGBD Images (03/03/2021)
  General object grasping is an important yet unsolved problem in the fiel...

- SilhoNet: An RGB Method for 3D Object Pose Estimation and Grasp Planning (09/18/2018)
  Autonomous robot manipulation often involves both estimating the pose of...

- Data-efficient learning of object-centric grasp preferences (03/01/2022)
  Grasping made impressive progress during the last few years thanks to de...

- Grasp Type Estimation for Myoelectric Prostheses using Point Cloud Feature Learning (08/07/2019)
  Prosthetic hands can help people with limb difference to return to their...
