Grasping Field: Learning Implicit Representations for Human Grasps

08/10/2020
by   Korrawe Karunratanakul, et al.

In recent years, substantial progress has been made on robotic grasping of household objects, yet realistic human grasps remain difficult to synthesize. There are several key reasons: (1) the human hand has many degrees of freedom, more than typical robotic manipulators; (2) the synthesized hand should conform naturally to the object surface; and (3) it must interact with the object in a semantically and physically plausible manner. To make progress in this direction, we draw inspiration from recent work on learning-based implicit representations for 3D object reconstruction. Specifically, we propose an expressive representation for human grasp modelling that is efficient and easy to integrate with deep neural networks. Our key insight is that every point in three-dimensional space can be characterized by its signed distances to the surfaces of the hand and the object, respectively. Consequently, the hand, the object, and the contact area can all be represented by implicit surfaces in a common space, in which the proximity between the hand and the object can be modelled explicitly. We call this 3D-to-2D mapping the Grasping Field, parameterize it with a deep neural network, and learn it from data. We demonstrate that the proposed Grasping Field is an effective and expressive representation for human grasp generation. Specifically, our generative model is able to synthesize high-quality human grasps given only a 3D object point cloud. Extensive experiments demonstrate that our generative model compares favorably with a strong baseline. Furthermore, based on the Grasping Field representation, we propose a deep network for the challenging task of 3D hand and object reconstruction from a single RGB image, improving the physical plausibility of the reconstruction over baselines.
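To make the core idea concrete, here is a minimal toy sketch (not the paper's learned network) of a grasping field: each 3D query point maps to two signed distances, one to the hand surface and one to the object surface. For illustration both surfaces are spheres with made-up centers and radii; the contact region is approximated as the set of points where both distances are near zero.

```python
import numpy as np

def grasping_field(points, hand_center, hand_radius, obj_center, obj_radius):
    """Map an (N, 3) array of points to an (N, 2) array of signed distances.

    Column 0: signed distance to the (toy, spherical) hand surface.
    Column 1: signed distance to the (toy, spherical) object surface.
    Negative values are inside the surface, positive outside.
    """
    d_hand = np.linalg.norm(points - hand_center, axis=1) - hand_radius
    d_obj = np.linalg.norm(points - obj_center, axis=1) - obj_radius
    return np.stack([d_hand, d_obj], axis=1)

# Query three points against a "hand" sphere at the origin and an
# "object" sphere at (1, 0, 0), both with radius 0.6.
pts = np.array([[0.0, 0.0, 0.0],
                [1.0, 0.0, 0.0],
                [0.5, 0.0, 0.0]])
gf = grasping_field(pts,
                    hand_center=np.array([0.0, 0.0, 0.0]), hand_radius=0.6,
                    obj_center=np.array([1.0, 0.0, 0.0]), obj_radius=0.6)

# Points near the zero level set of BOTH distances lie in the contact region.
eps = 0.15
contact = np.all(np.abs(gf) < eps, axis=1)  # only the midpoint qualifies
```

In the paper the analytic spheres are replaced by a neural network that predicts the two signed distances jointly, so that hand, object, and contact area live in one common implicit representation.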


Related research

12/19/2021 | SAGA: Stochastic Whole-Body Grasping with Contact
Human grasping synthesis has numerous applications including AR/VR, vide...

07/06/2022 | NeuralGrasps: Learning Implicit Representations for Grasps of Multiple Robotic Hands
We introduce a neural implicit representation for grasps of objects from...

07/17/2019 | Towards Markerless Grasp Capture
Humans excel at grasping objects and manipulating them. Capturing human ...

11/17/2020 | Knowledge-Augmented Dexterous Grasping with Incomplete Sensing
Humans can determine a proper strategy to grasp an object according to t...

09/23/2021 | A Skeleton-Driven Neural Occupancy Representation for Articulated Hands
We present Hand ArticuLated Occupancy (HALO), a novel representation of ...

07/01/2022 | Learning Diverse and Physically Feasible Dexterous Grasps with Generative Model and Bilevel Optimization
To fully utilize the versatility of a multi-finger dexterous robotic han...
