Learning Any-View 6DoF Robotic Grasping in Cluttered Scenes via Neural Surface Rendering

06/12/2023
by Snehal Jauhri, et al.

Robotic manipulation is critical for enabling robotic agents to operate in various application domains, such as intelligent assistance. A major challenge therein is the effective 6DoF grasping of objects in cluttered environments from any viewpoint without requiring additional scene exploration. We introduce NeuGraspNet, a novel method for 6DoF grasp detection that leverages recent advances in neural volumetric representations and surface rendering. Our approach learns both global (scene-level) and local (grasp-level) neural surface representations, enabling effective and fully implicit 6DoF grasp quality prediction, even in unseen parts of the scene. Further, we reinterpret grasping as a local neural surface rendering problem, allowing the model to encode the interaction between the robot's end-effector and the object's surface geometry. NeuGraspNet operates on single viewpoints and can sample grasp candidates in occluded scenes, outperforming existing implicit and semi-implicit baseline methods in the literature. We demonstrate the real-world applicability of NeuGraspNet with a mobile manipulator robot, grasping in open spaces with clutter by rendering the scene, reasoning about graspable areas of different objects, and selecting grasps likely to succeed without colliding with the environment. Visit our project website: https://sites.google.com/view/neugraspnet
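To make the core idea more concrete, below is a minimal, hypothetical PyTorch sketch of how grasp evaluation can be cast as querying a learned implicit scene representation at points expressed around a candidate 6DoF gripper pose. All names here (SceneEncoder, ImplicitSurface, GraspQualityHead, score_grasp) are illustrative placeholders and not the authors' implementation; NeuGraspNet's actual architecture, grasp sampling, and surface-rendering procedure differ.

```python
# Minimal sketch: score a 6DoF grasp by querying an implicit scene representation
# at points sampled in the gripper's closing region (a "local surface query" view
# of grasp evaluation). Hypothetical modules, not the authors' code.

import torch
import torch.nn as nn


class SceneEncoder(nn.Module):
    """Toy global scene encoder: maps a voxel/TSDF grid to a latent code."""
    def __init__(self, latent_dim: int = 128):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv3d(1, 16, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv3d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool3d(1), nn.Flatten(),
            nn.Linear(32, latent_dim),
        )

    def forward(self, voxels: torch.Tensor) -> torch.Tensor:
        return self.net(voxels)


class ImplicitSurface(nn.Module):
    """Toy implicit function: (scene latent, 3D point) -> occupancy logit + local feature."""
    def __init__(self, latent_dim: int = 128, feat_dim: int = 32):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(latent_dim + 3, 128), nn.ReLU(),
            nn.Linear(128, 1 + feat_dim),
        )

    def forward(self, latent: torch.Tensor, points: torch.Tensor):
        # latent: (B, D), points: (B, N, 3)
        lat = latent.unsqueeze(1).expand(-1, points.shape[1], -1)
        out = self.net(torch.cat([lat, points], dim=-1))
        return out[..., :1], out[..., 1:]  # occupancy logits, per-point features


class GraspQualityHead(nn.Module):
    """Pools per-point features queried around the gripper and predicts grasp success."""
    def __init__(self, feat_dim: int = 32):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(feat_dim, 64), nn.ReLU(), nn.Linear(64, 1))

    def forward(self, point_feats: torch.Tensor) -> torch.Tensor:
        return self.net(point_feats.mean(dim=1))  # (B, 1) quality logit


def score_grasp(voxels, grasp_pose, encoder, surface, head, n_points=256):
    """Score one 6DoF grasp candidate: sample points in the gripper frame,
    transform them into the scene frame, query the implicit surface there."""
    B = voxels.shape[0]
    # Points inside a box roughly approximating the gripper closing region (gripper frame).
    local_pts = (torch.rand(B, n_points, 3) - 0.5) * torch.tensor([0.08, 0.04, 0.05])
    R, t = grasp_pose[:, :3, :3], grasp_pose[:, :3, 3]              # (B,3,3), (B,3)
    scene_pts = torch.einsum('bij,bnj->bni', R, local_pts) + t.unsqueeze(1)
    latent = encoder(voxels)
    _, feats = surface(latent, scene_pts)                            # local surface features
    return torch.sigmoid(head(feats))                                # grasp success probability


if __name__ == "__main__":
    enc, surf, head = SceneEncoder(), ImplicitSurface(), GraspQualityHead()
    voxels = torch.rand(1, 1, 40, 40, 40)                            # dummy scene grid
    pose = torch.eye(4).unsqueeze(0)                                 # identity grasp pose
    print("grasp quality:", score_grasp(voxels, pose, enc, surf, head).item())
```

Because the grasp is evaluated purely by querying the learned implicit representation, the sketch illustrates how quality can in principle be predicted even for surface regions not directly observed from the input viewpoint.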


