CaTGrasp: Learning Category-Level Task-Relevant Grasping in Clutter from Simulation

09/19/2021
by   Bowen Wen, et al.
Task-relevant grasping is critical for industrial assembly, where downstream manipulation tasks constrain the set of valid grasps. Learning to perform this task, however, is challenging, since task-relevant grasp labels are hard to define and annotate. There is also no consensus yet on proper representations for modeling task-relevant grasps, nor off-the-shelf tools for performing them. This work proposes a framework to learn task-relevant grasping for industrial objects without the need for time-consuming real-world data collection or manual annotation. To achieve this, the entire framework is trained solely in simulation, including supervised training with synthetic label generation and self-supervised hand-object interaction. In the context of this framework, this paper proposes a novel, object-centric canonical representation at the category level, which allows establishing dense correspondence across object instances and transferring task-relevant grasps to novel instances. Extensive experiments on task-relevant grasping of densely-cluttered industrial objects are conducted in both simulation and real-world setups, demonstrating the effectiveness of the proposed framework. Code and data will be released upon acceptance at https://sites.google.com/view/catgrasp.
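The grasp-transfer idea in the abstract can be illustrated with a minimal sketch: once dense correspondence between the canonical space and a new instance yields a similarity transform (rotation, uniform scale, translation), a grasp stored in the canonical frame can be mapped onto the instance. The function name and the similarity-transform parameterization below are illustrative assumptions, not the paper's actual API.

```python
import numpy as np

def transfer_grasp(R_grasp, p_grasp, R, s, t):
    """Transfer a grasp (orientation R_grasp, position p_grasp) defined in a
    category-level canonical frame onto a concrete object instance, given a
    similarity transform (rotation R, scale s, translation t) recovered from
    dense canonical-to-instance correspondences."""
    p_new = s * (R @ p_grasp) + t   # scale affects the grasp position only
    R_new = R @ R_grasp             # orientation is rotated, never scaled
    return R_new, p_new

# Hypothetical example: the instance is 1.5x larger than the canonical model,
# rotated 90 degrees about z, and shifted in the workspace.
s = 1.5
R = np.array([[0., -1., 0.],
              [1.,  0., 0.],
              [0.,  0., 1.]])
t = np.array([0.1, 0.0, 0.2])

R_g = np.eye(3)                    # canonical grasp orientation
p_g = np.array([0.0, 0.1, 0.0])    # canonical grasp position

R_new, p_new = transfer_grasp(R_g, p_g, R, s, t)
# p_new = 1.5 * R @ [0, 0.1, 0] + t = [-0.05, 0.0, 0.2]
```

Keeping the rotation unscaled is the key design point: gripper orientation must remain a valid rotation, while only the approach point moves with the instance's size and pose.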


