TransCG: A Large-Scale Real-World Dataset for Transparent Object Depth Completion and Grasping

02/17/2022
by   Hongjie Fang, et al.

Transparent objects are common in daily life and are frequently handled on automated production lines. Robust vision-based robotic grasping and manipulation of these objects would benefit automation. However, most current grasping algorithms fail in this setting because they rely heavily on depth images, while ordinary depth sensors usually cannot produce accurate depth for transparent objects owing to the reflection and refraction of light. In this work, we address this issue by contributing a large-scale real-world dataset for transparent object depth completion, containing 57,715 RGB-D images from 130 different scenes. Our dataset is the first large-scale real-world dataset of its kind and provides the most comprehensive annotations. Cross-domain experiments show that our dataset generalizes well. Moreover, we propose an end-to-end depth completion network that takes an RGB image and the corresponding inaccurate depth map as inputs and outputs a refined depth map. Experiments demonstrate that our method is more effective, efficient, and robust than previous works, and that it can process high-resolution images under limited hardware resources. Real robot experiments show that our method can also be applied robustly to grasping novel objects. The full dataset and our method are publicly available at www.graspnet.net/transcg.
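The abstract describes the proposed network only at the interface level: an RGB image and a raw, inaccurate depth map go in, and a refined depth map comes out. The sketch below illustrates that input/output interface with a generic PyTorch encoder-decoder; the class name `DepthCompletionNet`, the layer sizes, and the early-fusion strategy are assumptions made for illustration and do not reproduce the architecture proposed in the paper.

```python
# Minimal sketch of an RGB-D depth completion interface.
# DepthCompletionNet and its layers are hypothetical stand-ins, not the paper's model.
import torch
import torch.nn as nn


class DepthCompletionNet(nn.Module):
    def __init__(self, base_channels: int = 32):
        super().__init__()
        # Encode the concatenated RGB (3 channels) + raw depth (1 channel) input.
        self.encoder = nn.Sequential(
            nn.Conv2d(4, base_channels, kernel_size=3, stride=2, padding=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(base_channels, base_channels * 2, kernel_size=3, stride=2, padding=1),
            nn.ReLU(inplace=True),
        )
        # Decode back to a single-channel refined depth map at the input resolution.
        self.decoder = nn.Sequential(
            nn.ConvTranspose2d(base_channels * 2, base_channels, kernel_size=4, stride=2, padding=1),
            nn.ReLU(inplace=True),
            nn.ConvTranspose2d(base_channels, 1, kernel_size=4, stride=2, padding=1),
        )

    def forward(self, rgb: torch.Tensor, raw_depth: torch.Tensor) -> torch.Tensor:
        # rgb: (B, 3, H, W); raw_depth: (B, 1, H, W) with missing or wrong values
        # on transparent surfaces. Early fusion by channel concatenation.
        x = torch.cat([rgb, raw_depth], dim=1)
        features = self.encoder(x)
        refined_depth = self.decoder(features)
        return refined_depth


if __name__ == "__main__":
    net = DepthCompletionNet()
    rgb = torch.rand(1, 3, 240, 320)        # normalized RGB image
    raw_depth = torch.rand(1, 1, 240, 320)  # sensor depth, unreliable on glass/plastic
    refined = net(rgb, raw_depth)
    print(refined.shape)  # torch.Size([1, 1, 240, 320])
```

In a grasping pipeline, the refined depth map would replace the raw sensor depth before being back-projected into a point cloud and passed to a grasp detector; the exact training losses and network details are given in the full paper.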


