6-DoF Robotic Grasping with Transformer

01/29/2023
by Zhenjie Zhao, et al.

Robotic grasping aims to detect graspable points and their corresponding gripper configurations in a particular scene, and is fundamental for robot manipulation. Existing work has demonstrated the potential of using a transformer model for robotic grasping, which can efficiently learn both global and local features. However, such methods are still limited to grasp detection on a 2D plane. In this paper, we extend a transformer model to 6-Degree-of-Freedom (6-DoF) robotic grasping, which makes it more flexible and suitable for safety-critical tasks. The key designs of our method are a serialization module, which turns a 3D voxelized space into a sequence of feature tokens that a transformer model can consume, and skip-connections, which merge multiscale features effectively. In particular, our method takes a Truncated Signed Distance Function (TSDF) as input. After serializing the TSDF, a transformer model encodes the sequence, producing a set of aggregated hidden feature vectors through multi-head attention. We then decode the hidden features into per-voxel feature vectors through deconvolution and skip-connections. The voxel feature vectors are then used to regress the parameters for executing grasping actions. On a recently proposed pile and packed grasping dataset, we show that our transformer-based method surpasses existing methods by about 5%. We also evaluate the running time and generalization ability to demonstrate the superiority of the proposed method.
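The serialization step described above is the 3D analogue of patch embedding in a Vision Transformer: the voxel grid is split into non-overlapping 3D patches, and each patch is flattened into one token for the transformer encoder. The sketch below illustrates this with numpy; the grid size (40^3) and patch size (8^3) are illustrative assumptions, not the paper's actual hyperparameters.

```python
import numpy as np

def serialize_tsdf(tsdf, patch=8):
    """Split a cubic TSDF volume into non-overlapping 3D patches and
    flatten each patch into one feature token. Token i then attends to
    all other tokens inside the transformer encoder."""
    d = tsdf.shape[0]
    assert tsdf.shape == (d, d, d) and d % patch == 0
    n = d // patch
    # (n, p, n, p, n, p) -> (n, n, n, p, p, p) -> (n^3, p^3)
    tokens = (tsdf.reshape(n, patch, n, patch, n, patch)
                  .transpose(0, 2, 4, 1, 3, 5)
                  .reshape(n ** 3, patch ** 3))
    return tokens

# A 40^3 voxel grid with 8^3 patches yields a sequence of 125 tokens,
# each of length 512, ready for a transformer encoder.
tsdf = np.random.rand(40, 40, 40).astype(np.float32)
tokens = serialize_tsdf(tsdf, patch=8)
print(tokens.shape)  # (125, 512)
```

To decode back to per-voxel features, the inverse reshape can restore the spatial layout before applying deconvolution and skip-connections.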

Related research

- When Transformer Meets Robotic Grasping: Exploits Context for Efficient Grasp Detection (02/24/2022). In this paper, we present a transformer-based architecture, namely TF-Gr...
- TransSC: Transformer-based Shape Completion for Grasp Evaluation (07/01/2021). Currently, robotic grasping methods based on sparse partial point clouds...
- EGAD! an Evolved Grasping Analysis Dataset for diversity and reproducibility in robotic manipulation (03/03/2020). We present the Evolved Grasping Analysis Dataset (EGAD), comprising over...
- What You See is What You Grasp: User-Friendly Grasping Guided by Near-eye-tracking (09/13/2022). This work presents a next-generation human-robot interface that can infe...
- Deep Learning Approaches to Grasp Synthesis: A Review (07/06/2022). Grasping is the process of picking an object by applying forces and torq...
- GarmNet: Improving Global with Local Perception for Robotic Laundry Folding (06/30/2019). Developing autonomous assistants to help with domestic tasks is a vital ...
