PEGG-Net: Background Agnostic Pixel-Wise Efficient Grasp Generation Under Closed-Loop Conditions

03/30/2022
by   Zhiyang Liu, et al.

Performing closed-loop grasping at close proximity to an object requires a large field of view. However, such images inevitably contain large amounts of unnecessary background information, especially when the camera is far away from the target object at the initial stage, resulting in performance degradation of the grasping network. To address this problem, we design PEGG-Net, a real-time, pixel-wise robotic grasp generation network. The proposed lightweight network inherently learns to suppress background noise that would otherwise reduce grasping accuracy. PEGG-Net achieves improved state-of-the-art performance on both the Cornell dataset (98.9%) and the Jacquard dataset (93.8%), and supports closed-loop grasping at up to 50 Hz with an image size of 480x480 in dynamic environments. The trained model also generalizes to previously unseen objects with complex geometrical shapes, household objects, and workshop tools, achieving an overall grasp success rate of 91.2% in real-world grasping experiments.
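The abstract describes PEGG-Net as a pixel-wise grasp generation network: the network predicts grasp parameters at every pixel of the input image, and the best-scoring pixel drives the closed-loop controller. As an illustration only, the sketch below shows one common way that GG-CNN-style pixel-wise outputs (quality, angle, and width maps) can be decoded into a single grasp pose; the function name, map layout, and the random placeholder maps are assumptions for this example, not PEGG-Net's published interface.

import numpy as np

def decode_pixelwise_grasp(quality, angle, width):
    """Pick the best grasp from pixel-wise output maps.

    quality, angle, width: HxW arrays predicted by a pixel-wise grasp
    network (grasp quality score, gripper rotation, gripper opening width).
    Returns (row, col, theta, w) at the highest-quality pixel.
    """
    row, col = np.unravel_index(np.argmax(quality), quality.shape)
    return int(row), int(col), float(angle[row, col]), float(width[row, col])

# Example with placeholder 480x480 maps standing in for network outputs.
h, w = 480, 480
quality = np.random.rand(h, w)
angle = np.random.uniform(-np.pi / 2, np.pi / 2, (h, w))
width = np.random.uniform(0.0, 150.0, (h, w))
print(decode_pixelwise_grasp(quality, angle, width))

In a closed-loop setting, a decode step like this would run on every new frame so the grasp pose is continually re-estimated as the camera approaches the object.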


Related research

04/14/2018 - Closing the Loop for Robotic Grasping: A Real-time, Generative Grasp Synthesis Approach
This paper presents a real-time, object-independent grasp synthesis meth...

03/19/2021 - MVGrasp: Real-Time Multi-View 3D Object Grasping in Highly Cluttered Environments
Nowadays service robots are entering more and more in our daily life. In...

04/21/2020 - Industrial Robot Grasping with Deep Learning using a Programmable Logic Controller (PLC)
Universal grasping of a diverse range of previously unseen objects from ...

01/15/2020 - DGCM-Net: Dense Geometrical Correspondence Matching Network for Incremental Experience-based Robotic Grasping
This article presents a method for grasping novel objects by learning fr...

05/26/2022 - Grasping as Inference: Reactive Grasping in Heavily Cluttered Environment
Although, in the task of grasping via a data-driven method, closed-loop ...

06/30/2022 - EfficientGrasp: A Unified Data-Efficient Learning to Grasp Method for Multi-fingered Robot Hands
Autonomous grasping of novel objects that are previously unseen to a rob...

08/26/2020 - Self-Supervised Goal-Conditioned Pick and Place
Robots have the capability to collect large amounts of data autonomously...
