Eyes on the Prize: Improved Perception for Robust Dynamic Grasping

04/29/2022
by Ben Burgess-Limerick, et al.

This paper is concerned with perception challenges for robust grasping in the presence of clutter and unpredictable relative motion between robot and object. Traditional perception systems developed for static grasping are unable to provide feedback during the final phase of a grasp due to sensor minimum range, occlusion, and a limited field of view. A multi-camera eye-in-hand perception system is presented that has advantages over commonly used camera configurations. We quantitatively evaluate the performance on a real robot with an image-based visual servoing grasp controller and show a significantly improved success rate on a dynamic grasping task. A fully reproducible open-source testing system is described to encourage benchmarking of dynamic grasping system performance.
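The paper's controller is an image-based visual servoing (IBVS) scheme. The abstract does not reproduce the control law, but the textbook IBVS formulation drives the camera so that tracked image features converge to their desired positions via v = -lambda * L^+ (s - s*), where L is the point-feature interaction matrix. The NumPy sketch below is a minimal illustration of that generic law, not the authors' implementation; the interaction matrix follows the standard form from Chaumette and Hutchinson's visual servo control tutorial, and the function names, gain, and example values are illustrative assumptions.

    import numpy as np

    def interaction_matrix(x, y, Z):
        # Image Jacobian for one point feature at normalised image
        # coordinates (x, y) with estimated depth Z (standard IBVS form).
        return np.array([
            [-1.0 / Z, 0.0,      x / Z, x * y,       -(1.0 + x * x),  y],
            [0.0,      -1.0 / Z, y / Z, 1.0 + y * y, -x * y,         -x],
        ])

    def ibvs_velocity(features, desired, depths, gain=0.5):
        # Classic IBVS law: v = -gain * pinv(L) @ (s - s*).
        # Returns a 6-vector camera twist [vx, vy, vz, wx, wy, wz].
        L = np.vstack([interaction_matrix(x, y, Z)
                       for (x, y), Z in zip(features, depths)])
        e = (np.asarray(features, float) - np.asarray(desired, float)).ravel()
        return -gain * np.linalg.pinv(L) @ e

    # Example: drive two tracked grasp-point features toward their goals.
    v = ibvs_velocity(features=[(0.10, -0.05), (0.12, 0.04)],
                      desired=[(0.0, 0.0), (0.02, 0.0)],
                      depths=[0.4, 0.4])

The pseudoinverse maps the stacked 2-D feature errors to a 6-DoF camera twist; on a real arm with a hand-mounted camera, that twist would then be converted to joint velocities through the robot Jacobian.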

Related research

09/23/2018
Multi-View Picking: Next-best-view Reaching for Improved Grasping in Clutter
Camera viewpoint selection is an important aspect of visual grasp detect...

12/16/2022
AnyGrasp: Robust and Efficient Grasp Perception in Spatial and Temporal Domains
As the basis for prehensile manipulation, it is vital to enable robots t...

02/11/2020
Reaching, Grasping and Re-grasping: Learning Multimode Grasping Skills
The ability to adapt to uncertainties, recover from failures, and coordi...

06/16/2021
GKNet: grasp keypoint network for grasp candidates detection
Contemporary grasp detection approaches employ deep learning to achieve ...

05/26/2022
Grasping as Inference: Reactive Grasping in Heavily Cluttered Environment
Although, in the task of grasping via a data-driven method, closed-loop ...

08/11/2023
Aggressive Aerial Grasping using a Soft Drone with Onboard Perception
Contrary to the stunning feats observed in birds of prey, aerial manipul...
