Event-based Robotic Grasping Detection with Neuromorphic Vision Sensor and Event-Stream Dataset

04/28/2020
by Bin Li, et al.

Robotic grasping plays an important role in the field of robotics. Current state-of-the-art robotic grasping detection systems are usually built on conventional vision sensors, such as RGB-D cameras. Compared to traditional frame-based computer vision, neuromorphic vision is a small and young research community, and event-based datasets remain limited because annotating the asynchronous event stream is troublesome: annotating a large-scale vision dataset takes considerable computational resources, especially for video-level annotation. In this work, we consider the problem of detecting robotic grasps in a moving camera view of a scene containing objects. To obtain more agile robotic perception, a neuromorphic vision sensor (DAVIS) attached to the robot gripper is introduced to explore its potential use in grasping detection. We construct a robotic grasping dataset, named the Event-Stream Dataset, with 91 objects. For each object, there are 4,020 successive grasping annotations in different views with a time resolution of 1 ms. A spatio-temporal mixed particle filter (SMP Filter) is proposed to track the LED-based grasp rectangles, which enables video-level annotation of a single grasp rectangle per object. Because the LEDs blink at high frequency, the Event-Stream Dataset is annotated at a high frequency of 1 kHz. Based on the Event-Stream Dataset, we develop a deep neural network for grasping detection that treats the angle learning problem as classification instead of regression. The method achieves high detection accuracy on our Event-Stream Dataset, with 93% precision at the object-wise level. This work provides a large-scale, well-annotated dataset and promotes neuromorphic vision applications in agile robotics.
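The abstract does not spell out how the asynchronous DAVIS events are fed to the detection network, so the following is only a minimal sketch of one common encoding: accumulating events (x, y, timestamp, polarity) over a short time window into a two-channel ON/OFF histogram image that a convolutional network can consume. The 346x260 resolution, the 1 ms window, and the events_to_frame helper are illustrative assumptions, not the paper's exact pipeline.

import numpy as np

WIDTH, HEIGHT = 346, 260   # assumed DAVIS346 sensor resolution
WINDOW_US = 1_000          # assumed 1 ms accumulation window (microsecond timestamps)

def events_to_frame(events, t_start):
    """Accumulate events in [t_start, t_start + WINDOW_US) into a 2-channel histogram.

    events: structured array with fields 'x', 'y', 't' (microseconds), 'p' (0 or 1).
    Returns an array of shape (2, HEIGHT, WIDTH): channel 0 counts OFF events,
    channel 1 counts ON events.
    """
    frame = np.zeros((2, HEIGHT, WIDTH), dtype=np.float32)
    mask = (events["t"] >= t_start) & (events["t"] < t_start + WINDOW_US)
    sel = events[mask]
    # Increment the histogram bin for each event's (polarity, row, column).
    np.add.at(frame, (sel["p"], sel["y"], sel["x"]), 1.0)
    return frame

# Toy usage with synthetic events.
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    n = 10_000
    events = np.zeros(n, dtype=[("x", "i4"), ("y", "i4"), ("t", "i8"), ("p", "i4")])
    events["x"] = rng.integers(0, WIDTH, n)
    events["y"] = rng.integers(0, HEIGHT, n)
    events["t"] = np.sort(rng.integers(0, 10 * WINDOW_US, n))
    events["p"] = rng.integers(0, 2, n)
    frame = events_to_frame(events, t_start=0)
    print(frame.shape, frame.sum())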

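The abstract states that the network treats angle learning as classification rather than regression. The sketch below only illustrates that idea under stated assumptions: a hypothetical GraspHead module regresses the rectangle's position and size while classifying its orientation over discretized angle bins. The feature dimension, the 18 bins, and the combined loss are illustrative choices, not the authors' architecture.

import torch
import torch.nn as nn
import torch.nn.functional as F

NUM_ANGLE_BINS = 18  # assumption: 180 degrees discretized into 10-degree bins

class GraspHead(nn.Module):
    def __init__(self, feature_dim=512, num_angle_bins=NUM_ANGLE_BINS):
        super().__init__()
        # Regression branch for the rectangle centre and size (x, y, w, h).
        self.box = nn.Linear(feature_dim, 4)
        # Classification branch over discretized grasp angles.
        self.angle = nn.Linear(feature_dim, num_angle_bins)

    def forward(self, features):
        return self.box(features), self.angle(features)

def grasp_loss(box_pred, angle_logits, box_target, angle_bin_target):
    """Smooth L1 for the rectangle, cross-entropy for the angle bin."""
    box_loss = F.smooth_l1_loss(box_pred, box_target)
    angle_loss = F.cross_entropy(angle_logits, angle_bin_target)
    return box_loss + angle_loss

# Toy usage: features from some backbone, one sample per row.
if __name__ == "__main__":
    head = GraspHead()
    feats = torch.randn(8, 512)
    box_pred, angle_logits = head(feats)
    loss = grasp_loss(box_pred, angle_logits,
                      torch.randn(8, 4),
                      torch.randint(0, NUM_ANGLE_BINS, (8,)))
    loss.backward()
    print(loss.item())

A classification formulation avoids the wrap-around discontinuity of regressing a periodic angle and lets the network express multiple orientation hypotheses, which is a common motivation for this design choice.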

Related research

07/15/2021 · Real-Time Grasping Strategies Using Event Camera
Robotic vision plays a key role for perceiving the environment in graspi...

12/14/2022 · Event-based YOLO Object Detection: Proof of Concept for Forward Perception System
Neuromorphic vision or event vision is an advanced vision technology, wh...

09/04/2023 · High Frequency, High Accuracy Pointing onboard Nanosats using Neuromorphic Event Sensing and Piezoelectric Actuation
As satellites become smaller, the ability to maintain stable pointing de...

04/29/2021 · REGRAD: A Large-Scale Relational Grasp Dataset for Safe and Object-Specific Robotic Grasping in Clutter
Despite the impressive progress achieved in robust grasp detection, robo...

05/24/2022 · EventMix: An Efficient Augmentation Strategy for Event-Based Data
High-quality and challenging event stream datasets play an important rol...

04/15/2020 · Neuromorphic Eye-in-Hand Visual Servoing
Robotic vision plays a major role in factory automation to service robot...

04/15/2020 · Neuromorphic Event-Based Slip Detection and Suppression in Robotic Grasping and Manipulation
Slip detection is essential for robots to make robust grasping and fine ...
