DART: Distribution Aware Retinal Transform for Event-based Cameras

10/30/2017
by Bharath Ramesh, et al.

We introduce a new event-based visual descriptor, termed the distribution aware retinal transform (DART), for pattern recognition using silicon retina cameras. The DART descriptor captures the spatio-temporal distribution of events and forms a rich structural representation. Consequently, the event context encoded by DART greatly simplifies the feature correspondence problem, which is highly relevant to many event-based vision problems. The proposed descriptor is robust to scale and rotation variations without the need for spectral analysis. To demonstrate the effectiveness of the DART descriptors, they are employed as local features in a bag-of-features classification framework. The proposed framework is tested on the N-MNIST, MNIST-DVS, CIFAR10-DVS, and NCaltech-101 datasets, as well as on a new object dataset, N-SOD (Neuromorphic-Single Object Dataset), collected to test unconstrained viewpoint recognition. We report a competitive classification accuracy of 97.95% on N-MNIST and the best accuracy compared to state-of-the-art works on MNIST-DVS (99%). Using the in-house N-SOD, we demonstrate real-time classification performance on an Intel Compute Stick directly interfaced to an event camera flying on board a quadcopter. In addition, taking advantage of the high temporal resolution of event cameras, the classification system is extended to tackle object tracking. Finally, we demonstrate efficient feature matching for event-based cameras using kd-trees.
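The abstract names two concrete ingredients: a log-polar (retinal) event descriptor and kd-tree feature matching. The Python sketch below is a minimal illustration of both, not the authors' implementation; the function names, the grid parameters (n_rings, n_wedges, r_max), and the plain event-count normalisation are assumptions made for this example. It shows why a log-polar grid turns scale and rotation changes into shifts along the ring and wedge axes of the histogram, and how SciPy's cKDTree can index descriptors for nearest-neighbour correspondence.

    import numpy as np
    from scipy.spatial import cKDTree

    def dart_like_descriptor(events, center, n_rings=8, n_wedges=16, r_max=31.0):
        # Log-polar binning of event coordinates around `center`.
        # events: (N, 2) array of (x, y) event locations. The grid geometry
        # (n_rings, n_wedges, r_max) is an illustrative choice, not the
        # configuration used in the paper.
        d = np.asarray(events, dtype=float) - np.asarray(center, dtype=float)
        r = np.hypot(d[:, 0], d[:, 1])
        theta = np.arctan2(d[:, 1], d[:, 0])          # range [-pi, pi]

        keep = (r > 0) & (r <= r_max)                 # drop the centre event itself
        r, theta = r[keep], theta[keep]

        # Ring index grows with log(radius): scaling the event pattern shifts
        # the histogram along the ring axis, and rotating it shifts the
        # histogram along the wedge axis, the usual log-polar robustness.
        ring = np.log(r + 1.0) / np.log(r_max + 1.0) * n_rings
        ring = np.clip(ring.astype(int), 0, n_rings - 1)
        wedge = (theta + np.pi) / (2.0 * np.pi) * n_wedges
        wedge = np.clip(wedge.astype(int), 0, n_wedges - 1)

        hist = np.zeros((n_rings, n_wedges))
        np.add.at(hist, (ring, wedge), 1.0)           # count events per grid cell
        desc = hist.ravel()
        norm = np.linalg.norm(desc)
        return desc / norm if norm > 0 else desc

    def match_descriptors(desc_a, desc_b):
        # Nearest-neighbour correspondence between two descriptor sets using a
        # kd-tree, in the spirit of the kd-tree matching the abstract mentions.
        tree = cKDTree(desc_b)                        # index the descriptors of set B
        dist, idx = tree.query(desc_a, k=1)           # 1-NN for each row of set A
        return idx, dist

Under these assumptions, matching two views amounts to stacking each view's per-event descriptors into a 2-D array and calling match_descriptors(desc_a, desc_b); each returned index pairs a descriptor in view A with its nearest neighbour in view B.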

Related research

02/14/2020
End-to-end Learning of Object Motion Estimation from Retinal Events for Event-based Object Tracking
Event cameras, which are asynchronous bio-inspired vision sensors, have ...

02/09/2018
Shapes Characterization on Address Event Representation Using Histograms of Oriented Events and an Extended LBP Approach
Address Event Representation is a thriving technology that could change ...

02/13/2020
Asynchronous Tracking-by-Detection on Adaptive Time Surfaces for Event-based Object Tracking
Event cameras, which are asynchronous bio-inspired vision sensors, have ...

07/25/2018
Attention Mechanisms for Object Recognition with Event-Based Cameras
Event-based cameras are neuromorphic sensors capable of efficiently enco...

04/29/2016
Learning Compact Structural Representations for Audio Events Using Regressor Banks
We introduce a new learned descriptor for audio signals which is efficie...

03/16/2019
Spatiotemporal Feature Learning for Event-Based Vision
Unlike conventional frame-based sensors, event-based visual sensors outp...

11/01/2021
Learning Event-based Spatio-Temporal Feature Descriptors via Local Synaptic Plasticity: A Biologically-Plausible Perspective of Computer Vision
We present an optimization-based theory describing spiking cortical ense...
