Standard and Event Cameras Fusion for Dense Mapping

02/06/2021
by Yan Dong, et al.

Event cameras are bio-inspired sensors that generate data only when the brightness at a pixel changes. Thanks to their low latency and high dynamic range (HDR), they are widely used in mobile robotics. However, because the event stream is sparse, event-based mapping can recover only sparse or semi-dense edge 3D maps. Standard cameras, by contrast, provide complete intensity frames. To exploit the complementarity of event-based and standard frame-based cameras, we propose a fusion strategy for dense mapping: we first generate an edge map from events, then fill it in using frames to obtain a dense depth map.
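The fill step can be illustrated with a minimal sketch: given a sparse depth map anchored at event edges, unknown pixels are densified by iterative diffusion while the edge depths stay fixed. This is a generic stand-in for the paper's frame-guided fill (the function name, diffusion scheme, and parameters are illustrative assumptions, not the authors' method):

```python
import numpy as np

def densify_edge_depth(edge_depth, valid_mask, n_iters=200):
    """Densify a sparse edge depth map by iterative diffusion.

    edge_depth: HxW array, depth values at event edges (arbitrary elsewhere).
    valid_mask: HxW bool array, True where event-based depth is known.

    Hypothetical stand-in for the paper's frame-guided fill: here the
    unknown pixels simply relax toward their 4-neighbour average while
    the known edge depths are held fixed.
    """
    depth = edge_depth.astype(float).copy()
    # Initialize unknown pixels with the mean of the known edge depths.
    depth[~valid_mask] = edge_depth[valid_mask].mean()
    for _ in range(n_iters):
        # One Jacobi-style diffusion step: 4-neighbour average.
        avg = 0.25 * (np.roll(depth, 1, axis=0) + np.roll(depth, -1, axis=0)
                      + np.roll(depth, 1, axis=1) + np.roll(depth, -1, axis=1))
        depth[~valid_mask] = avg[~valid_mask]  # keep edge depths fixed
    return depth
```

In the actual method, frame intensities would additionally guide the fill (e.g. weighting diffusion by photometric similarity) so that depth discontinuities align with image edges; the sketch above shows only the unguided baseline.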

Related research

- 07/12/2016 · Event-based, 6-DOF Camera Tracking from Photometric Depth Maps
- 11/22/2022 · FE-Fusion-VPR: Attention-based Multi-Scale Network Architecture for Visual Place Recognition by Fusing Frames and Events
- 07/27/2022 · Traffic Sign Detection With Event Cameras and DCNN
- 02/28/2022 · Bina-Rep Event Frames: a Simple and Effective Representation for Event-based cameras
- 07/25/2018 · Asynchronous, Photometric Feature Tracking using Events and Frames
- 10/16/2020 · Learning Monocular Dense Depth from Events
- 09/01/2023 · Dense Voxel 3D Reconstruction Using a Monocular Event Camera
