Learning to Estimate Two Dense Depths from LiDAR and Event Data

02/28/2023
by   Vincent Brebion, et al.

Event cameras do not produce images, but rather a continuous flow of events, which encode changes of illumination for each pixel independently and asynchronously. While they output temporally rich information, they lack any depth information which could facilitate their use with other sensors. LiDARs can provide this depth information, but are by nature very sparse, which makes the depth-to-event association more complex. Furthermore, as events represent changes of illumination, they might also represent changes of depth; associating them with a single depth is therefore inadequate. In this work, we propose to address these issues by fusing information from an event camera and a LiDAR using a learning-based approach to estimate accurate dense depth maps. To solve the "potential change of depth" problem, we propose here to estimate two depth maps at each step: one "before" the events happen, and one "after" the events happen. We further propose to use this pair of depths to compute a depth difference for each event, to give them more context. We train and evaluate our network, ALED, on both synthetic and real driving sequences, and show that it is able to predict dense depths with an error reduction of up to 61% compared to the state of the art. We also demonstrate the quality of our 2-depths-to-event association, and the usefulness of the depth difference information. Finally, we release SLED, a novel synthetic dataset comprising events, LiDAR point clouds, RGB images, and dense depth maps.
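The per-event depth difference described above can be illustrated with a minimal sketch: given the two predicted dense depth maps ("before" and "after"), each event is assigned the change of depth at its pixel location. This is an illustrative reconstruction under assumed conventions (NumPy arrays indexed as `[y, x]`, a hypothetical `event_depth_differences` helper), not the paper's actual implementation.

```python
import numpy as np

def event_depth_differences(events_xy, depth_before, depth_after):
    """For each event (x, y), look up the estimated depth in both dense
    maps and return the per-event depth change.

    Hypothetical helper for illustration; the paper's network predicts
    the two depth maps, and the difference gives each event more context.
    """
    xs = events_xy[:, 0]
    ys = events_xy[:, 1]
    d_before = depth_before[ys, xs]
    d_after = depth_after[ys, xs]
    return d_after - d_before

# Toy example: 4x4 depth maps (in meters), two events.
depth_before = np.full((4, 4), 10.0)
depth_after = depth_before.copy()
depth_after[1, 2] = 7.0  # an object moved closer at pixel (x=2, y=1)

events = np.array([[2, 1], [0, 0]])  # (x, y) pixel coordinates
diffs = event_depth_differences(events, depth_before, depth_after)
print(diffs)  # [-3.  0.]
```

The first event sits on a pixel whose depth changed (a likely cause of the illumination change), while the second corresponds to no depth change, showing why a single depth per event would be ambiguous.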


