Hybrid, Frame and Event based Visual Inertial Odometry for Robust, Autonomous Navigation of Quadrotors

09/19/2017
by Antoni Rosinol Vidal, et al.

Event cameras are bio-inspired vision sensors that output pixel-level brightness changes instead of standard intensity frames. They do not suffer from motion blur and have a very high dynamic range, which enables them to provide reliable visual information during high-speed motion or in scenes with high dynamic range. However, event cameras output little information when the amount of motion is limited, such as when the camera is nearly still. Conversely, standard cameras provide instant and rich information about the environment most of the time (at low speed and in good lighting), but they fail severely under fast motion or difficult lighting, such as high-dynamic-range or low-light scenes. In this paper, we present the first state estimation pipeline that leverages the complementary advantages of these two sensors by fusing events, standard frames, and inertial measurements in a tightly coupled manner. We show on the publicly available Event Camera Dataset that our hybrid pipeline leads to an accuracy improvement of 130% over event-only pipelines and 85% over standard frame-based visual-inertial systems, while still being computationally tractable. Furthermore, we use our pipeline to demonstrate, to the best of our knowledge, the first autonomous quadrotor flight using an event camera for state estimation, unlocking flight scenarios that were not reachable with traditional visual-inertial odometry, such as low-light environments and high-dynamic-range scenes.
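
To make the sensing model concrete, the sketch below illustrates the standard event-generation model that such sensors implement in hardware: a pixel fires an event whenever its log-intensity changes by more than a contrast threshold since the last event at that pixel. This is a minimal illustration under assumed names and an assumed threshold value, not the authors' pipeline; it approximates the events a sensor would fire between two ordinary intensity frames, purely for intuition.

```python
# Minimal sketch (not the paper's code): the standard event-generation model.
# A pixel fires an event when its log-intensity changes by more than a
# contrast threshold C since its last event. Function name, C, and eps are
# illustrative assumptions.
from dataclasses import dataclass
import numpy as np

@dataclass
class Event:
    t: float       # timestamp in seconds
    x: int         # pixel column
    y: int         # pixel row
    polarity: int  # +1 brightness increase, -1 brightness decrease

def events_from_frames(prev_frame, curr_frame, t, C=0.15, eps=1e-3):
    """Approximate the events fired between two consecutive intensity frames."""
    log_prev = np.log(prev_frame.astype(np.float64) + eps)
    log_curr = np.log(curr_frame.astype(np.float64) + eps)
    diff = log_curr - log_prev
    events = []
    ys, xs = np.nonzero(np.abs(diff) >= C)  # pixels whose log-intensity changed enough
    for y, x in zip(ys, xs):
        events.append(Event(t=t, x=int(x), y=int(y),
                            polarity=1 if diff[y, x] > 0 else -1))
    return events

# Example: two 4x4 synthetic frames where one pixel brightens sharply.
prev = np.full((4, 4), 50.0)
curr = prev.copy()
curr[2, 1] = 200.0
print(events_from_frames(prev, curr, t=0.01))  # one positive-polarity event at (x=1, y=2)
```

A frame-based camera would report the full 4x4 intensity image at a fixed rate regardless of motion, whereas the event model above reports only the single changed pixel, asynchronously; this is the asymmetry the paper's tightly coupled fusion of events, frames, and inertial measurements is designed to exploit.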

Related research

09/25/2022  PL-EVIO: Robust Monocular Event-based Visual Inertial Odometry with Point and Line Features
Event cameras are motion-activated sensors that capture pixel-level illu...

07/12/2016  Event-based, 6-DOF Camera Tracking from Photometric Depth Maps
Event cameras are bio-inspired vision sensors that output pixel-level br...

02/26/2021  Autonomous Quadrotor Flight despite Rotor Failure with Onboard Vision Sensors: Frames vs. Events
Fault-tolerant control is crucial for safety-critical systems, such as q...

02/23/2017  Continuous-Time Visual-Inertial Trajectory Estimation with Event Cameras
In contrast to traditional cameras, which output images at a fixed rate,...

04/12/2022  Exploring Event Camera-based Odometry for Planetary Robots
Due to their resilience to motion blur and high robustness in low-light ...

12/26/2022  ESVIO: Event-based Stereo Visual Inertial Odometry
Event cameras that asynchronously output low-latency event streams provi...

12/10/2020  An Asynchronous Kalman Filter for Hybrid Event Cameras
We present an Asynchronous Kalman Filter (AKF) to reconstruct High Dynam...
