Event Camera-based Visual Odometry for Dynamic Motion Tracking of a Legged Robot Using Adaptive Time Surface

05/15/2023
by Shifan Zhu, et al.

Our paper proposes a direct sparse visual odometry method that combines event and RGB-D data to estimate the pose of agile legged robots during dynamic locomotion and acrobatic behaviors. Event cameras offer high temporal resolution and high dynamic range, which can eliminate the blur that afflicts RGB images during fast movements. This unique strength holds potential for accurate pose estimation of agile legged robots, which has been a challenging problem to tackle. Our framework leverages the benefits of both RGB-D and event cameras to achieve robust and accurate pose estimation, even during dynamic maneuvers such as jumping and landing of a quadruped robot, the Mini-Cheetah. Our major contributions are threefold: Firstly, we introduce an adaptive time surface (ATS) method that addresses the whiteout and blackout issues in conventional time surfaces by formulating pixel-wise decay rates based on scene complexity and motion speed. Secondly, we develop an effective pixel selection method that directly samples from event data and applies sample filtering through the ATS, enabling us to pick pixels on distinct features. Lastly, we propose a nonlinear pose optimization formulation that simultaneously performs 3D-2D alignment on both RGB-based and event-based maps and images, allowing the algorithm to fully exploit the benefits of both data streams. We extensively evaluate the performance of our framework on both public datasets and our own quadruped robot dataset, demonstrating its effectiveness in accurately estimating the pose of agile robots during dynamic movements.
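To make the adaptive time surface idea concrete, here is a minimal NumPy sketch. A conventional time surface decays every pixel with one global time constant, which saturates (whiteout) in busy regions and fades to nothing (blackout) in quiet ones; the sketch below instead derives a per-pixel decay constant from local event density, used here as a simple stand-in for the paper's scene-complexity and motion-speed terms, whose exact formulation the abstract does not specify. The function name, parameters, and the density heuristic are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def adaptive_time_surface(events, shape, t_now, tau_min=0.01, tau_max=0.3):
    """Sketch of a time surface with pixel-wise decay rates.

    events : iterable of (x, y, t) tuples (pixel coords, timestamp in s)
    shape  : (height, width) of the sensor
    t_now  : reference time at which the surface is evaluated

    NOTE: the per-pixel decay below uses event density as a proxy for
    the paper's scene-complexity / motion-speed terms (an assumption).
    """
    h, w = shape
    t_last = np.full((h, w), -np.inf)   # most recent event time per pixel
    count = np.zeros((h, w))            # event count per pixel
    for x, y, t in events:
        t_last[y, x] = max(t_last[y, x], t)
        count[y, x] += 1

    # Busy pixels (many events) get a small tau and decay quickly,
    # limiting whiteout; quiet pixels get a large tau and persist,
    # limiting blackout.
    density = count / max(count.max(), 1)
    tau = tau_max - (tau_max - tau_min) * density

    surface = np.exp(-(t_now - t_last) / tau)  # values in [0, 1]
    surface[np.isneginf(t_last)] = 0.0         # pixels with no events
    return surface
```

With this scheme, a pixel that fires frequently contributes a sharper, faster-fading response than an isolated event, so distinct edges stand out on the surface rather than bleeding into saturated regions.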

Related research

- Feature-based Event Stereo Visual Odometry (07/10/2021): Event-based cameras are biologically inspired sensors that output events...
- Event-based Stereo Visual Odometry (07/30/2020): Event-based cameras are bio-inspired vision sensors whose pixels work in...
- EvHandPose: Event-based 3D Hand Pose Estimation with Sparse Supervision (03/06/2023): Event camera shows great potential in 3D hand pose estimation, especiall...
- DEVO: Depth-Event Camera Visual Odometry in Challenging Conditions (02/05/2022): We present a novel real-time visual odometry framework for a stereo setu...
- A Temporal Densely Connected Recurrent Network for Event-based Human Pose Estimation (09/15/2022): Event camera is an emerging bio-inspired vision sensors that report per-...
- Deformable Neural Radiance Fields using RGB and Event Cameras (09/15/2023): Modeling Neural Radiance Fields for fast-moving deformable objects from ...
- The RGB-D Triathlon: Towards Agile Visual Toolboxes for Robots (04/01/2019): Deep networks have brought significant advances in robot perception, ena...
