
MBA-VO: Motion Blur Aware Visual Odometry

by Peidong Liu et al.

Motion blur remains one of the major challenges for visual odometry methods. In low-light conditions, where longer exposure times are necessary, motion blur can appear even for relatively slow camera motions. In this paper we present a novel hybrid visual odometry pipeline with a direct approach that explicitly models and estimates the camera's local trajectory within the exposure time. This allows us to actively compensate for any motion blur caused by the camera motion. In addition, we contribute a novel benchmarking dataset for motion blur aware visual odometry. Our experiments show that by directly modeling the image formation process, we improve the robustness of visual odometry while achieving accuracy comparable to that obtained on images without motion blur.
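The image formation model behind this idea can be illustrated with a tiny numerical sketch: a motion-blurred frame is approximated as the average of sharp frames sampled along the camera's local trajectory within the exposure time. The code below is a minimal illustration only, not the paper's implementation; the camera motion is reduced to an assumed horizontal pixel shift interpolated linearly between exposure start and end.

```python
import numpy as np

def blurred_image(sharp, trajectory_shifts):
    """Approximate a motion-blurred frame as the mean of sharp frames
    sampled along the intra-exposure trajectory.

    sharp: 2D array, the latent sharp image.
    trajectory_shifts: integer pixel shifts standing in for the camera's
    local trajectory within the exposure time (an illustrative assumption).
    """
    samples = [np.roll(sharp, s, axis=1) for s in trajectory_shifts]
    return np.mean(samples, axis=0)

# Linearly interpolate the "pose" between exposure start and end,
# here a 0..4 pixel horizontal drift sampled at 5 instants.
shifts = np.linspace(0, 4, num=5).astype(int)

# A single bright column blurs into a streak of averaged intensities.
img = np.zeros((8, 8))
img[:, 4] = 1.0
blur = blurred_image(img, shifts)
```

Because each of the five samples contributes equally, the bright column smears into five columns of intensity 0.2 while the total image energy is preserved, which is exactly why modeling the trajectory lets the pipeline predict and compensate for the blur.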


Digital Gimbal: End-to-end Deep Image Stabilization with Learnable Exposure Times

Mechanical image stabilization using actuated gimbals enables capturing ...

Spatiotemporal Registration for Event-based Visual Odometry

A useful application of event sensing is visual odometry, especially in ...

BAD-NeRF: Bundle Adjusted Deblur Neural Radiance Fields

Neural Radiance Fields (NeRF) have received considerable attention recen...

Direct Visual Odometry using Bit-Planes

Feature descriptors, such as SIFT and ORB, are well-known for their robu...

Light Efficient Flutter Shutter

Flutter shutter is a technique in which the exposure is chopped into seg...

LODE: Deep Local Deblurring and A New Benchmark

While recent deep deblurring algorithms have achieved remarkable progres...

BALF: Simple and Efficient Blur Aware Local Feature Detector

Local feature detection is a key ingredient of many image processing and...