VIR-SLAM: Visual, Inertial, and Ranging SLAM for single and multi-robot systems

05/31/2020
by Yanjun Cao, et al.

Monocular cameras coupled with inertial measurements generally give high-performance visual-inertial odometry. However, drift can be significant over long trajectories, especially when the environment is visually challenging. In this paper, we propose a system that leverages ultra-wideband ranging with one static anchor placed in the environment to correct the accumulated error whenever the anchor is visible. We also use this setup for collaborative SLAM: different robots use mutual ranging (when available) and the common anchor to estimate the transformation between each other, facilitating map fusion. Our system consists of two modules: a double-layer ranging, visual, and inertial odometry module for single robots, and a transformation estimation module for collaborative SLAM. We test our system on public datasets by simulating an ultra-wideband sensor, as well as on real robots. Experiments show our method can outperform state-of-the-art visual-inertial odometry by more than 20%. In visually challenging environments, our method works even when the visual-inertial odometry has significant drift. Furthermore, we can compute the collaborative SLAM transformation matrix at almost no extra computation cost.
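To make the single-robot idea concrete, below is a minimal numerical sketch of how a single UWB range to a static anchor can pull a drifted position estimate back toward consistency with the measurement. All names and values here are hypothetical, not from the paper; the actual double-layer design couples the ranging residual with the visual-inertial factors inside the estimator rather than applying a standalone correction like this.

    import numpy as np

    def range_residual(p_robot, p_anchor, measured_range):
        # Difference between the distance predicted from the current position
        # estimate and the UWB range actually measured to the static anchor.
        return np.linalg.norm(p_robot - p_anchor) - measured_range

    p_anchor = np.array([0.0, 0.0, 0.0])   # known static anchor position (assumed)
    p = np.array([4.2, 3.1, 0.0])          # drifted VIO position estimate (toy value)
    measured_range = 5.0                    # UWB range measurement to the anchor

    # Gradient descent on the squared residual: move the estimate along the
    # robot-to-anchor direction until the predicted distance matches the range.
    for _ in range(50):
        r = range_residual(p, p_anchor, measured_range)
        direction = (p - p_anchor) / np.linalg.norm(p - p_anchor)
        p -= 0.5 * r * direction

    print(p, np.linalg.norm(p - p_anchor))  # distance to the anchor is now ~5.0

Note that one range only constrains the radial distance to the anchor (the corrected estimate still lies anywhere on a sphere around it), which is why fusing it with the visual-inertial factors is what actually bounds the drift.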
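The collaborative module can be illustrated in a similar spirit: given two trajectories expressed in each robot's own odometry frame and the mutual UWB ranges between the robots, fit the rigid transform that best explains the ranges. The following is an assumed 2D toy formulation with simulated data and a generic scipy solver, not the paper's algorithm; range-only fitting can suffer from local minima, and the common anchor shared by both robots is exactly the kind of extra constraint that helps disambiguate it.

    import numpy as np
    from scipy.optimize import least_squares

    # Simulated ground-truth transform from robot B's frame to robot A's frame,
    # used here only to generate consistent toy measurements.
    theta_true = 0.7
    t_true = np.array([3.0, -1.0])
    R_true = np.array([[np.cos(theta_true), -np.sin(theta_true)],
                       [np.sin(theta_true),  np.cos(theta_true)]])

    rng = np.random.default_rng(0)
    traj_A = rng.uniform(-5, 5, size=(30, 2))           # robot A positions, frame A
    pB_world = rng.uniform(-5, 5, size=(30, 2))         # robot B positions, frame A
    traj_B = (pB_world - t_true) @ R_true               # same positions, frame B
    ranges = np.linalg.norm(traj_A - pB_world, axis=1)  # mutual UWB ranges

    def residuals(x):
        # Range prediction error for a candidate frame-B-to-frame-A transform,
        # parameterized as (rotation angle, translation x, translation y).
        theta, tx, ty = x
        R = np.array([[np.cos(theta), -np.sin(theta)],
                      [np.sin(theta),  np.cos(theta)]])
        pB_in_A = traj_B @ R.T + np.array([tx, ty])
        return np.linalg.norm(traj_A - pB_in_A, axis=1) - ranges

    sol = least_squares(residuals, x0=np.zeros(3))
    print(sol.x)  # approaches [0.7, 3.0, -1.0] when the fit converges

Once this transform is known, each robot's map can be re-expressed in a common frame, which is the map-fusion step the abstract refers to.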
