DROID-SLAM: Deep Visual SLAM for Monocular, Stereo, and RGB-D Cameras

08/24/2021
by Zachary Teed et al.

We introduce DROID-SLAM, a new deep-learning-based SLAM system. DROID-SLAM performs recurrent iterative updates of camera pose and pixelwise depth through a Dense Bundle Adjustment layer. DROID-SLAM is accurate, achieving large improvements over prior work, and robust, suffering substantially fewer catastrophic failures. Despite being trained only on monocular video, it can leverage stereo or RGB-D video at test time for improved performance. Our open-source code is available at https://github.com/princeton-vl/DROID-SLAM.
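The coupling between camera pose and per-pixel depth in the update loop can be illustrated with a toy alternating least-squares refinement. This is a simplified NumPy sketch, not the actual DBA layer: DROID-SLAM's layer operates on SE(3) poses and dense learned flow fields inside a recurrent network, whereas here pose is a single scalar translation and "flow" is modeled as translation times inverse depth. All variable names are illustrative.

```python
import numpy as np

# Toy alternating refinement of pose and pixelwise (inverse) depth.
# Observation model (illustrative only): flow_i = t * d_i, where t is a
# scalar camera translation and d_i the inverse depth at pixel i.

rng = np.random.default_rng(0)
d_true = rng.uniform(0.5, 2.0, size=50)   # true inverse depths per pixel
t_true = 0.3                               # true camera translation
flow = t_true * d_true                     # "observed" flow field

t = 1.0                                    # initial pose guess
d = np.ones_like(d_true)                   # initial inverse-depth guess

for _ in range(10):
    # Least-squares step for the pose given current depths:
    # minimize sum_i (flow_i - t * d_i)^2 over t.
    r = flow - t * d
    t += (d @ r) / (d @ d)
    # Per-pixel step for inverse depth given the updated pose.
    r = flow - t * d
    d += r / t

residual = np.abs(flow - t * d).max()
print(residual)
```

Because the toy model is bilinear in `t` and `d`, it has the usual monocular scale ambiguity (any pair `(t*s, d/s)` fits equally well), so convergence is checked on the flow residual rather than on recovering `t_true` and `d_true` exactly.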


Related research

07/13/2022  Structure PLP-SLAM: Efficient Sparse Mapping and Localization using Point, Line and Plane for Monocular, RGB-D and Stereo Cameras
This paper demonstrates a visual SLAM system that utilizes point and lin...

10/20/2016  ORB-SLAM2: an Open-Source SLAM System for Monocular, Stereo and RGB-D Cameras
We present ORB-SLAM2, a complete SLAM system for monocular, stereo and RG...

04/14/2021  VOLDOR-SLAM: For the Times When Feature-Based or Direct Methods Are Not Good Enough
We present a dense-indirect SLAM system using external dense optical flo...

10/02/2019  OpenVSLAM: A Versatile Visual SLAM Framework
In this paper, we introduce OpenVSLAM, a visual SLAM framework with high...

12/05/2022  RGB-L: Enhancing Indirect Visual SLAM using LiDAR-based Dense Depth Maps
In this paper, we present a novel method for integrating 3D LiDAR depth ...

06/12/2023  Volume-DROID: A Real-Time Implementation of Volumetric Mapping with DROID-SLAM
This paper presents Volume-DROID, a novel approach for Simultaneous Loca...

07/22/2018  RGBiD-SLAM for Accurate Real-time Localisation and 3D Mapping
In this paper, we present a complete SLAM system for RGB-D cameras, namel...
