
UPSLAM: Union of Panoramas SLAM

by Anthony Cowley, et al.

We present an empirical investigation of a new mapping system based on a graph of panoramic depth images. Panoramic images efficiently capture range measurements taken by a spinning lidar sensor, recording fine detail on the order of a few centimeters within maps of expansive scope on the order of tens of millions of cubic meters. The flexibility of the system is demonstrated by running the same mapping software against data collected by hand-carrying a sensor around a laboratory space at walking pace, moving it outdoors through a campus environment at running pace, driving the sensor on a small wheeled vehicle on- and off-road, flying the sensor through a forest, carrying it on the back of a legged robot navigating an underground coal mine, and mounting it on the roof of a car driven on public roads. The full 3D maps are built online with a median update time of less than ten milliseconds on an embedded NVIDIA Jetson AGX Xavier system.
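The abstract's central representation is a panoramic depth image built from spinning-lidar range measurements. A minimal sketch of that idea, assuming an equirectangular projection with a nearest-return-per-pixel policy (the function name, resolution, and projection details here are illustrative assumptions, not UPSLAM's actual implementation):

```python
import numpy as np

def points_to_panorama(points, width=1024, height=64):
    """Project 3D lidar points (N, 3) into an equirectangular panoramic
    depth image of shape (height, width). Columns index azimuth, rows
    index elevation; each pixel keeps the nearest return, and empty
    pixels hold np.inf. Illustrative sketch, not UPSLAM's actual code."""
    x, y, z = points[:, 0], points[:, 1], points[:, 2]
    depth = np.linalg.norm(points, axis=1)
    azimuth = np.arctan2(y, x)                                   # [-pi, pi)
    elevation = np.arcsin(np.clip(z / np.maximum(depth, 1e-9), -1.0, 1.0))

    # Map angles to pixel indices.
    col = ((azimuth + np.pi) / (2 * np.pi) * width).astype(int) % width
    row = np.clip(((np.pi / 2 - elevation) / np.pi * height).astype(int),
                  0, height - 1)

    pano = np.full((height, width), np.inf)
    # Sort by descending depth so closer returns overwrite farther ones.
    order = np.argsort(-depth)
    pano[row[order], col[order]] = depth[order]
    return pano
```

Such an image compactly stores one range per (azimuth, elevation) cell, which is what lets a graph of these panoramas cover tens of millions of cubic meters while retaining centimeter-scale detail near the sensor.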



