Multi-Modal Lidar Dataset for Benchmarking General-Purpose Localization and Mapping Algorithms

03/07/2022
by Qingqing Li, et al.

Lidar technology has evolved significantly over the last decade, with higher-resolution, more accurate, and lower-cost devices available today. In addition, new scanning modalities and novel sensor technologies have emerged in recent years. Public datasets have enabled benchmarking of algorithms and have set standards for the state of the art. However, existing datasets are not representative of the technological landscape, with only a small number of lidar devices represented. This inherently limits the development and comparison of general-purpose algorithms as the landscape evolves. This paper presents a novel multi-modal lidar dataset with sensors showcasing different scanning modalities (spinning and solid-state), sensing technologies, and lidar cameras. The focus of the dataset is on low-drift odometry, with ground-truth data available both indoors and outdoors at sub-millimeter accuracy from a motion-capture (MOCAP) system. For comparison over longer distances, we also include data recorded in larger indoor and outdoor spaces. The dataset contains point clouds from spinning and solid-state lidars, range images from high-resolution spinning lidars, RGB and depth images from a lidar camera, and inertial data from built-in IMUs. To the best of our knowledge, this is the lidar dataset with the widest variety of sensors and environments for which ground-truth data are available. It can be widely used in multiple research areas, such as 3D LiDAR simultaneous localization and mapping (SLAM), performance comparison between multi-modal lidars, appearance recognition, and loop closure detection. The dataset is available at: https://github.com/TIERS/tiers-lidars-dataset.
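Since the dataset targets low-drift odometry evaluation against MOCAP ground truth, a standard metric for such comparisons is the absolute trajectory error (ATE). The sketch below is a generic Kabsch/Umeyama rigid-alignment implementation of ATE, not code from the dataset's repository; the function names and the assumption of N×3 position arrays with matched timestamps are illustrative.

```python
import numpy as np

def align_rigid(est, gt):
    """Rigid (rotation + translation, no scale) alignment of an estimated
    trajectory to ground truth via the Kabsch/Umeyama method.
    est, gt: (N, 3) arrays of matched positions."""
    mu_e, mu_g = est.mean(axis=0), gt.mean(axis=0)
    E, G = est - mu_e, gt - mu_g
    # SVD of the cross-covariance between the centered trajectories.
    U, _, Vt = np.linalg.svd(E.T @ G)
    # Correction matrix to guard against a reflection (det = -1).
    S = np.eye(3)
    if np.linalg.det(U @ Vt) < 0:
        S[2, 2] = -1.0
    R = (U @ S @ Vt).T          # rotation mapping est -> gt
    t = mu_g - R @ mu_e         # translation mapping est -> gt
    return R, t

def ate_rmse(est, gt):
    """Absolute trajectory error (RMSE of position residuals)
    after rigid alignment of est onto gt."""
    R, t = align_rigid(est, gt)
    aligned = est @ R.T + t
    return float(np.sqrt(np.mean(np.sum((aligned - gt) ** 2, axis=1))))
```

In practice the estimated and MOCAP trajectories would first be associated by timestamp (e.g. nearest-neighbor matching) before the arrays are passed in.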


