Panoptic nuScenes: A Large-Scale Benchmark for LiDAR Panoptic Segmentation and Tracking

09/08/2021
by   Whye Kit Fong, et al.

Panoptic scene understanding and tracking of dynamic agents are essential for robots and automated vehicles to navigate in urban environments. As LiDARs provide accurate, illumination-independent geometric depictions of the scene, performing these tasks on LiDAR point clouds yields reliable predictions. However, existing datasets lack diversity in the types of urban scenes they cover and contain only a limited number of dynamic object instances, which hinders both learning of these tasks and credible benchmarking of the developed methods. In this paper, we introduce the large-scale Panoptic nuScenes benchmark dataset, which extends our popular nuScenes dataset with point-wise ground-truth annotations for semantic segmentation, panoptic segmentation, and panoptic tracking. To facilitate comparison, we provide several strong baselines for each of these tasks on our proposed dataset. Moreover, we analyze the drawbacks of existing metrics for panoptic tracking and propose the novel instance-centric PAT metric that addresses these concerns. We present exhaustive experiments that demonstrate the utility of Panoptic nuScenes compared to existing datasets, and we make the online evaluation server available at nuScenes.org. We believe that this extension will accelerate research on novel methods for scene understanding of dynamic urban environments.
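The panoptic segmentation benchmark described above is typically evaluated with the standard Panoptic Quality (PQ) metric of Kirillov et al., which matches predicted and ground-truth segments at IoU > 0.5 and combines recognition and segmentation quality in one score. The abstract does not define the proposed PAT metric, so as a minimal, hedged illustration here is a sketch of single-class PQ over instance-id maps (the function name and the use of 0 as an ignore label are assumptions for this example, not the benchmark's actual evaluation code):

```python
import numpy as np

def panoptic_quality(gt, pred, iou_thresh=0.5):
    """Single-class PQ over integer instance-id maps; id 0 means 'no instance'.

    PQ = (sum of IoUs over matched pairs) / (TP + 0.5*FP + 0.5*FN).
    With the IoU > 0.5 rule, each segment can match at most one counterpart.
    """
    gt_ids = [i for i in np.unique(gt) if i != 0]
    pred_ids = [i for i in np.unique(pred) if i != 0]
    matched_gt, matched_pred = set(), set()
    iou_sum = 0.0
    for g in gt_ids:
        g_mask = gt == g
        for p in pred_ids:
            if p in matched_pred:
                continue
            p_mask = pred == p
            inter = np.logical_and(g_mask, p_mask).sum()
            union = np.logical_or(g_mask, p_mask).sum()
            iou = inter / union if union else 0.0
            if iou > iou_thresh:
                # IoU > 0.5 guarantees a unique match, so stop searching.
                matched_gt.add(g)
                matched_pred.add(p)
                iou_sum += iou
                break
    tp = len(matched_gt)
    fp = len(pred_ids) - len(matched_pred)  # unmatched predictions
    fn = len(gt_ids) - tp                   # unmatched ground truth
    denom = tp + 0.5 * fp + 0.5 * fn
    return iou_sum / denom if denom else 0.0
```

For example, a prediction that perfectly recovers both instances of a two-instance scene scores 1.0, while missing one of the two instances yields PQ = 1 / (1 + 0.5) ≈ 0.667, since the miss counts as half a penalty in the denominator.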


