K-Radar: 4D Radar Object Detection Dataset and Benchmark for Autonomous Driving in Various Weather Conditions

06/16/2022
by Dong-Hee Paek, et al.

Unlike RGB cameras that use visible light bands (384∼769 THz) and Lidars that use infrared bands (331∼361 THz), Radars use relatively longer-wavelength radio bands (77∼81 GHz), resulting in robust measurements in adverse weather. Unfortunately, existing Radar datasets contain only a small number of samples compared to existing camera and Lidar datasets, which may hinder the development of sophisticated data-driven deep learning techniques for Radar-based perception. Moreover, most existing Radar datasets only provide 3D Radar tensor (3DRT) data, which contain power measurements along the Doppler, range, and azimuth dimensions. Because 3DRT carries no elevation information, estimating the 3D bounding box of an object from it is challenging. In this work, we introduce KAIST-Radar (K-Radar), a novel large-scale object detection dataset and benchmark that contains 35K frames of 4D Radar tensor (4DRT) data with power measurements along the Doppler, range, azimuth, and elevation dimensions, together with carefully annotated 3D bounding box labels of objects on the roads. K-Radar includes challenging driving conditions such as adverse weather (fog, rain, and snow) on various road structures (urban and suburban roads, alleyways, and highways). In addition to the 4DRT, we provide auxiliary measurements from carefully calibrated high-resolution Lidars, surround stereo cameras, and RTK-GPS. We also provide 4DRT-based object detection baseline neural networks (baseline NNs) and show that height information is crucial for 3D object detection. By comparing the baseline NNs with a similarly structured Lidar-based neural network, we demonstrate that 4D Radar is a more robust sensor in adverse weather conditions. All codes are available at https://github.com/kaist-avelab/k-radar.
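The distinction between 3DRT and 4DRT can be sketched with a toy tensor. The bin counts below are purely illustrative placeholders, not the actual K-Radar sensor dimensions; the point is only that a 4DRT adds an elevation axis, and collapsing that axis (here by summing power over elevation bins) yields a conventional 3DRT with the height information discarded:

```python
import numpy as np

# Hypothetical 4D Radar tensor (4DRT): power measurements indexed by
# (Doppler, range, azimuth, elevation). Bin counts are made up for
# illustration and do not reflect the real K-Radar data.
n_doppler, n_range, n_azimuth, n_elevation = 64, 256, 107, 37
rng = np.random.default_rng(0)
tensor_4d = rng.random((n_doppler, n_range, n_azimuth, n_elevation))

# A 3DRT lacks the elevation axis; collapsing it (e.g. summing power
# over elevation bins) loses the height cue needed for 3D boxes.
tensor_3d = tensor_4d.sum(axis=-1)

print(tensor_4d.shape)  # → (64, 256, 107, 37)
print(tensor_3d.shape)  # → (64, 256, 107)
```

This is why the paper argues height (elevation) information matters: a detector consuming only `tensor_3d` has no direct measurement from which to regress an object's vertical position or extent.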


