WaterScenes: A Multi-Task 4D Radar-Camera Fusion Dataset and Benchmark for Autonomous Driving on Water Surfaces

07/13/2023
by   Shanliang Yao, et al.

Autonomous driving on water surfaces plays an essential role in executing hazardous and time-consuming missions, such as maritime surveillance, survivor rescue, environmental monitoring, hydrography mapping, and waste cleaning. This work presents WaterScenes, the first multi-task 4D radar-camera fusion dataset for autonomous driving on water surfaces. Equipped with a 4D radar and a monocular camera, our Unmanned Surface Vehicle (USV) provides all-weather solutions for discerning object-related information, including color, shape, texture, range, velocity, azimuth, and elevation. Focusing on typical static and dynamic objects on water surfaces, we label the camera images and radar point clouds at the pixel level and point level, respectively. In addition to basic perception tasks such as object detection, instance segmentation, and semantic segmentation, we also provide annotations for free-space segmentation and waterline segmentation. Leveraging the multi-task and multi-modal data, we conduct numerous experiments on the single modalities of radar and camera, as well as on the fused modalities. Results demonstrate that 4D radar-camera fusion can considerably enhance the robustness of perception on water surfaces, especially in adverse lighting and weather conditions. The WaterScenes dataset is publicly available at https://waterscenes.github.io.
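
To make the structure of such a radar-camera sample concrete, the sketch below pairs a camera image with a 4D radar point cloud and converts each return's range, azimuth, and elevation into Cartesian coordinates. It is a minimal illustration only: the directory layout, file names, and CSV column order are assumptions for this example, not the official WaterScenes format.

```python
# Minimal sketch: pairing a camera frame with 4D radar returns and
# projecting (range, azimuth, elevation) measurements into Cartesian space.
# NOTE: paths, file names, and column order are illustrative assumptions,
# not the official WaterScenes data layout.

import csv
import math
from dataclasses import dataclass
from pathlib import Path


@dataclass
class RadarPoint:
    x: float         # forward (m)
    y: float         # left (m)
    z: float         # up (m)
    velocity: float  # radial (Doppler) velocity (m/s)


def spherical_to_cartesian(rng: float, azimuth: float, elevation: float):
    """Convert a radar return to x, y, z.

    Angles are assumed to be in radians, with azimuth measured from the
    forward axis and elevation from the horizontal plane.
    """
    x = rng * math.cos(elevation) * math.cos(azimuth)
    y = rng * math.cos(elevation) * math.sin(azimuth)
    z = rng * math.sin(elevation)
    return x, y, z


def load_radar_frame(csv_path: Path) -> list:
    """Read one radar frame stored as CSV rows: range, azimuth, elevation, doppler."""
    points = []
    with csv_path.open(newline="") as f:
        for row in csv.reader(f):
            rng, az, el, doppler = (float(v) for v in row[:4])
            x, y, z = spherical_to_cartesian(rng, az, el)
            points.append(RadarPoint(x, y, z, doppler))
    return points


def load_sample(root: Path, frame_id: str) -> dict:
    """Pair a monocular camera image with its radar point cloud for one frame."""
    return {
        "image": root / "image" / f"{frame_id}.jpg",
        "radar": load_radar_frame(root / "radar" / f"{frame_id}.csv"),
    }
```

A fusion model would typically consume such a sample by projecting the Cartesian radar points into the camera image plane with the sensors' extrinsic and intrinsic calibration before combining the two modalities.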


