Radar Guided Dynamic Visual Attention for Resource-Efficient RGB Object Detection

06/03/2022
by Hemant Kumawat, et al.

An autonomous system's perception engine must provide an accurate understanding of the environment for it to make decisions. Deep-learning-based object detection networks suffer degraded performance and robustness on small and far-away objects because an object's feature map shrinks as it passes through the higher layers of the network. In this work, we propose a novel radar-guided spatial attention mechanism for RGB images to improve the perception quality of autonomous vehicles operating in a dynamic environment. In particular, our method improves the perception of small and long-range objects, which are often missed by object detectors operating on RGB images alone. The proposed method consists of two RGB object detectors, namely a Primary detector and a lightweight Secondary detector. The primary detector takes the full RGB image and generates primary detections. Next, a radar proposal framework creates regions of interest (ROIs) for object proposals by projecting the radar point cloud onto the 2D RGB image. These ROIs are cropped and fed to the secondary detector to generate secondary detections, which are then fused with the primary detections via non-maximum suppression. This method helps recover small objects by preserving their spatial features through an effective increase in their receptive field. We evaluate our fusion method on the challenging nuScenes dataset and show that our fusion method with SSD-lite as primary and secondary detector improves the baseline primary YOLOv3 detector's recall by 14% while requiring fewer computational resources.
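The pipeline described above lends itself to a compact sketch. The following Python snippet is a minimal illustration under stated assumptions, not the authors' implementation: the detector callables (primary_detector, secondary_detector), the ROI sizing heuristic, the calibration matrices, and all thresholds are hypothetical placeholders, and the radar-to-image projection and greedy non-maximum suppression follow standard textbook formulations.

import numpy as np

def project_radar_to_image(radar_points, K, T_radar_to_cam):
    """Project 3D radar returns (N, 3) into pixel coordinates using camera
    intrinsics K (3x3) and the radar-to-camera extrinsic transform (4x4)."""
    pts_h = np.hstack([radar_points, np.ones((radar_points.shape[0], 1))])
    pts_cam = (T_radar_to_cam @ pts_h.T)[:3]           # points in the camera frame
    in_front = pts_cam[2] > 0.5                        # drop points behind the camera
    pts_cam = pts_cam[:, in_front]
    uv = K @ pts_cam
    uv = uv[:2] / uv[2:3]                              # perspective divide -> pixels
    return uv.T, pts_cam[2]                            # (M, 2) pixel coords, depths (M,)

def radar_rois(pixel_pts, depths, img_w, img_h, base_size=200.0):
    """Build square ROIs around projected radar returns; the ROI shrinks with
    depth so far-away objects get tight crops (sizes are illustrative only)."""
    rois = []
    for (u, v), d in zip(pixel_pts, depths):
        half = max(32.0, base_size / (d / 10.0 + 1.0)) / 2.0
        x1, y1 = max(0.0, u - half), max(0.0, v - half)
        x2, y2 = min(float(img_w), u + half), min(float(img_h), v + half)
        rois.append((int(x1), int(y1), int(x2), int(y2)))
    return rois

def nms(boxes, scores, iou_thr=0.5):
    """Greedy class-agnostic non-maximum suppression over (x1, y1, x2, y2) boxes."""
    order = np.argsort(scores)[::-1]
    keep = []
    while order.size > 0:
        i = order[0]
        keep.append(i)
        rest = order[1:]
        xx1 = np.maximum(boxes[i, 0], boxes[rest, 0])
        yy1 = np.maximum(boxes[i, 1], boxes[rest, 1])
        xx2 = np.minimum(boxes[i, 2], boxes[rest, 2])
        yy2 = np.minimum(boxes[i, 3], boxes[rest, 3])
        inter = np.maximum(0.0, xx2 - xx1) * np.maximum(0.0, yy2 - yy1)
        area_i = (boxes[i, 2] - boxes[i, 0]) * (boxes[i, 3] - boxes[i, 1])
        areas = (boxes[rest, 2] - boxes[rest, 0]) * (boxes[rest, 3] - boxes[rest, 1])
        iou = inter / (area_i + areas - inter + 1e-9)
        order = rest[iou < iou_thr]
    return keep

def detect(image, radar_points, K, T_radar_to_cam, primary_detector, secondary_detector):
    """End-to-end sketch: full-frame primary detections plus radar-guided secondary
    detections on ROI crops, fused by NMS. The detector callables stand in for any
    RGB detector returning (boxes, scores) in the coordinates of its input image."""
    img_h, img_w = image.shape[:2]
    p_boxes, p_scores = primary_detector(image)
    uv, depths = project_radar_to_image(radar_points, K, T_radar_to_cam)
    s_boxes, s_scores = [], []
    for x1, y1, x2, y2 in radar_rois(uv, depths, img_w, img_h):
        if x2 <= x1 or y2 <= y1:
            continue                                   # skip degenerate crops at the border
        boxes, scores = secondary_detector(image[y1:y2, x1:x2])
        for (bx1, by1, bx2, by2), s in zip(boxes, scores):
            # map crop-local boxes back into full-image coordinates
            s_boxes.append((bx1 + x1, by1 + y1, bx2 + x1, by2 + y1))
            s_scores.append(s)
    all_boxes = np.array(list(p_boxes) + s_boxes, dtype=float)
    all_scores = np.array(list(p_scores) + s_scores, dtype=float)
    keep = nms(all_boxes, all_scores)
    return all_boxes[keep], all_scores[keep]

In a real deployment the secondary detector would be a lightweight network such as SSD-lite, so the extra ROI passes stay cheap relative to the full-frame primary pass.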


