Understanding Bird's-Eye View Semantic HD-Maps Using an Onboard Monocular Camera

12/05/2020
by Yigit Baran Can, et al.

Autonomous navigation requires scene understanding of the action space in order to move or anticipate events. For planner agents moving on the ground plane, such as autonomous vehicles, this translates to scene understanding in the bird's-eye view (BEV). However, the onboard cameras of autonomous cars are customarily mounted horizontally for a better view of the surroundings. In this work, we study scene understanding in the form of online estimation of semantic bird's-eye-view HD-maps using the video input from a single onboard camera. We study three key aspects of this task: image-level understanding, BEV-level understanding, and the aggregation of temporal information. Based on these three pillars, we propose a novel architecture that combines them. In our extensive experiments, we demonstrate that the considered aspects are complementary to each other for HD-map understanding. Furthermore, the proposed architecture significantly surpasses the current state-of-the-art.
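To make the three pillars concrete, the following is a minimal, illustrative sketch of such a pipeline in PyTorch: a convolutional image encoder (image-level understanding), a learned image-to-BEV transform (BEV-level understanding), and a GRU that aggregates BEV features over time. All module choices, names, and dimensions here are assumptions made for illustration; this is not the architecture proposed in the paper.

```python
import torch
import torch.nn as nn


class BEVMapEstimator(nn.Module):
    """Illustrative three-stage BEV HD-map pipeline (assumed, not the authors' model):
    1) image-level encoding, 2) image-to-BEV transform, 3) temporal aggregation."""

    def __init__(self, num_classes=5, bev_size=(50, 50), feat_dim=64):
        super().__init__()
        self.bev_h, self.bev_w = bev_size
        self.feat_dim = feat_dim
        # 1) Image-level understanding: a small convolutional encoder (assumed).
        self.encoder = nn.Sequential(
            nn.Conv2d(3, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, feat_dim, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d((self.bev_h, self.bev_w)),
        )
        # 2) BEV-level understanding: a learned 1x1 "view transform" standing in
        #    for geometry-aware projection (e.g. IPM or attention-based lifting).
        self.view_transform = nn.Conv2d(feat_dim, feat_dim, 1)
        # 3) Temporal aggregation: a GRU over per-cell BEV features across frames.
        self.temporal = nn.GRU(feat_dim, feat_dim, batch_first=True)
        # Semantic head producing per-cell HD-map class logits.
        self.head = nn.Conv2d(feat_dim, num_classes, 1)

    def forward(self, frames):
        # frames: (B, T, 3, H, W) video clip from the onboard monocular camera.
        b, t, c, h, w = frames.shape
        feats = self.encoder(frames.reshape(b * t, c, h, w))       # (B*T, F, bh, bw)
        bev = self.view_transform(feats)                           # (B*T, F, bh, bw)
        # Treat each BEV cell as a sequence over time and fuse with the GRU.
        bev = bev.reshape(b, t, self.feat_dim, -1).permute(0, 3, 1, 2)  # (B, cells, T, F)
        bev = bev.reshape(-1, t, self.feat_dim)                    # (B*cells, T, F)
        fused, _ = self.temporal(bev)
        fused = fused[:, -1]                                       # keep last timestep
        fused = fused.reshape(b, self.bev_h * self.bev_w, self.feat_dim)
        fused = fused.permute(0, 2, 1).reshape(b, self.feat_dim, self.bev_h, self.bev_w)
        return self.head(fused)                                    # (B, classes, bh, bw)


if __name__ == "__main__":
    model = BEVMapEstimator()
    clip = torch.randn(2, 4, 3, 128, 256)   # two clips of four frames each
    print(model(clip).shape)                # torch.Size([2, 5, 50, 50])
```

The sketch only fixes the interfaces between the three stages; in practice each stage (encoder backbone, view transform, temporal module) would be replaced by a stronger, geometry-aware component.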


