Long-Term Occupancy Grid Prediction Using Recurrent Neural Networks

09/11/2018
by Marcel Schreiber, et al.

We tackle the long-term prediction of scene evolution in a complex downtown scenario for automated driving, based on Lidar grid fusion and recurrent neural networks (RNNs). A bird's eye view of the scene, including occupancy and velocity, is fed as a sequence to an RNN that is trained to predict future occupancy. The nature of the prediction task allows generating multiple hours of training data without manual labeling. The training strategy and loss function are therefore designed for long sequences of real-world data (unbalanced classes, continuously changing situations, false labels, etc.). The deep CNN architecture comprises convolutional long short-term memories (ConvLSTMs) to separate static from dynamic regions and to predict dynamic objects in future frames. Novel recurrent skip connections enable the prediction of small occluded objects, i.e. pedestrians, and of occluded static regions. Spatio-temporal correlations between grid cells are exploited to predict multimodal future paths and interactions between objects. Experiments also quantify improvements over our previous network, a Monte Carlo approach, and approaches from the literature.
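The abstract does not specify layer details, so the following is only an illustrative sketch of the core building block it names: a ConvLSTM cell whose gates are convolutions over a stacked (occupancy, velocity) bird's-eye-view grid, followed by a per-cell occupancy head. All names, channel counts, and the NumPy implementation (cross-correlation, untrained random weights) are assumptions for illustration, not the authors' architecture:

```python
import numpy as np

def conv2d_same(x, w):
    """Naive 'same'-padded 2-D cross-correlation (convolution in the
    deep-learning sense). x: (C_in, H, W), w: (C_out, C_in, k, k)."""
    c_out, _, k, _ = w.shape
    p = k // 2
    xp = np.pad(x, ((0, 0), (p, p), (p, p)))
    H, W = x.shape[1], x.shape[2]
    out = np.zeros((c_out, H, W))
    for o in range(c_out):
        for i in range(H):
            for j in range(W):
                out[o, i, j] = np.sum(xp[:, i:i+k, j:j+k] * w[o])
    return out

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

class ConvLSTMCell:
    """Minimal ConvLSTM cell: the four gates (i, f, o, g) are computed
    by one convolution over the channel-stacked [input, hidden] grid."""
    def __init__(self, c_in, c_hidden, k=3, seed=0):
        rng = np.random.default_rng(seed)
        self.w = rng.normal(0.0, 0.1, (4 * c_hidden, c_in + c_hidden, k, k))
        self.b = np.zeros((4 * c_hidden, 1, 1))

    def step(self, x, h, c):
        z = conv2d_same(np.concatenate([x, h], axis=0), self.w) + self.b
        i, f, o, g = np.split(z, 4, axis=0)
        c_next = sigmoid(f) * c + sigmoid(i) * np.tanh(g)  # cell state
        h_next = sigmoid(o) * np.tanh(c_next)              # hidden state
        return h_next, c_next

# Feed a short sequence of 2-channel grids (occupancy, velocity magnitude)
# and read out a per-cell occupancy probability for the next frame.
T, H, W = 5, 8, 8
cell = ConvLSTMCell(c_in=2, c_hidden=4)
h = np.zeros((4, H, W))
c = np.zeros((4, H, W))
seq = np.random.default_rng(1).random((T, 2, H, W))
for t in range(T):
    h, c = cell.step(seq[t], h, c)

head = np.random.default_rng(2).normal(0.0, 0.1, (1, 4, 1, 1))  # 1x1 conv head
occ_pred = sigmoid(conv2d_same(h, head))  # (1, H, W) occupancy probabilities
```

Because occupancy is a per-cell probability, a real network of this kind would be trained with a binary cross-entropy loss against the (automatically generated) future occupancy grid; here the weights are random and the output is only shape-correct.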

Related research

Motion Estimation in Occupancy Grid Maps in Stationary Settings Using Recurrent Neural Networks (09/25/2019)
In this work, we tackle the problem of modeling the vehicle environment ...

Recurrent Neural Networks for Driver Activity Anticipation via Sensory-Fusion Architecture (09/16/2015)
Anticipating the future actions of a human is a widely studied problem i...

Dynamic Occupancy Grid Mapping with Recurrent Neural Networks (11/17/2020)
Modeling and understanding the environment is an essential task for auto...

Deep Tracking on the Move: Learning to Track the World from a Moving Vehicle using Recurrent Neural Networks (09/29/2016)
This paper presents an end-to-end approach for tracking static and dynam...

Twin Networks: Using the Future as a Regularizer (08/22/2017)
Being able to model long-term dependencies in sequential data, such as t...

Learning Visual Storylines with Skipping Recurrent Neural Networks (04/14/2016)
What does a typical visit to Paris look like? Do people first take photo...

Neural Networks and Value at Risk (05/04/2020)
Utilizing a generative regime switching framework, we perform Monte-Carl...
