PreTraM: Self-Supervised Pre-training via Connecting Trajectory and Map

04/21/2022
by Chenfeng Xu, et al.

Deep learning has recently achieved significant progress in trajectory forecasting. However, the scarcity of trajectory data inhibits data-hungry deep-learning models from learning good representations. While mature representation-learning methods exist in computer vision and natural language processing, these pre-training methods require large-scale data, and they are hard to replicate in trajectory forecasting due to the lack of adequate trajectory data (e.g., 34K samples in the nuScenes dataset). To work around this scarcity, we resort to another data modality closely related to trajectories: HD maps, which are abundantly provided in existing datasets. In this paper, we propose PreTraM, a self-supervised pre-training scheme that connects trajectories and maps for trajectory forecasting. Specifically, PreTraM consists of two parts: 1) Trajectory-Map Contrastive Learning, where we project trajectories and maps into a shared embedding space with cross-modal contrastive learning, and 2) Map Contrastive Learning, where we enhance map representations with contrastive learning on large quantities of HD maps. On top of popular baselines such as AgentFormer and Trajectron++, PreTraM boosts their performance by 5.5% in FDE-10 on the challenging nuScenes dataset. We show that PreTraM improves data efficiency and scales well with model size.
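To make the cross-modal objective concrete, below is a minimal PyTorch sketch of a Trajectory-Map Contrastive Learning loss in the spirit of CLIP-style InfoNCE: each trajectory embedding is trained to match the embedding of its own map patch and to repel the other map patches in the batch. The encoder architectures (TrajectoryEncoder, MapEncoder), tensor shapes, and the temperature value are illustrative assumptions, not the paper's exact implementation; Map Contrastive Learning would reuse the same loss on two augmented views of the same map patch.

import torch
import torch.nn as nn
import torch.nn.functional as F

class TrajectoryEncoder(nn.Module):
    # Hypothetical encoder: flattens a (T, 2) trajectory and maps it to a d-dim embedding.
    def __init__(self, horizon=8, dim=64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Flatten(),
            nn.Linear(horizon * 2, 128), nn.ReLU(),
            nn.Linear(128, dim),
        )

    def forward(self, traj):            # traj: (B, T, 2)
        return self.net(traj)

class MapEncoder(nn.Module):
    # Hypothetical encoder: embeds a rasterized HD-map patch (3-channel image).
    def __init__(self, dim=64):
        super().__init__()
        self.conv = nn.Sequential(
            nn.Conv2d(3, 16, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
        )
        self.proj = nn.Linear(32, dim)

    def forward(self, map_patch):       # map_patch: (B, 3, H, W)
        return self.proj(self.conv(map_patch))

def tmcl_loss(traj_emb, map_emb, temperature=0.07):
    # Symmetric InfoNCE: the i-th trajectory should match the i-th map patch
    # and no other map patch in the batch (and vice versa).
    traj_emb = F.normalize(traj_emb, dim=-1)
    map_emb = F.normalize(map_emb, dim=-1)
    logits = traj_emb @ map_emb.t() / temperature      # (B, B) cosine similarities
    targets = torch.arange(logits.size(0), device=logits.device)
    return 0.5 * (F.cross_entropy(logits, targets) + F.cross_entropy(logits.t(), targets))

# Toy usage: a batch of 16 trajectory/map pairs.
traj = torch.randn(16, 8, 2)            # 16 trajectories, 8 timesteps, (x, y)
maps = torch.randn(16, 3, 64, 64)       # 16 rasterized map patches
loss = tmcl_loss(TrajectoryEncoder()(traj), MapEncoder()(maps))
loss.backward()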

Related research

Pre-training on Synthetic Driving Data for Trajectory Prediction (09/18/2023)
Accumulating substantial volumes of real-world driving data proves pivot...

Self-Supervised Speech Representation Learning: A Review (05/21/2022)
Although supervised deep learning has revolutionized speech and audio pr...

CAPT: Contrastive Pre-Training for Learning Denoised Sequence Representations (10/13/2020)
Pre-trained self-supervised models such as BERT have achieved striking s...

SAINT: Improved Neural Networks for Tabular Data via Row Attention and Contrastive Pre-Training (06/02/2021)
Tabular data underpins numerous high-impact applications of machine lear...

COCOA: Cross Modality Contrastive Learning for Sensor Data (07/31/2022)
Self-Supervised Learning (SSL) is a new paradigm for learning discrimina...

IMG2IMU: Applying Knowledge from Large-Scale Images to IMU Applications via Contrastive Learning (09/02/2022)
Recent advances in machine learning showed that pre-training representat...

Social NCE: Contrastive Learning of Socially-aware Motion Representations (12/21/2020)
Learning socially-aware motion representations is at the core of recent ...
