How many Observations are Enough? Knowledge Distillation for Trajectory Forecasting

03/09/2022
by Alessio Monti, et al.

Accurate prediction of future human positions is an essential task for modern video-surveillance systems. Current state-of-the-art models usually rely on a "history" of past tracked locations (e.g., 3 to 5 seconds) to predict a plausible sequence of future locations (e.g., up to the next 5 seconds). We argue that this common schema neglects critical traits of realistic applications: since collecting input trajectories involves machine perception (i.e., detection and tracking), mislocalization and fragmentation errors may accumulate in crowded scenes, eventually resulting in tracking drifts. The model would thus be fed with corrupted, noisy input data, severely degrading its prediction performance. Accordingly, we focus on delivering accurate predictions when only a few input observations are available, thereby lowering the risks associated with automatic perception. To this end, we devise a novel distillation strategy that transfers knowledge from a teacher network to a student network, the latter fed with fewer observations (just two). We show that properly defined teacher supervision allows the student network to perform comparably to state-of-the-art approaches that demand many more observations. Moreover, extensive experiments on common trajectory forecasting datasets highlight that our student network generalizes better to unseen scenarios.
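As a rough illustration of the teacher-student setup described in the abstract, the sketch below trains a student forecaster that sees only the last two observed positions to imitate a teacher fed with the full history. The `Forecaster` architecture, the MSE-based imitation loss, and the `alpha` weighting are illustrative assumptions, not the paper's actual model.

```python
# Minimal sketch of knowledge distillation for trajectory forecasting
# (hypothetical architecture and loss weighting, for illustration only).
import torch
import torch.nn as nn

class Forecaster(nn.Module):
    """GRU encoder + linear decoder mapping observed (x, y) steps to future steps."""
    def __init__(self, hidden=64, pred_len=12):
        super().__init__()
        self.pred_len = pred_len
        self.encoder = nn.GRU(input_size=2, hidden_size=hidden, batch_first=True)
        self.decoder = nn.Linear(hidden, pred_len * 2)

    def forward(self, obs):                   # obs: (B, T_obs, 2)
        _, h = self.encoder(obs)               # h: (1, B, hidden)
        out = self.decoder(h.squeeze(0))       # (B, pred_len * 2)
        return out.view(-1, self.pred_len, 2)  # (B, pred_len, 2)

teacher = Forecaster()  # assumed pre-trained on full observed histories
student = Forecaster()  # receives only the last two observations

def distillation_step(obs_full, future_gt, alpha=0.5):
    """One training step: ground-truth loss plus teacher-imitation loss."""
    with torch.no_grad():
        teacher_pred = teacher(obs_full)            # teacher sees the full history
    student_pred = student(obs_full[:, -2:, :])     # student sees just two steps
    loss_gt = nn.functional.mse_loss(student_pred, future_gt)
    loss_kd = nn.functional.mse_loss(student_pred, teacher_pred)
    return alpha * loss_gt + (1 - alpha) * loss_kd
```

Blending the ground-truth term with the imitation term lets the student learn from the teacher's predictions while still fitting the data; per the abstract, it is this kind of teacher supervision that allows a two-observation student to approach the accuracy of longer-history models.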

Related research

research · 05/15/2023
Distilling Knowledge for Short-to-Long Term Trajectory Prediction
Long-term trajectory forecasting is a challenging problem in the field o...

research · 11/18/2020
Privileged Knowledge Distillation for Online Action Detection
Online Action Detection (OAD) in videos is proposed as a per-frame label...

research · 11/08/2020
Ensembled CTR Prediction via Knowledge Distillation
Recently, deep learning-based models have been widely studied for click-...

research · 06/12/2020
Knowledge Distillation Meets Self-Supervision
Knowledge distillation, which involves extracting the "dark knowledge" f...

research · 07/08/2020
Robust Re-Identification by Multiple Views Knowledge Distillation
To achieve robustness in Re-Identification, standard methods leverage tr...

research · 06/01/2023
Teacher Agent: A Non-Knowledge Distillation Method for Rehearsal-based Video Incremental Learning
With the rise in popularity of video-based social media, new categories ...

research · 06/25/2023
Enhancing Mapless Trajectory Prediction through Knowledge Distillation
Scene information plays a crucial role in trajectory forecasting systems...
