Graph Distillation for Action Detection with Privileged Information

11/30/2017
by   Zelun Luo, et al.
In this work, we propose a technique that tackles the video understanding problem under a realistic, demanding condition in which we have limited labeled data and only partially observed training modalities. Common methods such as transfer learning do not take advantage of the rich information from extra modalities potentially available in the source-domain dataset, while previous work on cross-modal learning focuses only on a single domain or task. We propose a graph-based distillation method that incorporates rich privileged information from a large multi-modal dataset in the source domain and yields improved performance in the target domain, where data is scarce. By leveraging both a large-scale dataset and its extra modalities, our method learns a better model for temporal action detection and action classification without requiring access to these modalities at test time. We evaluate our approach on action classification and temporal action detection, and show that our models achieve state-of-the-art performance on the PKU-MMD and NTU RGB+D datasets.
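The core idea of distillation with privileged information can be sketched as follows: teachers trained on extra modalities (e.g. depth, skeleton) produce softened class distributions, a learned graph edge weight combines them into a single target, and the student (which sees only the test-time modality, e.g. RGB) minimizes a KL divergence to that target. This is a minimal illustrative sketch, not the paper's actual implementation; the function names, the softmax-normalized edge weights, and the temperature value are assumptions for illustration.

```python
import numpy as np

def softmax(z, T=1.0):
    """Temperature-softened softmax over the last axis."""
    z = np.asarray(z, dtype=float) / T
    z = z - z.max(axis=-1, keepdims=True)  # numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def graph_distillation_loss(student_logits, teacher_logits_per_modality,
                            edge_weights, T=2.0):
    """Hypothetical sketch of a graph-weighted distillation loss.

    edge_weights: one scalar per teacher modality; softmax-normalized here
    so they act like edge strengths in a distillation graph.
    Returns KL(weighted teacher mixture || student).
    """
    w = softmax(edge_weights)  # normalize graph edge weights over modalities
    target = sum(wi * softmax(t, T)
                 for wi, t in zip(w, teacher_logits_per_modality))
    p = softmax(student_logits, T)
    eps = 1e-12  # avoid log(0)
    return float(np.sum(target * (np.log(target + eps) - np.log(p + eps))))
```

The loss is zero when the student's softened distribution matches the teachers' weighted mixture, and positive otherwise, so minimizing it transfers the privileged-modality knowledge into the single-modality student.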

Related research

08/06/2021 - Feature-Supervised Action Modality Transfer
This paper strives for action recognition and detection in video modalit...

08/08/2017 - MHTN: Modal-adversarial Hybrid Transfer Network for Cross-modal Retrieval
Cross-modal retrieval has drawn wide interest for retrieval across diffe...

04/18/2023 - A Two-Stage Framework with Self-Supervised Distillation For Cross-Domain Text Classification
Cross-domain text classification aims to adapt models to a target domain...

07/02/2015 - Cross Modal Distillation for Supervision Transfer
In this work we propose a technique that transfers supervision between i...

08/08/2021 - Learning an Augmented RGB Representation with Cross-Modal Knowledge Distillation for Action Detection
In video understanding, most cross-modal knowledge distillation (KD) met...

10/18/2016 - From Traditional to Modern: Domain Adaptation for Action Classification in Short Social Video Clips
Short internet video clips like vines present a significantly wild distr...

10/19/2018 - Learning with privileged information via adversarial discriminative modality distillation
Heterogeneous data modalities can provide complementary cues for several...
