1st Place Solution to the EPIC-Kitchens Action Anticipation Challenge 2022

07/10/2022
by   Zeyu Jiang, et al.

In this report, we describe the technical details of our submission to the EPIC-Kitchens Action Anticipation Challenge 2022. For this competition, we developed two approaches: 1) Anticipation Time Knowledge Distillation, which uses the soft labels learned by a teacher model as knowledge to guide the student network in learning anticipation-time information; and 2) a Verb-Noun Relation Module, which models the relationship between verbs and nouns. Our method achieves state-of-the-art results on the test set of the EPIC-Kitchens Action Anticipation Challenge 2022.
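The report's abstract does not include implementation details, so the following is only a minimal sketch of soft-label knowledge distillation in the style of Hinton et al., on which the Anticipation Time Knowledge Distillation approach appears to build: the teacher's temperature-softened output distribution serves as the "soft label" that the student is trained to match. The function names, the temperature value, and the use of NumPy here are illustrative assumptions, not the authors' code.

```python
import numpy as np

def softmax(logits, T=1.0):
    # Temperature-scaled softmax over the last axis (numerically stable).
    z = logits / T
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def distillation_loss(student_logits, teacher_logits, T=2.0):
    """KL divergence between teacher and student soft labels.

    A higher temperature T softens both distributions, exposing the
    teacher's relative confidence across classes; the T**2 factor is
    the standard gradient-scale correction from Hinton et al.
    """
    p_t = softmax(teacher_logits, T)  # teacher soft labels (knowledge)
    p_s = softmax(student_logits, T)  # student predictions
    kl = np.sum(p_t * (np.log(p_t + 1e-12) - np.log(p_s + 1e-12)), axis=-1)
    return (T ** 2) * kl.mean()
```

In a full training loop this term would typically be combined with the ordinary cross-entropy loss on hard labels; how the authors weight the two, and how anticipation time is encoded in the teacher's targets, is not specified in the abstract.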

