FAN-Trans: Online Knowledge Distillation for Facial Action Unit Detection

11/11/2022
by Jing Yang, et al.

Due to its importance in facial behaviour analysis, facial action unit (AU) detection has attracted increasing attention from the research community. Leveraging the online knowledge distillation framework, we propose the "FAN-Trans" method for AU detection. Our model consists of a hybrid network of convolution layers and transformer blocks to learn per-AU features and to model AU co-occurrences. It uses a pre-trained face alignment network as the feature extractor. After further transformation by a small learnable add-on convolutional subnet, the per-AU features are fed into transformer blocks to enhance their representation. As multiple AUs often appear together, we propose a learnable attention-drop mechanism in the transformer block to learn the correlations between the features of different AUs. We also design a classifier that predicts AU presence by considering all AUs' features, to explicitly capture label dependencies. Finally, we make an attempt to adapt online knowledge distillation to the training stage for this task, further improving the model's performance. Experiments on the BP4D and DISFA datasets demonstrate the effectiveness of the proposed method.
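The abstract names two mechanisms: a learnable attention-drop mask over per-AU transformer tokens, and online knowledge distillation during training. The following is a minimal PyTorch sketch of how such components could fit together. It is an illustration under stated assumptions, not the authors' implementation: every identifier (AttnDropBlock, AUHead, online_kd_step) is hypothetical, and the mutual-distillation loss is a common online-KD recipe rather than the paper's exact objective.

```python
# Minimal sketch (not the authors' code) of two ideas from the abstract:
# a transformer block with a learnable attention-drop mask over per-AU
# tokens, and an online-distillation step between two peer heads.
import torch
import torch.nn as nn
import torch.nn.functional as F

NUM_AUS, DIM = 12, 64  # e.g. 12 AUs for BP4D; feature width is arbitrary here


class AttnDropBlock(nn.Module):
    """Transformer encoder block whose attention logits are gated by a
    learnable per-pair mask, so the model can learn which AU-AU
    interactions to suppress (a guess at the 'learnable attention drop')."""

    def __init__(self, dim: int, num_aus: int, heads: int = 4):
        super().__init__()
        self.attn = nn.MultiheadAttention(dim, heads, batch_first=True)
        self.drop_logits = nn.Parameter(torch.zeros(num_aus, num_aus))
        self.norm1 = nn.LayerNorm(dim)
        self.norm2 = nn.LayerNorm(dim)
        self.mlp = nn.Sequential(nn.Linear(dim, 4 * dim), nn.GELU(),
                                 nn.Linear(4 * dim, dim))

    def forward(self, x):  # x: (batch, num_aus, dim), one token per AU
        # Additive attention mask: more negative values drop an AU pair.
        mask = F.logsigmoid(self.drop_logits)  # learned, values in (-inf, 0]
        h, _ = self.attn(self.norm1(x), self.norm1(x), self.norm1(x),
                         attn_mask=mask)
        x = x + h
        return x + self.mlp(self.norm2(x))


class AUHead(nn.Module):
    """Predicts every AU from the concatenation of *all* AU tokens,
    explicitly modelling label dependencies as the abstract describes."""

    def __init__(self, dim: int, num_aus: int):
        super().__init__()
        self.block = AttnDropBlock(dim, num_aus)
        self.fc = nn.Linear(num_aus * dim, num_aus)

    def forward(self, tokens):
        tokens = self.block(tokens)
        return self.fc(tokens.flatten(1))  # (batch, num_aus) logits


def online_kd_step(peer_a, peer_b, tokens, labels, tau=2.0, alpha=0.5):
    """One mutual-distillation step: each peer fits the hard labels (BCE)
    and is pulled toward the other's tempered soft predictions."""
    logit_a, logit_b = peer_a(tokens), peer_b(tokens)
    bce = F.binary_cross_entropy_with_logits
    soft_a = torch.sigmoid(logit_a / tau)
    soft_b = torch.sigmoid(logit_b / tau)
    loss_a = bce(logit_a, labels) + alpha * bce(logit_a / tau, soft_b.detach())
    loss_b = bce(logit_b, labels) + alpha * bce(logit_b / tau, soft_a.detach())
    return loss_a + loss_b


if __name__ == "__main__":
    tokens = torch.randn(8, NUM_AUS, DIM)  # per-AU features (backbone stubbed)
    labels = torch.randint(0, 2, (8, NUM_AUS)).float()
    a, b = AUHead(DIM, NUM_AUS), AUHead(DIM, NUM_AUS)
    loss = online_kd_step(a, b, tokens, labels)
    loss.backward()
    print(float(loss))
```

The sketch distils two peer heads into each other's soft predictions, which is one standard way to realise online (teacher-free) distillation for multi-label outputs; FAN-Trans's actual branch design and loss terms are specified in the paper itself.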

