FReTAL: Generalizing Deepfake Detection using Knowledge Distillation and Representation Learning

05/28/2021
by   Minha Kim, et al.

As GAN-based video and image manipulation technologies become more sophisticated and easily accessible, there is an urgent need for effective deepfake detection. Moreover, various deepfake generation techniques have emerged over the past few years. While many deepfake detection methods have been proposed, their performance degrades on new types of deepfakes on which they have not been sufficiently trained. To detect new types of deepfakes, a model should learn from additional data without losing its prior knowledge of deepfakes, i.e., without catastrophic forgetting, especially when the new deepfakes differ significantly from those seen before. In this work, we employ the Representation Learning (ReL) and Knowledge Distillation (KD) paradigms to introduce a transfer learning-based Feature Representation Transfer Adaptation Learning (FReTAL) method. We use FReTAL to perform domain adaptation on new deepfake datasets while minimizing catastrophic forgetting. Our student model can quickly adapt to new types of deepfakes by distilling knowledge from a pre-trained teacher model and applying transfer learning, without using any source-domain data during adaptation. Through experiments on the FaceForensics++ datasets, we demonstrate that FReTAL outperforms all baselines on the domain adaptation task, with up to 86.97% accuracy.
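The approach summarized above follows a teacher-student pattern: a teacher pre-trained on the source deepfake type stays frozen, and a student initialized from it is fine-tuned on the new deepfake type with a distillation term that preserves the teacher's knowledge, so no source-domain data needs to be revisited. Below is a minimal PyTorch sketch of this setup. The ResNet-18 backbone, the standard soft-label (Hinton-style) distillation loss, and the loss weight alpha are illustrative assumptions; the paper's actual FReTAL objective additionally involves feature-level representation terms not reproduced here.

```python
# Minimal sketch of teacher-student distillation for source-free
# domain adaptation to a new deepfake type. Backbone, loss weights,
# and checkpoint path are illustrative assumptions, not the
# authors' implementation.
import torch
import torch.nn.functional as F
from torchvision import models

def kd_loss(student_logits, teacher_logits, T=2.0):
    """Soft-label distillation loss: KL divergence between
    temperature-softened teacher and student distributions."""
    return F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(teacher_logits / T, dim=1),
        reduction="batchmean",
    ) * (T * T)

# Teacher: pre-trained on the source deepfake type, then frozen.
teacher = models.resnet18(num_classes=2)  # real vs. fake
# teacher.load_state_dict(torch.load("teacher_source.pt"))  # hypothetical checkpoint
teacher.eval()
for p in teacher.parameters():
    p.requires_grad = False

# Student: initialized from the teacher, fine-tuned only on the
# new (target-domain) deepfake data; source data is never reused.
student = models.resnet18(num_classes=2)
student.load_state_dict(teacher.state_dict())

optimizer = torch.optim.Adam(student.parameters(), lr=1e-4)
alpha = 0.5  # task-loss vs. distillation-loss balance (assumed)

def train_step(images, labels):
    with torch.no_grad():
        t_logits = teacher(images)        # frozen source knowledge
    s_logits = student(images)
    # Supervised loss on the new deepfake type, plus a distillation
    # term that keeps the student close to the teacher's behavior.
    loss = (1 - alpha) * F.cross_entropy(s_logits, labels) \
         + alpha * kd_loss(s_logits, t_logits)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()
```

The distillation term acts as a regularizer toward the teacher's source-domain behavior, which is what lets the student adapt to the new manipulation type while limiting catastrophic forgetting.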
