Distilling knowledge from convolutional neural networks (CNNs) is a doub...
Knowledge distillation transfers knowledge from a large model to a small...
Although vision transformers (ViTs) have shown promising results in vari...
Semi-supervised learning (SSL) has attracted enormous attention due to i...
We present a simple domain generalization baseline, which wins second pl...
Existing deepfake detection methods perform poorly on face forgeries gen...
Self-distillation exploits non-uniform soft supervision from itself duri...
This paper aims to interpret how deepfake detection models learn artifac...
State-of-the-art distillation methods are mainly based on distilling dee...
In this paper, we study the information theoretic bounds for exact recov...
Inspired by speech recognition, recent state-of-the-art algorithms mostl...
Previous approaches for scene text detection have already achieved promi...