Multi-task learning (MTL), a learning paradigm to learn multiple related...
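As a minimal illustration of the hard-parameter-sharing setup usually meant by this paradigm (a generic sketch, not the method described in this abstract; the layer sizes and task names are made up):

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class HardSharingMTL(nn.Module):
    """Toy multi-task model: one shared encoder, one output head per task."""

    def __init__(self, in_dim=128, hidden_dim=64, task_dims=None):
        super().__init__()
        task_dims = task_dims or {"task_a": 10, "task_b": 3}
        # The encoder parameters are shared across all tasks.
        self.encoder = nn.Sequential(nn.Linear(in_dim, hidden_dim), nn.ReLU())
        # Each task gets its own lightweight head.
        self.heads = nn.ModuleDict(
            {name: nn.Linear(hidden_dim, dim) for name, dim in task_dims.items()}
        )

    def forward(self, x):
        z = self.encoder(x)
        return {name: head(z) for name, head in self.heads.items()}

model = HardSharingMTL()
x = torch.randn(4, 128)
targets = {"task_a": torch.randint(0, 10, (4,)), "task_b": torch.randint(0, 3, (4,))}
outputs = model(x)
# Baseline objective: unweighted sum of the per-task losses.
loss = sum(F.cross_entropy(outputs[t], targets[t]) for t in targets)
loss.backward()
```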
Recent success of Contrastive Language-Image Pre-training (CLIP) has sho...
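For context, a CLIP-style model is typically applied zero-shot by embedding an image and a set of text prompts and picking the most similar prompt. The sketch below assumes the public `openai/clip-vit-base-patch32` checkpoint loaded through Hugging Face `transformers`; it is illustrative background, not the pipeline of this work.

```python
import torch
from PIL import Image
from transformers import CLIPModel, CLIPProcessor

# Assumed checkpoint name; any CLIP checkpoint with the same interface works.
model = CLIPModel.from_pretrained("openai/clip-vit-base-patch32")
processor = CLIPProcessor.from_pretrained("openai/clip-vit-base-patch32")

image = Image.new("RGB", (224, 224))            # stand-in for a real image
classes = ["cat", "dog", "car"]
prompts = [f"a photo of a {c}" for c in classes]

inputs = processor(text=prompts, images=image, return_tensors="pt", padding=True)
with torch.no_grad():
    outputs = model(**inputs)

# logits_per_image: scaled image-text similarities, one per prompt.
probs = outputs.logits_per_image.softmax(dim=-1)
print(dict(zip(classes, probs[0].tolist())))
```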
The core of out-of-distribution (OOD) detection is to learn the in-distr...
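As a reference baseline only (not the approach of this abstract), the maximum-softmax-probability score flags inputs on which an in-distribution classifier is unconfident; the classifier and threshold below are placeholders.

```python
import torch
import torch.nn.functional as F

def msp_ood_score(logits: torch.Tensor) -> torch.Tensor:
    """Maximum softmax probability; lower values suggest out-of-distribution inputs."""
    return F.softmax(logits, dim=-1).max(dim=-1).values

# Placeholder classifier trained on the in-distribution data.
classifier = torch.nn.Linear(128, 10)
x_test = torch.randn(8, 128)

scores = msp_ood_score(classifier(x_test))
threshold = 0.5                      # in practice chosen on held-out data
is_ood = scores < threshold
print(scores, is_ood)
```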
We revisit the one- and two-stage detector distillation tasks and presen...
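For background, one common ingredient in detector distillation is feature imitation: the student's feature maps are pushed toward the teacher's with an L2 loss after a channel-matching adapter. A generic sketch with made-up shapes (not the scheme presented in this work):

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

# Illustrative FPN-level feature maps from a frozen teacher and a student detector.
teacher_feat = torch.randn(2, 256, 32, 32)   # [batch, channels, H, W]
student_feat = torch.randn(2, 128, 32, 32)   # the student is thinner than the teacher

# 1x1 conv to align the student's channel width with the teacher's.
adapter = nn.Conv2d(128, 256, kernel_size=1)

# Feature-imitation loss: L2 distance between adapted student and teacher features.
distill_loss = F.mse_loss(adapter(student_feat), teacher_feat.detach())
distill_loss.backward()
```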
Due to the vulnerability of deep neural networks (DNNs) to adversarial e...
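As a concrete instance of that vulnerability, the one-step fast gradient sign method (FGSM) perturbs an input along the sign of the loss gradient; the model and epsilon here are placeholders for illustration.

```python
import torch
import torch.nn as nn

def fgsm_attack(model, x, y, epsilon=8 / 255):
    """One-step FGSM: perturb x in the direction that increases the loss."""
    x_adv = x.clone().detach().requires_grad_(True)
    loss = nn.functional.cross_entropy(model(x_adv), y)
    loss.backward()
    with torch.no_grad():
        x_adv = x_adv + epsilon * x_adv.grad.sign()
        x_adv = x_adv.clamp(0.0, 1.0)          # keep the image in a valid range
    return x_adv.detach()

# Placeholder classifier and a batch of toy "images".
model = nn.Sequential(nn.Flatten(), nn.Linear(3 * 32 * 32, 10))
x = torch.rand(4, 3, 32, 32)
y = torch.randint(0, 10, (4,))
x_adv = fgsm_attack(model, x, y)
```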
Effectively structuring deep knowledge plays a pivotal role in transfer ...
We study the vision transformer structure at the mobile level in this pa...
Knowledge distillation transfers knowledge from the teacher network to t...
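For reference, the classical logit-distillation objective matches temperature-softened teacher and student distributions with a KL term on top of the usual cross-entropy; the temperature and weighting below are conventional but arbitrary choices, not values taken from this work.

```python
import torch
import torch.nn.functional as F

def kd_loss(student_logits, teacher_logits, targets, T=4.0, alpha=0.5):
    """Cross-entropy on hard labels plus KL between temperature-softened logits."""
    ce = F.cross_entropy(student_logits, targets)
    kl = F.kl_div(
        F.log_softmax(student_logits / T, dim=-1),
        F.softmax(teacher_logits / T, dim=-1),
        reduction="batchmean",
    ) * (T * T)                      # standard T^2 scaling of the soft term
    return alpha * ce + (1.0 - alpha) * kl

student_logits = torch.randn(8, 100, requires_grad=True)
teacher_logits = torch.randn(8, 100)          # teacher outputs, treated as fixed
targets = torch.randint(0, 100, (8,))
loss = kd_loss(student_logits, teacher_logits, targets)
loss.backward()
```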
Unsupervised representation learning with contrastive learning achieved ...
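For background, a standard instance-discrimination objective (InfoNCE, as in SimCLR/MoCo-style methods) treats two augmented views of the same sample as a positive pair and the rest of the batch as negatives; the embeddings and temperature below are placeholders.

```python
import torch
import torch.nn.functional as F

def info_nce(z1, z2, temperature=0.1):
    """InfoNCE over a batch: z1[i] and z2[i] embed two views of the same sample."""
    z1 = F.normalize(z1, dim=-1)
    z2 = F.normalize(z2, dim=-1)
    logits = z1 @ z2.t() / temperature          # pairwise cosine similarities
    labels = torch.arange(z1.size(0))           # positives sit on the diagonal
    return F.cross_entropy(logits, labels)

z1 = torch.randn(16, 128, requires_grad=True)   # view-1 embeddings from an encoder
z2 = torch.randn(16, 128)                       # view-2 embeddings
loss = info_nce(z1, z2)
loss.backward()
```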
We propose a novel data augmentation method 'GridMask' in this paper. It...
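A simplified sketch of grid-structured masking in the spirit of the description above (the cell size, kept ratio, and mask placement are guesses for illustration, not the authors' implementation):

```python
import torch

def grid_mask(img, d=32, ratio=0.5):
    """Zero out a regular grid of square patches.

    Simplified sketch: each d x d cell has a (d*ratio) x (d*ratio) square removed.
    """
    _, h, w = img.shape
    mask = torch.ones(h, w)
    size = int(d * ratio)                       # side length of each removed square
    for y in range(0, h, d):
        for x in range(0, w, d):
            mask[y:y + size, x:x + size] = 0.0
    return img * mask                           # broadcast over the channel dim

img = torch.rand(3, 224, 224)                   # toy image in [0, 1]
augmented = grid_mask(img)
```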