MixSKD: Self-Knowledge Distillation from Mixup for Image Recognition

08/11/2022
by Chuanguang Yang, et al.

Unlike conventional Knowledge Distillation (KD), Self-KD allows a network to learn knowledge from itself without guidance from extra networks. This paper proposes to perform Self-KD from image Mixture (MixSKD), which integrates Self-KD and Mixup data augmentation into a unified framework. MixSKD mutually distills feature maps and probability distributions between random pairs of original images and their mixup images, guiding the network to learn cross-image knowledge by modelling supervisory signals from the mixup images. Moreover, we construct a self-teacher network that aggregates multi-stage feature maps to provide soft labels for supervising the backbone classifier, further improving the efficacy of self-boosting. Experiments on image classification, and on transfer learning to object detection and semantic segmentation, demonstrate that MixSKD outperforms other state-of-the-art Self-KD and data augmentation methods. The code is available at https://github.com/winycg/Self-KD-Lib.
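To make the core idea concrete, below is a minimal PyTorch sketch of the logit-level part of Mixup-based self-distillation: the prediction on a mixup image is mutually aligned with the interpolation of the predictions on the two original images. This is an illustrative sketch under stated assumptions, not the authors' implementation (see the linked repository for that); the function name mixskd_style_loss, the Beta parameter alpha, the temperature tau, and the stop-gradient choices are assumptions, and the feature-map distillation and self-teacher branch described in the abstract are omitted.

```python
# Minimal sketch of Mixup-based mutual self-distillation (logit level only).
# All names and hyperparameters here are illustrative, not the authors' code.
import torch
import torch.nn.functional as F
from torch.distributions import Beta

def mixskd_style_loss(model, x_a, x_b, y_a, y_b, alpha=1.0, tau=3.0):
    # Sample the mixup coefficient and build the mixup image.
    lam = Beta(alpha, alpha).sample().item()
    x_mix = lam * x_a + (1.0 - lam) * x_b

    logits_a, logits_b, logits_mix = model(x_a), model(x_b), model(x_mix)

    # Supervised terms: standard CE on the originals, mixed CE on the mixup image.
    ce = F.cross_entropy(logits_a, y_a) + F.cross_entropy(logits_b, y_b)
    ce_mix = (lam * F.cross_entropy(logits_mix, y_a)
              + (1.0 - lam) * F.cross_entropy(logits_mix, y_b))

    # Soft targets: interpolate the two original-image distributions.
    p_interp = (lam * F.softmax(logits_a / tau, dim=1)
                + (1.0 - lam) * F.softmax(logits_b / tau, dim=1))
    p_mix = F.softmax(logits_mix / tau, dim=1)

    # Mutual (symmetric) KL: each side treats the other as a detached teacher.
    kl = (F.kl_div(p_mix.log(), p_interp.detach(), reduction='batchmean')
          + F.kl_div(p_interp.log(), p_mix.detach(), reduction='batchmean'))

    # tau**2 rescales gradients of the softened KL terms, as is standard in KD.
    return ce + ce_mix + (tau ** 2) * kl
```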


Related research

03/15/2021
Refine Myself by Teaching Myself: Feature Refinement via Self-Knowledge Distillation
Knowledge distillation is a method of transferring the knowledge from a ...

08/27/2020
MetaDistiller: Network Self-Boosting via Meta-Learned Top-Down Distillation
Knowledge Distillation (KD) has been one of the most popular methods to...

09/07/2021
Knowledge Distillation Using Hierarchical Self-Supervision Augmented Distribution
Knowledge distillation (KD) is an effective framework that aims to trans...

07/29/2021
Hierarchical Self-supervised Augmented Knowledge Distillation
Knowledge distillation often involves how to define and transfer knowled...

03/09/2023
Smooth and Stepwise Self-Distillation for Object Detection
Distilling the structured information captured in feature maps has contr...

12/02/2021
A Fast Knowledge Distillation Framework for Visual Recognition
While Knowledge Distillation (KD) has been recognized as a useful tool i...

04/19/2019
Feature Fusion for Online Mutual Knowledge Distillation
We propose a learning framework named Feature Fusion Learning (FFL) that...
