An Empirical Analysis of the Impact of Data Augmentation on Knowledge Distillation

06/06/2020
by Deepan Das, et al.

Generalization performance of deep learning models trained with Empirical Risk Minimization can be improved significantly by data augmentation strategies such as simple transformations or mixed-sample augmentation. In this work, we empirically analyse the impact of such augmentation strategies on the transfer of generalization between teacher and student models in a distillation setup. We observe that if a teacher is trained using any of the mixed-sample augmentation strategies, the student model distilled from it is impaired in its generalization capabilities. We hypothesize that such strategies limit a model's ability to learn example-specific features, degrading the quality of the supervision signal during distillation without impacting the model's standalone prediction performance. We present a novel KL-divergence-based metric to quantitatively measure the generalization capacity of the different networks.
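For context, the sketch below shows (in PyTorch) a standard mixed-sample augmentation step (MixUp) that could be used when training the teacher, together with a temperature-softened KL-divergence distillation loss for the student. This is a minimal illustration of the techniques the abstract refers to, not the paper's exact setup; the function names, temperature T, and mixing and weighting parameters are illustrative assumptions.

```python
# Illustrative sketch only: generic MixUp and knowledge-distillation formulations,
# not the paper's exact training configuration.
import torch
import torch.nn.functional as F

def mixup(x, y, alpha=1.0):
    """MixUp: convexly combine pairs of inputs; labels are kept for a mixed loss."""
    lam = torch.distributions.Beta(alpha, alpha).sample().item()
    perm = torch.randperm(x.size(0))
    x_mixed = lam * x + (1.0 - lam) * x[perm]
    return x_mixed, y, y[perm], lam  # loss is lam * CE(y) + (1 - lam) * CE(y[perm])

def distillation_loss(student_logits, teacher_logits, targets, T=4.0, w=0.9):
    """Soft-label KL term on temperature-softened outputs plus hard-label cross-entropy."""
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(teacher_logits / T, dim=1),
        reduction="batchmean",
    ) * (T * T)
    hard = F.cross_entropy(student_logits, targets)
    return w * soft + (1.0 - w) * hard
```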


Related research:

05/24/2023 · HARD: Hard Augmentations for Robust Distillation
Knowledge distillation (KD) is a simple and successful method to transfe...

06/24/2022 · Online Distillation with Mixed Sample Augmentation
Mixed Sample Regularization (MSR), such as MixUp or CutMix, is a powerfu...

03/13/2023 · Visual-Policy Learning through Multi-Camera View to Single-Camera View Knowledge Distillation for Robot Manipulation Tasks
The use of multi-camera views simultaneously has been shown to improve t...

04/12/2021 · Generalization bounds via distillation
This paper theoretically investigates the following empirical phenomenon...

12/06/2019 · Explaining Sequence-Level Knowledge Distillation as Data-Augmentation for Neural Machine Translation
Sequence-level knowledge distillation (SLKD) is a model compression tech...

03/10/2020 · SuperMix: Supervising the Mixing Data Augmentation
In this paper, we propose a supervised mixing augmentation method, terme...

07/03/2021 · Isotonic Data Augmentation for Knowledge Distillation
Knowledge distillation uses both real hard labels and soft labels predic...
