Online Knowledge Distillation with Diverse Peers

12/01/2019
by Defang Chen, et al.

Distillation is an effective knowledge-transfer technique that uses the predicted distributions of a powerful teacher model as soft targets to train a smaller student model. A pre-trained, high-capacity teacher, however, is not always available. Recently proposed online variants instead use the aggregated intermediate predictions of multiple student models as targets to train each student. Although group-derived targets offer a good recipe for teacher-free distillation, group members quickly become homogenized under simple aggregation functions, leading to early-saturated solutions. In this work, we propose Online Knowledge Distillation with Diverse Peers (OKDDip), which performs two-level distillation during training with multiple auxiliary peers and one group leader. In the first-level distillation, each auxiliary peer holds an individual set of aggregation weights, generated with an attention-based mechanism, to derive its own targets from the predictions of the other auxiliary peers. Learning from distinct target distributions boosts peer diversity, which is essential for effective group-based distillation. The second-level distillation then transfers the knowledge of the ensemble of auxiliary peers to the group leader, i.e., the model used for inference. Experimental results show that the proposed framework consistently outperforms state-of-the-art approaches without increasing training or inference complexity, demonstrating the effectiveness of the two-level distillation design.
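
To make the two-level scheme concrete, below is a minimal PyTorch-style sketch of how the two distillation losses could look. It is written against the description above rather than the authors' code: the tensor and parameter names (peer_logits, leader_logits, features, proj_q, proj_k, the temperature T) and the exact form of the attention are illustrative assumptions. The first level builds an individual, attention-weighted target distribution for each auxiliary peer; the second level distills the plain ensemble of the peers into the group leader.

```python
# Minimal sketch of the two-level distillation losses described above.
# All names here are illustrative assumptions, not the authors' implementation.
import torch
import torch.nn as nn
import torch.nn.functional as F


def okddip_losses(peer_logits, leader_logits, features, proj_q, proj_k, T=3.0):
    """Two-level distillation losses.

    peer_logits:   (m, batch, classes) logits of the m auxiliary peers
    leader_logits: (batch, classes)    logits of the group leader
    features:      (m, batch, d)       peer feature embeddings used for attention
    proj_q/proj_k: linear projections producing queries/keys for the attention
    """
    T2 = T * T
    peer_probs = F.softmax(peer_logits / T, dim=-1)

    # Attention-based aggregation weights: each auxiliary peer gets its own,
    # sample-wise set of weights over the peers' predictions.
    q = proj_q(features)                              # (m, batch, d')
    k = proj_k(features)                              # (m, batch, d')
    scores = torch.einsum('and,bnd->nab', q, k)       # (batch, m, m)
    alpha = F.softmax(scores, dim=-1)

    # First-level distillation: peer a matches its own weighted mixture of
    # the peers' softened predictions (treated as fixed targets here).
    targets = torch.einsum('nab,bnc->anc', alpha, peer_probs).detach()
    log_p = F.log_softmax(peer_logits / T, dim=-1)
    first_level = F.kl_div(log_p.flatten(0, 1), targets.flatten(0, 1),
                           reduction='batchmean') * T2

    # Second-level distillation: the group leader matches the simple
    # ensemble (average) of the auxiliary peers' predictions.
    ensemble = peer_probs.mean(dim=0).detach()
    log_leader = F.log_softmax(leader_logits / T, dim=-1)
    second_level = F.kl_div(log_leader, ensemble, reduction='batchmean') * T2

    return first_level, second_level


if __name__ == "__main__":
    m, batch, classes, d = 3, 8, 10, 64
    proj_q, proj_k = nn.Linear(d, d), nn.Linear(d, d)
    losses = okddip_losses(torch.randn(m, batch, classes),
                           torch.randn(batch, classes),
                           torch.randn(m, batch, d),
                           proj_q, proj_k)
    print(losses)
```

Detaching the aggregated targets follows the common knowledge-distillation convention of treating soft targets as fixed; in practice these losses would be added to each model's standard cross-entropy objective.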

Related research

10/02/2020 · Online Knowledge Distillation via Multi-branch Diversity Enhancement
Knowledge distillation is an effective method to transfer the knowledge ...

06/01/2022 · ORC: Network Group-based Knowledge Distillation using Online Role Change
In knowledge distillation, since a single, omnipotent teacher network ca...

08/25/2020 · Discriminability Distillation in Group Representation Learning
Learning group representation is a commonly concerned issue in tasks whe...

09/24/2019 · FEED: Feature-level Ensemble for Knowledge Distillation
Knowledge Distillation (KD) aims to transfer knowledge in a teacher-stud...

12/16/2021 · Knowledge Distillation Leveraging Alternative Soft Targets from Non-Parallel Qualified Speech Data
This paper describes a novel knowledge distillation framework that lever...

05/25/2023 · Triplet Knowledge Distillation
In Knowledge Distillation, the teacher is generally much larger than the...

12/05/2021 · Safe Distillation Box
Knowledge distillation (KD) has recently emerged as a powerful strategy ...
