Weight Distillation: Transferring the Knowledge in Neural Network Parameters

09/19/2020
by Ye Lin, et al.

Knowledge distillation has proven effective for model acceleration and compression: it lets a small network learn to generalize in the same way as a large network. Recent successes in pre-training suggest that transferring model parameters is also effective. Inspired by this, we investigate model acceleration and compression along this other line of research. We propose Weight Distillation, which transfers the knowledge in the large network's parameters to a small network through a parameter generator. Our experiments on the WMT16 En-Ro, NIST12 Zh-En, and WMT14 En-De machine translation tasks show that weight distillation can train a small network that is 1.88x to 2.94x faster than the large network yet achieves competitive performance. With a small network of the same size, weight distillation outperforms knowledge distillation by 0.51 to 1.82 BLEU points.
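The abstract leaves the parameter generator unspecified. As a rough illustration only, the sketch below treats it as a learned two-sided linear projection that maps a frozen teacher weight matrix to a smaller student weight matrix; the class and parameter names (`ParamGenerator`, `d_teacher`, `d_student`) and the projection form are assumptions for illustration, not the paper's actual design.

```python
# Minimal weight-distillation sketch (assumed design, not the paper's).
import torch
import torch.nn as nn

class ParamGenerator(nn.Module):
    """Maps a frozen teacher weight matrix to a smaller student weight
    matrix via learned projections on each side (an assumed, simple
    form of generator)."""
    def __init__(self, d_teacher: int, d_student: int):
        super().__init__()
        self.proj_left = nn.Parameter(torch.randn(d_student, d_teacher) * 0.02)
        self.proj_right = nn.Parameter(torch.randn(d_teacher, d_student) * 0.02)

    def forward(self, w_teacher: torch.Tensor) -> torch.Tensor:
        # (d_student, d_teacher) @ (d_teacher, d_teacher) @ (d_teacher, d_student)
        # -> (d_student, d_student): the generated student weight matrix.
        return self.proj_left @ w_teacher @ self.proj_right

# Usage: generate 256x256 student weights from 512x512 teacher weights.
# In training, the generator (and hence the student weights) would be
# optimized against the task loss while the teacher weights stay fixed.
gen = ParamGenerator(d_teacher=512, d_student=256)
w_teacher = torch.randn(512, 512)  # stands in for frozen teacher parameters
w_student = gen(w_teacher)
print(w_student.shape)  # torch.Size([256, 256])
```

In this sketch every student weight is a linear function of the teacher's weights, so the student inherits structure from the teacher while gradients from the student's task loss flow back into the generator's projections.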
