
Sparse Teachers Can Be Dense with Knowledge

10/08/2022
by Yi Yang, et al.

Recent advances in distilling pretrained language models have discovered that, besides the expressiveness of knowledge, student-friendliness should be taken into consideration to realize a truly knowledgeable teacher. Based on a pilot study, we find that over-parameterized teachers can produce expressive yet student-unfriendly knowledge and are thus limited in overall knowledgeableness. To remove the parameters that result in student-unfriendliness, we propose a sparse teacher trick under the guidance of an overall knowledgeable score for each teacher parameter. The knowledgeable score is essentially an interpolation of the expressiveness and student-friendliness scores, and it aims to ensure that expressive parameters are retained while student-unfriendly ones are removed. Extensive experiments on the GLUE benchmark show that the proposed sparse teachers can be dense with knowledge and lead to students with compelling performance in comparison with a series of competitive baselines.
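The pruning recipe the abstract describes can be sketched in a few lines. The snippet below is a minimal illustration, not the authors' implementation: the function name knowledgeable_mask, the alpha interpolation weight, and the random stand-in scores are all assumptions for demonstration, and the paper derives its expressiveness and student-friendliness scores from the distillation objective rather than sampling them.

```python
import torch

def knowledgeable_mask(expressiveness: torch.Tensor,
                       friendliness: torch.Tensor,
                       alpha: float = 0.5,
                       sparsity: float = 0.3) -> torch.Tensor:
    """Interpolate per-parameter expressiveness and student-friendliness
    scores, then mask out the lowest-scoring fraction of parameters."""
    score = alpha * expressiveness + (1.0 - alpha) * friendliness
    k = int(score.numel() * sparsity)      # number of parameters to prune
    if k == 0:
        return torch.ones_like(score)      # nothing to prune at this sparsity
    threshold = torch.kthvalue(score.flatten(), k).values
    return (score > threshold).float()     # 1 = keep, 0 = prune

# Toy usage on a single 4x4 weight matrix with random stand-in scores.
expressiveness = torch.rand(4, 4)
friendliness = torch.rand(4, 4)
mask = knowledgeable_mask(expressiveness, friendliness, alpha=0.7, sparsity=0.25)
sparse_weight = torch.randn(4, 4) * mask   # a "sparse teacher" weight matrix
```

Under this reading, a larger alpha favors retaining expressive parameters, while a smaller alpha more aggressively removes student-unfriendly ones.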


Related research

05/12/2018 · Born Again Neural Networks
Knowledge distillation (KD) consists of transferring knowledge from one ...

07/03/2022 · PrUE: Distilling Knowledge from Sparse Teacher Networks
Although deep neural networks have enjoyed remarkable success across a w...

01/30/2020 · Search for Better Students to Learn Distilled Knowledge
Knowledge Distillation, as a model compression technique, has received g...

03/23/2021 · Teacher-Explorer-Student Learning: A Novel Learning Method for Open Set Recognition
If an unknown example that is not seen during training appears, most rec...

05/15/2018 · Knowledge Distillation in Generations: More Tolerant Teachers Educate Better Students
This paper studies teacher-student optimization on neural networks, i.e....

12/25/2019 · Learning performance in inverse Ising problems with sparse teacher couplings
We investigate the learning performance of the pseudolikelihood maximiza...

11/13/2020 · The Teacher-Student Chatroom Corpus
The Teacher-Student Chatroom Corpus (TSCC) is a collection of written co...