Joint learning of interpretation and distillation

05/24/2020
by Jinchao Huang, et al.

The extra trust brought by model interpretation has made it an indispensable part of machine learning systems. But to explain a distilled model's prediction, one may either work with the student model itself or turn to its teacher model. This leads to a more fundamental question: should a distilled model give a similar prediction, for similar reasons, as its teacher model on the same input? The question becomes even more pressing when the two models have dramatically different structures, as in GBDT2NN, which distills a gradient boosting decision tree ensemble into a neural network. This paper conducts an empirical study of a new approach to explaining each prediction of GBDT2NN, and of how imitating the explanation, as an auxiliary learning task, can further improve the distillation process. Experiments on several benchmarks show that the proposed methods achieve better performance on both explanations and predictions.
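The abstract describes joint training in which the student is optimized both to match the teacher's predictions and to imitate its explanations as an auxiliary task. The sketch below illustrates that general idea only; it is not the paper's method. The student architecture, the use of input-gradient saliency as the explanation, and the names (StudentNet, teacher_pred, teacher_expl, lambda_expl) are all illustrative assumptions.

```python
# Minimal sketch of distillation with an auxiliary explanation-imitation loss.
# Assumption: explanations are input-gradient saliencies; the paper may derive
# teacher explanations differently (e.g., from the GBDT's tree structure).
import torch
import torch.nn as nn


class StudentNet(nn.Module):
    """Hypothetical student network standing in for the GBDT2NN student."""

    def __init__(self, n_features, hidden=64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(n_features, hidden),
            nn.ReLU(),
            nn.Linear(hidden, 1),
        )

    def forward(self, x):
        return self.net(x).squeeze(-1)


def input_gradient_explanation(output, inputs):
    # Gradient of the prediction w.r.t. the input, used here as a stand-in
    # per-feature explanation; create_graph=True lets us backprop through it.
    return torch.autograd.grad(output.sum(), inputs, create_graph=True)[0]


def train_step(student, optimizer, x, teacher_pred, teacher_expl, lambda_expl=0.1):
    """One joint step: distillation loss + weighted explanation-imitation loss."""
    x = x.clone().requires_grad_(True)
    pred = student(x)

    distill_loss = nn.functional.mse_loss(pred, teacher_pred)        # match predictions
    student_expl = input_gradient_explanation(pred, x)
    expl_loss = nn.functional.mse_loss(student_expl, teacher_expl)   # match explanations

    loss = distill_loss + lambda_expl * expl_loss
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()
```

In this reading, lambda_expl trades off fidelity to the teacher's outputs against fidelity to its explanations; the paper's reported gains on both metrics suggest the two objectives are complementary rather than competing.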


Related research

12/19/2021
Controlling the Quality of Distillation in Response-Based Network Compression
The performance of a distillation-based compressed network is governed b...

06/17/2022
Revisiting Self-Distillation
Knowledge distillation is the procedure of transferring "knowledge" from...

03/06/2023
Students Parrot Their Teachers: Membership Inference on Model Distillation
Model distillation is frequently proposed as a technique to reduce the p...

09/19/2020
Introspective Learning by Distilling Knowledge from Online Self-explanation
In recent years, many explanation methods have been proposed to explain ...

10/30/2019
Distilling Black-Box Travel Mode Choice Model for Behavioral Interpretation
Machine learning has proved to be very successful for making predictions...

12/01/2020
Evaluating Explanations: How much do explanations from the teacher aid students?
While many methods purport to explain predictions by highlighting salien...

06/13/2022
Robust Distillation for Worst-class Performance
Knowledge distillation has proven to be an effective technique in improv...
