Spirit Distillation: A Model Compression Method with Multi-domain Knowledge Transfer

04/29/2021
by Zhiyuan Wu, et al.

Recent applications impose requirements of both cross-domain knowledge transfer and model compression on machine learning models, owing to insufficient training data and limited computational resources. In this paper, we propose a new knowledge distillation model, named Spirit Distillation (SD), which is a model compression method with multi-domain knowledge transfer. The compact student network mimics a representation equivalent to that produced by the front part of the teacher network, through which general knowledge is transferred from the source domain (teacher) to the target domain (student). To further improve the robustness of the student, we extend SD to Enhanced Spirit Distillation (ESD), which exploits more comprehensive knowledge by introducing a proximity domain, similar to the target domain, as an additional source for feature extraction. Results demonstrate that our method can boost mIOU and high-precision accuracy by 1.4% and 8.2%, respectively, and can yield a compact network with only 41.8% FLOPs.
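Since the abstract only sketches the mechanism, here is a minimal PyTorch illustration of the core idea: the student's compact front part is trained to reproduce the teacher's front-part features (a mimic loss) alongside the usual task loss, and ESD adds the same mimic term on proximity-domain images. Everything concrete below (the toy networks, the MSE choice of mimic loss, the weights alpha and beta) is an assumption for illustration, not the paper's exact configuration.

```python
# Minimal sketch of Spirit Distillation, assuming the teacher's front part
# and the student's front produce matching feature shapes. Module sizes,
# the MSE mimic loss, and the loss weights are illustrative assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F

def sd_loss(teacher_front, student_front, student_head,
            target_images, target_labels, proximity_images=None,
            alpha=1.0, beta=0.5):
    """Task loss on the target domain plus a feature-mimic loss that
    transfers the teacher's general knowledge. Passing proximity_images
    enables the Enhanced Spirit Distillation (ESD) variant."""
    # The teacher is frozen; its features are targets, not trainable.
    with torch.no_grad():
        t_feat = teacher_front(target_images)
    s_feat = student_front(target_images)
    mimic = F.mse_loss(s_feat, t_feat)

    # ESD: also mimic teacher features on a proximity domain that is
    # similar to the target domain, for more comprehensive knowledge.
    if proximity_images is not None:
        with torch.no_grad():
            t_prox = teacher_front(proximity_images)
        mimic = mimic + F.mse_loss(student_front(proximity_images), t_prox)

    # Supervised task loss (segmentation cross-entropy) on the target domain.
    logits = student_head(s_feat)
    task = F.cross_entropy(logits, target_labels)
    return alpha * task + beta * mimic

if __name__ == "__main__":
    # Toy fronts and head with matching shapes, purely for illustration.
    teacher_front = nn.Sequential(nn.Conv2d(3, 16, 3, padding=1), nn.ReLU())
    student_front = nn.Sequential(nn.Conv2d(3, 16, 3, padding=1), nn.ReLU())
    student_head = nn.Conv2d(16, 5, 1)        # 5 segmentation classes
    for p in teacher_front.parameters():      # freeze the teacher
        p.requires_grad_(False)

    images = torch.randn(2, 3, 32, 32)
    labels = torch.randint(0, 5, (2, 32, 32))
    prox = torch.randn(2, 3, 32, 32)          # proximity-domain batch
    loss = sd_loss(teacher_front, student_front, student_head,
                   images, labels, proximity_images=prox)
    loss.backward()                           # gradients reach the student only
    print(loss.item())
```

In this sketch the teacher is frozen and only the student receives gradients; if the student's feature channels differed from the teacher's, a 1x1 convolution adapter on the student features would be the usual remedy.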

Related research

12/31/2021 - Data-Free Knowledge Transfer: A Survey
In the last decade, many deep learning models have been well trained and...

07/07/2023 - Distilling Universal and Joint Knowledge for Cross-Domain Model Compression on Time Series Data
For many real-world time series tasks, the computational complexity of p...

01/28/2023 - Few-shot Face Image Translation via GAN Prior Distillation
Face image translation has made notable progress in recent years. Howeve...

07/17/2023 - Domain Knowledge Distillation from Large Language Model: An Empirical Study in the Autonomous Driving Domain
Engineering knowledge-based (or expert) systems require extensive manual...

03/25/2021 - Spirit Distillation: Precise Real-time Prediction with Insufficient Data
Recent trend demonstrates the effectiveness of deep neural networks (DNN...

05/22/2023 - Lion: Adversarial Distillation of Closed-Source Large Language Model
The practice of transferring knowledge from a sophisticated, closed-sour...

01/24/2022 - AutoMC: Automated Model Compression based on Domain Knowledge and Progressive search strategy
Model compression methods can reduce model complexity on the premise of ...
