torchdistill: A Modular, Configuration-Driven Framework for Knowledge Distillation

11/25/2020
by Yoshitomo Matsubara, et al.

While knowledge distillation (transfer) has been attracting attention from the research community, recent developments in the field have heightened the need for reproducible studies and well-generalized frameworks that lower the barrier to such high-quality, reproducible deep learning research. Several researchers have voluntarily published the frameworks used in their knowledge distillation studies to help others reproduce their original work. Such frameworks, however, are usually neither well generalized nor maintained, so researchers still need to write a lot of code to refactor or build on them when introducing new methods, models, and datasets or designing experiments. In this paper, we present our open-source framework built on PyTorch and dedicated to knowledge distillation studies. The framework is designed to let users design experiments with a declarative PyYAML configuration file, and it helps researchers complete the recently proposed ML Code Completeness Checklist. Using the framework, we demonstrate various efficient training strategies and implement a variety of knowledge distillation methods. We also reproduce some of the original experimental results on the ImageNet and COCO datasets reported at major machine learning conferences such as ICLR, NeurIPS, CVPR and ECCV, including recent state-of-the-art methods. All the source code, configurations, log files and trained model weights are publicly available at https://github.com/yoshitomo-matsubara/torchdistill .
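As a rough illustration of the idea of driving a distillation experiment from a declarative PyYAML configuration rather than hard-coded scripts, the sketch below parses a small YAML snippet and uses it to parameterize a standard soft-target distillation loss (Hinton et al., 2015). The configuration keys and the `kd_loss` helper are illustrative assumptions made for this example only; they do not reflect torchdistill's actual configuration schema or API.

```python
import yaml                      # PyYAML, used for the declarative config
import torch
import torch.nn.functional as F

# Illustrative configuration; the keys below are assumptions for this sketch,
# not torchdistill's actual schema.
CONFIG = """
train:
  num_epochs: 20
  criterion:
    kd:
      temperature: 4.0
      alpha: 0.9        # weight on the soft-target (distillation) term
"""

def kd_loss(student_logits, teacher_logits, targets, temperature, alpha):
    # Standard soft-target distillation loss: KL divergence between softened
    # teacher and student distributions, plus a hard-label cross-entropy term.
    soft = F.kl_div(
        F.log_softmax(student_logits / temperature, dim=1),
        F.softmax(teacher_logits / temperature, dim=1),
        reduction="batchmean",
    ) * (temperature ** 2)
    hard = F.cross_entropy(student_logits, targets)
    return alpha * soft + (1.0 - alpha) * hard

cfg = yaml.safe_load(CONFIG)["train"]["criterion"]["kd"]
student_logits = torch.randn(8, 1000)          # e.g., ImageNet-style class logits
teacher_logits = torch.randn(8, 1000)
targets = torch.randint(0, 1000, (8,))
loss = kd_loss(student_logits, teacher_logits, targets,
               cfg["temperature"], cfg["alpha"])
print(loss.item())
```

Keeping hyperparameters such as the temperature and loss weights in a configuration file, as sketched here, is what lets an experiment be reproduced or varied without touching the training code.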


