Data-free Knowledge Distillation for Segmentation using Data-Enriching GAN

11/02/2020
by Kaushal Bhogale, et al.

Distilling knowledge from large pre-trained networks to improve the performance of compact networks has enabled deep learning models to be used in many real-time and mobile applications. Several approaches that demonstrate success in this field make use of the true training dataset to extract relevant knowledge. In the absence of the true dataset, however, extracting knowledge from deep networks remains a challenge. Recent works on data-free knowledge distillation demonstrate such techniques only on classification tasks. Extending this line of work, we explore data-free knowledge distillation for segmentation. First, we identify several challenges specific to segmentation. We build on the DeGAN training framework and propose a novel loss function for enforcing diversity in a setting where a few classes are under-represented. Further, we explore a new training framework for performing knowledge distillation in a data-free setting. We obtain an improvement of 6.93 points in mean IoU over previous approaches.
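To make the setup concrete, below is a minimal PyTorch-style sketch of the two ingredients the abstract refers to: a pixel-wise distillation loss between a frozen teacher and a student on generator-produced samples, and an entropy-based diversity term that rewards covering under-represented classes. The function names, the entropy formulation, and hyperparameters such as lambda_div are illustrative assumptions, not the paper's exact formulation.

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, temperature=4.0):
    """Per-pixel KL divergence between softened teacher and student
    segmentation outputs (a standard KD objective, used here as a
    stand-in for the paper's distillation loss)."""
    t = temperature
    log_p_s = F.log_softmax(student_logits / t, dim=1)   # (N, C, H, W)
    p_t = F.softmax(teacher_logits / t, dim=1)
    return F.kl_div(log_p_s, p_t, reduction="batchmean") * (t * t)

def diversity_loss(teacher_logits, eps=1e-8):
    """Negative entropy of the batch-averaged class distribution the
    teacher predicts on generated samples; minimizing it spreads
    probability mass across classes, including under-represented ones
    (a hypothetical formulation of the diversity term)."""
    p = F.softmax(teacher_logits, dim=1)       # (N, C, H, W)
    p_mean = p.mean(dim=(0, 2, 3))             # average over batch and pixels
    return (p_mean * torch.log(p_mean + eps)).sum()

# Usage sketch (teacher frozen; gradients flow to generator and student):
# fake = generator(torch.randn(batch_size, latent_dim))
# t_logits = teacher(fake)                 # teacher parameters are not updated
# s_logits = student(fake)
# gen_loss = lambda_div * diversity_loss(t_logits)        # plus GAN losses
# kd_loss  = distillation_loss(s_logits, t_logits.detach())
```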


Related research

Structured Knowledge Distillation for Semantic Segmentation (03/11/2019)
In this paper, we investigate the knowledge distillation strategy for tr...

DeGAN: Data-Enriching GAN for Retrieving Representative Samples from a Trained Classifier (12/27/2019)
In this era of digital information explosion, an abundance of data from ...

A Closer Look at Deep Learning Heuristics: Learning rate restarts, Warmup and Distillation (10/29/2018)
The convergence rate and final performance of common deep learning model...

Data Impressions: Mining Deep Models to Extract Samples for Data-free Applications (01/15/2021)
Pretrained deep models hold their learnt knowledge in the form of the mo...

Domain Generalization for Crop Segmentation with Knowledge Distillation (04/03/2023)
In recent years, precision agriculture has gradually oriented farming cl...

Using Knowledge Distillation to improve interpretable models in a retail banking context (09/30/2022)
This article sets forth a review of knowledge distillation techniques wi...

EPTQ: Enhanced Post-Training Quantization via Label-Free Hessian (09/20/2023)
Quantization of deep neural networks (DNN) has become a key element in t...
