Structured Knowledge Distillation for Semantic Segmentation

03/11/2019
by Yifan Liu, et al.

In this paper, we investigate knowledge distillation strategies for training compact semantic segmentation networks with the help of large networks. We start from the straightforward scheme, pixel-wise distillation, which applies the distillation scheme adopted for image classification to each pixel separately. Since semantic segmentation is a structured prediction problem, we further propose to distill structured knowledge from the large network to the small one. We study two structured distillation schemes: (i) pair-wise distillation, which distills the pairwise similarities between spatial locations, and (ii) holistic distillation, which uses a GAN to distill holistic knowledge. The effectiveness of our knowledge distillation approaches is demonstrated by extensive experiments on three scene parsing datasets: Cityscapes, CamVid, and ADE20K.
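The three losses can be made concrete with a short sketch. Below is a minimal PyTorch illustration, assuming (N, C, H, W) logits and feature maps; the function names, the temperature T, the use of cosine similarity over flattened spatial locations, and the Wasserstein-style adversarial term are illustrative assumptions, not the paper's exact formulation.

import torch
import torch.nn.functional as F

def pixel_wise_distillation(student_logits, teacher_logits, T=1.0):
    # Per-pixel KL divergence between teacher and student class
    # distributions, i.e. classification-style distillation applied
    # at every spatial location. T is a softening temperature.
    s = F.log_softmax(student_logits / T, dim=1)
    t = F.softmax(teacher_logits / T, dim=1)
    return F.kl_div(s, t, reduction="batchmean") * (T * T)

def pair_wise_distillation(student_feat, teacher_feat):
    # Treat every spatial location as a node and match the two
    # (N, HW, HW) matrices of pairwise cosine similarities, one
    # plausible reading of "distilling pairwise similarities".
    def similarity(feat):
        n, c, h, w = feat.shape
        feat = F.normalize(feat.view(n, c, h * w), dim=1)
        return torch.bmm(feat.transpose(1, 2), feat)
    return F.mse_loss(similarity(student_feat),
                      similarity(teacher_feat))

def holistic_distillation(discriminator, image, student_map):
    # Adversarial term: the student tries to make its segmentation
    # map score as "real" under a discriminator conditioned on the
    # input image; the discriminator is trained separately to tell
    # teacher maps from student maps (a Wasserstein-style assumption).
    score = discriminator(torch.cat([image, student_map], dim=1))
    return -score.mean()

In training, the student would minimize the usual per-pixel cross-entropy plus a weighted sum of these three terms, with the weights treated as hyperparameters.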


Related research

05/07/2022  Distilling Inter-Class Distance for Semantic Segmentation
11/02/2020  Data-free Knowledge Distillation for Segmentation using Data-Enriching GAN
07/19/2021  Double Similarity Distillation for Semantic Image Segmentation
07/24/2023  A Good Student is Cooperative and Reliable: CNN-Transformer Collaborative Learning for Semantic Segmentation
02/24/2023  A Knowledge Distillation framework for Multi-Organ Segmentation of Medaka Fish in Tomographic Image
02/07/2022  Measuring and Reducing Model Update Regression in Structured Prediction for NLP
02/11/2023  Dual Relation Knowledge Distillation for Object Detection
