Structural and Statistical Texture Knowledge Distillation for Semantic Segmentation

05/06/2023
by   Deyi Ji, et al.

Existing knowledge distillation works for semantic segmentation mainly focus on transferring high-level contextual knowledge from teacher to student. However, low-level texture knowledge is also of vital importance for characterizing local structural patterns and global statistical properties, such as boundary, smoothness, regularity and color contrast, which may not be well addressed by high-level deep features. In this paper, we intend to take full advantage of both structural and statistical texture knowledge and propose a novel Structural and Statistical Texture Knowledge Distillation (SSTKD) framework for semantic segmentation. Specifically, for structural texture knowledge, we introduce a Contourlet Decomposition Module (CDM) that decomposes low-level features with an iterative Laplacian pyramid and a directional filter bank to mine structural texture knowledge. For statistical knowledge, we propose a Denoised Texture Intensity Equalization Module (DTIEM) to adaptively extract and enhance statistical texture knowledge through heuristic iterative quantization and a denoising operation. Finally, each type of knowledge learning is supervised by an individual loss function, forcing the student network to mimic the teacher better from a broader perspective. Experiments show that the proposed method achieves state-of-the-art performance on the Cityscapes, Pascal VOC 2012 and ADE20K datasets.
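To make the CDM's first stage concrete, the following is a minimal sketch of an iterative Laplacian pyramid decomposition, which separates a feature map into band-pass (detail) levels plus a low-pass residual. The block-average low-pass filter and nearest-neighbour upsampling are simplifying assumptions for illustration; the paper's module operates on learned low-level features and follows the pyramid with a directional filter bank, which is omitted here.

```python
import numpy as np

def laplacian_pyramid(img, levels=3):
    """Decompose a 2-D array into band-pass detail levels plus a low-pass
    residual. Each detail level isolates structure at one scale, loosely
    mirroring the iterative pyramid stage of a contourlet decomposition."""
    details = []
    current = img.astype(np.float64)
    for _ in range(levels):
        h, w = current.shape
        # Low-pass: average 2x2 blocks (a stand-in for Gaussian filtering).
        low = current[:h // 2 * 2, :w // 2 * 2] \
            .reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))
        # Upsample back by nearest-neighbour repetition.
        up = low.repeat(2, axis=0).repeat(2, axis=1)[:h, :w]
        # Band-pass detail = current level minus its coarse reconstruction.
        details.append(current - up)
        current = low
    details.append(current)  # final low-pass residual
    return details

img = np.arange(64, dtype=np.float64).reshape(8, 8)
bands = laplacian_pyramid(img, levels=2)
```

Because each detail band stores exactly what the coarse level discards, upsampling the residual and adding the details back reconstructs the input exactly; in a distillation setting, the per-level bands are the texture representations the student would be trained to match.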


