CoCo DistillNet: a Cross-layer Correlation Distillation Network for Pathological Gastric Cancer Segmentation

08/27/2021
by Wenxuan Zou, et al.

In recent years, deep convolutional neural networks have made significant advances in pathology image segmentation. However, pathology image segmentation faces a dilemma: higher-performance networks generally require more computational resources and storage. Because of the inherently high resolution of pathological images, this phenomenon limits the deployment of high-accuracy networks in real-world scenarios. To tackle this problem, we propose CoCo DistillNet, a novel Cross-layer Correlation (CoCo) knowledge distillation network for pathological gastric cancer segmentation. Knowledge distillation is a general technique that aims to improve the performance of a compact network through knowledge transfer from a cumbersome network. Concretely, our CoCo DistillNet models the correlations of channel-mixed spatial similarity between different layers and then transfers this knowledge from a pre-trained cumbersome teacher network to an untrained compact student network. In addition, we utilize an adversarial learning strategy, called Adversarial Distillation (AD), to further promote the distillation procedure. Furthermore, to stabilize the training procedure, we make use of an unsupervised Paraphraser Module (PM) to facilitate knowledge paraphrasing in the teacher network. Extensive experiments conducted on the Gastric Cancer Segmentation Dataset demonstrate the effectiveness of CoCo DistillNet, which achieves state-of-the-art performance.
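The abstract does not spell out the exact loss, so the following is only a minimal PyTorch sketch of the cross-layer correlation idea, assuming that per-layer channel-mixed spatial similarity is computed as pairwise cosine similarity between spatial locations and that correlations are then taken between layer pairs. The function names (spatial_similarity, cross_layer_correlation, coco_distill_loss), the 16x16 pooling size, and the MSE matching term are illustrative choices rather than the paper's formulation; the Adversarial Distillation and Paraphraser Module components are omitted.

```python
import torch
import torch.nn.functional as F

def spatial_similarity(feat: torch.Tensor) -> torch.Tensor:
    """Channel-mixed spatial similarity: cosine similarity between every
    pair of spatial locations, mixing information across all channels."""
    b, c, h, w = feat.shape
    x = feat.view(b, c, h * w)
    x = F.normalize(x, dim=1)               # L2-normalize each location's channel vector
    return torch.bmm(x.transpose(1, 2), x)  # (B, HW, HW) pairwise cosine similarities

def cross_layer_correlation(feats):
    """Correlate the similarity maps of different layers (one value per layer pair)."""
    sims = []
    for f in feats:
        f = F.adaptive_avg_pool2d(f, (16, 16))  # unify spatial sizes; 16x16 is an arbitrary choice
        sims.append(spatial_similarity(f).flatten(1))
    corrs = [F.cosine_similarity(sims[i], sims[j], dim=1)
             for i in range(len(sims)) for j in range(i + 1, len(sims))]
    return torch.stack(corrs, dim=1)            # (B, num_layer_pairs)

def coco_distill_loss(teacher_feats, student_feats):
    """Match the student's cross-layer correlation pattern to the teacher's."""
    with torch.no_grad():
        t = cross_layer_correlation(teacher_feats)
    s = cross_layer_correlation(student_feats)
    return F.mse_loss(s, t)

# Example: features tapped from two stages of hypothetical teacher/student encoders.
teacher_feats = [torch.randn(2, 256, 32, 32), torch.randn(2, 512, 16, 16)]
student_feats = [torch.randn(2, 64, 32, 32), torch.randn(2, 128, 16, 16)]
loss = coco_distill_loss(teacher_feats, student_feats)
```

Because the similarity maps compare spatial locations rather than raw channels, the teacher and student may have different channel widths at each tapped layer, which is what makes this style of distillation usable between a cumbersome and a compact network.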


Related research

03/16/2022 · Graph Flow: Cross-layer Graph Flow Distillation for Dual-Efficient Medical Image Segmentation
With the development of deep convolutional neural networks, medical imag...

11/02/2022 · LightVessel: Exploring Lightweight Coronary Artery Vessel Segmentation via Similarity Knowledge Distillation
In recent years, deep convolutional neural networks (DCNNs) have achieved ...

03/18/2021 · Similarity Transfer for Knowledge Distillation
Knowledge distillation is a popular paradigm for learning portable neura...

10/09/2021 · Visualizing the embedding space to explain the effect of knowledge distillation
Recent research has found that knowledge distillation can be effective i...

07/19/2021 · Double Similarity Distillation for Semantic Image Segmentation
The balance between high accuracy and high speed has always been a chall...

10/29/2021 · Model Fusion of Heterogeneous Neural Networks via Cross-Layer Alignment
Layer-wise model fusion via optimal transport, named OTFusion, applies s...

03/12/2019 · Knowledge Adaptation for Efficient Semantic Segmentation
Both accuracy and efficiency are of significant importance to the task o...
