Paced-Curriculum Distillation with Prediction and Label Uncertainty for Image Segmentation

02/02/2023
by Mobarakol Islam, et al.

Purpose: In curriculum learning, the idea is to train on easier samples first and gradually increase the difficulty, while in self-paced learning a pacing function defines the speed at which training progresses. Both methods rely heavily on the ability to score the difficulty of data samples, yet an optimal scoring function is still under exploration.

Methodology: Distillation is a knowledge-transfer approach in which a teacher network guides a student network by feeding it a sequence of random samples. We argue that guiding student networks with an efficient curriculum strategy can improve model generalization and robustness. For this purpose, we design uncertainty-based paced curriculum learning in self-distillation for medical image segmentation. We fuse prediction uncertainty and annotation boundary uncertainty to develop a novel paced-curriculum distillation (P-CD). We use the teacher model to obtain prediction uncertainty, and spatially varying label smoothing with a Gaussian kernel to generate segmentation boundary uncertainty from the annotations. We also investigate the robustness of our method by applying image perturbations and corruptions of various types and severities.

Results: The proposed technique is validated on two medical datasets, breast ultrasound image segmentation and robot-assisted surgical scene segmentation, and achieves significantly better segmentation performance and robustness.

Conclusion: P-CD improves performance and obtains better generalization and robustness under dataset shift. Although curriculum learning requires extensive tuning of the pacing function's hyper-parameters, the performance improvement outweighs this limitation.
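To make the pipeline described in the abstract concrete, here is a minimal sketch, not the authors' released implementation. It assumes binary masks, uses normalized softmax entropy as the teacher's prediction uncertainty, a separable Gaussian blur of the annotation as the spatially varying label smoothing, a simple weighted average to fuse the two maps, and a linear pacing function. All function names and hyper-parameters (`sigma`, `alpha`, `start_frac`) are illustrative assumptions, not values from the paper.

```python
import math

import torch
import torch.nn.functional as F


def boundary_uncertainty(mask: torch.Tensor, sigma: float = 1.5) -> torch.Tensor:
    """Spatially varying label smoothing: blur a binary ground-truth mask
    (B, 1, H, W) with a separable Gaussian kernel so that pixels near the
    annotation boundary receive soft labels."""
    radius = int(3 * sigma)
    coords = torch.arange(-radius, radius + 1, dtype=torch.float32)
    g = torch.exp(-coords ** 2 / (2 * sigma ** 2))
    g = (g / g.sum()).to(mask.device)
    soft = F.conv2d(mask.float(), g.view(1, 1, 1, -1), padding=(0, radius))
    soft = F.conv2d(soft, g.view(1, 1, -1, 1), padding=(radius, 0))
    return soft  # (B, 1, H, W), values in [0, 1]


def prediction_uncertainty(teacher_logits: torch.Tensor) -> torch.Tensor:
    """Pixel-wise softmax entropy of the teacher output (B, C, H, W),
    normalized to [0, 1] so it can be fused with boundary uncertainty."""
    p = teacher_logits.softmax(dim=1)
    entropy = -(p * p.clamp_min(1e-8).log()).sum(dim=1)  # (B, H, W)
    return entropy / math.log(max(teacher_logits.shape[1], 2))


def difficulty_scores(teacher_logits, masks, alpha: float = 0.5):
    """Fuse the two uncertainty maps into one scalar difficulty per image.
    `alpha` (an assumed hyper-parameter) weights prediction uncertainty
    against boundary uncertainty."""
    pred_u = prediction_uncertainty(teacher_logits)   # (B, H, W)
    soft = boundary_uncertainty(masks).squeeze(1)     # (B, H, W)
    bound_u = 1.0 - (2.0 * soft - 1.0).abs()          # peaks at mask boundaries
    fused = alpha * pred_u + (1.0 - alpha) * bound_u
    return fused.flatten(1).mean(dim=1)               # (B,) one score per image


def paced_subset(scores: torch.Tensor, epoch: int, total_epochs: int,
                 start_frac: float = 0.3) -> torch.Tensor:
    """Linear pacing function: return the indices of the easiest fraction of
    samples, growing from `start_frac` to 1.0 over the course of training."""
    frac = min(1.0, start_frac + (1.0 - start_frac) * epoch / total_epochs)
    k = max(1, int(frac * scores.numel()))
    return scores.argsort()[:k]
```

In a training loop, the difficulty scores would be recomputed from a teacher forward pass over the training set at each epoch, and the student would be distilled only on the `paced_subset` indices until the paced fraction covers the full dataset.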


Related research

Confidence-Aware Paced-Curriculum Learning by Label Smoothing for Surgical Scene Understanding (12/22/2022)
Curriculum learning and self-paced learning are the training strategies ...

Knowledge Distillation via Instance-level Sequence Learning (06/21/2021)
Recently, distillation approaches are suggested to extract general knowl...

CES-KD: Curriculum-based Expert Selection for Guided Knowledge Distillation (09/15/2022)
Knowledge distillation (KD) is an effective tool for compressing deep cl...

Graph Flow: Cross-layer Graph Flow Distillation for Dual-Efficient Medical Image Segmentation (03/16/2022)
With the development of deep convolutional neural networks, medical imag...

Efficient Medical Image Segmentation Based on Knowledge Distillation (08/23/2021)
Recent advances have been made in applying convolutional neural networks...

How to Teach: Learning Data-Free Knowledge Distillation from Curriculum (08/29/2022)
Data-free knowledge distillation (DFKD) aims at training lightweight stu...

Style Curriculum Learning for Robust Medical Image Segmentation (08/01/2021)
The performance of deep segmentation models often degrades due to distri...
