Unifying domain adaptation and self-supervised learning for CXR segmentation via AdaIN-based knowledge distillation

04/13/2021
by   Yujin Oh, et al.

Because segmentation labels are scarce, extensive research has been conducted on training segmentation networks without labels or with only limited labels. In particular, domain adaptation, self-supervised learning, and teacher-student architectures have been introduced to distill knowledge from various tasks and improve segmentation performance. However, these approaches appear quite different from one another, so it is not clear how they can be combined for better performance. Inspired by the recent StarGANv2 for multi-domain image translation, here we propose a novel segmentation framework via AdaIN-based knowledge distillation, in which a single generator with AdaIN layers is trained along with an AdaIN code generator and a style encoder so that the generator can perform both domain adaptation and segmentation. Specifically, our framework is designed for a difficult situation in chest X-ray (CXR) segmentation, where segmentation masks are available only for normal CXR data but the trained model must be applied to both normal and abnormal CXR images. Since a single generator performs both abnormal-to-normal domain conversion and segmentation, simply by changing the AdaIN codes, it can synergistically learn the common features and improve segmentation performance. Experimental results on CXR data confirm that the trained network achieves state-of-the-art segmentation performance for both normal and abnormal CXR images.
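The core mechanism described above is the AdaIN operation: the generator's features are instance-normalized and then rescaled and shifted by style parameters, so swapping the AdaIN code switches the generator between tasks (e.g. domain translation vs. segmentation). Below is a minimal NumPy sketch of this operation, assuming per-channel style parameters `gamma` and `beta` that would, in the full framework, be predicted by the AdaIN code generator; all names here are illustrative and not taken from the paper.

```python
import numpy as np

def adain(content, gamma, beta, eps=1e-5):
    """Adaptive Instance Normalization on a single feature map.

    content: (C, H, W) feature map from the generator
    gamma, beta: (C,) style-dependent scale and shift
    """
    # Per-channel statistics over the spatial dimensions
    mu = content.mean(axis=(1, 2), keepdims=True)
    sigma = content.std(axis=(1, 2), keepdims=True)
    normalized = (content - mu) / (sigma + eps)
    # Re-style the normalized features with the injected code
    return gamma[:, None, None] * normalized + beta[:, None, None]

# Toy usage: the same features, restyled by two different AdaIN codes
rng = np.random.default_rng(0)
feat = rng.normal(loc=3.0, scale=2.0, size=(4, 8, 8))
gamma_a, beta_a = np.ones(4), np.zeros(4)        # hypothetical "task A" code
gamma_b, beta_b = 2.0 * np.ones(4), np.ones(4)   # hypothetical "task B" code
out_a = adain(feat, gamma_a, beta_a)
out_b = adain(feat, gamma_b, beta_b)
```

With the identity code (`gamma=1`, `beta=0`) the output is simply the instance-normalized features; a different code produces differently styled features from the same content, which is what lets one generator serve multiple tasks.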

