Black-box Source-free Domain Adaptation via Two-stage Knowledge Distillation

05/13/2023
by   Shuai Wang, et al.

Source-free domain adaptation aims to adapt deep neural networks using only pre-trained source models and unlabeled target data. However, even providing the source model raises a privacy concern: its parameters can leak information about the source data, which in medical applications may compromise patient privacy. In this paper, we study a challenging but practical problem: black-box source-free domain adaptation, where only the outputs of the source model and the target data are available. We propose a simple but effective two-stage knowledge distillation method. In Stage 1, we train the target model from scratch in a knowledge distillation manner, using soft pseudo-labels generated by the source model. In Stage 2, we initialize another model as a new student to avoid the error accumulation caused by noisy pseudo-labels, and we feed weakly augmented images to the teacher (the Stage 1 model) to guide the student's learning. Our method is simple and flexible, and achieves surprisingly good results on three cross-domain segmentation tasks.
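To make the two stages concrete, below is a minimal PyTorch-style sketch, assuming the per-pixel soft pseudo-labels have already been queried once from the black-box source model's prediction API. All model, loader, and augmentation names here are illustrative placeholders, not the authors' released code.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

def kd_loss(student_logits, teacher_probs):
    # KL divergence between the student's prediction and soft teacher targets
    return F.kl_div(F.log_softmax(student_logits, dim=1),
                    teacher_probs, reduction="batchmean")

def stage1(target_model, loader, optimizer):
    """Stage 1: train the target model from scratch on the soft
    pseudo-labels returned by the black-box source model."""
    target_model.train()
    for images, source_probs in loader:
        loss = kd_loss(target_model(images), source_probs)
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()

def stage2(teacher, student, loader, optimizer, weak_aug):
    """Stage 2: a freshly initialized student distills from the Stage 1
    model (frozen as teacher), which sees only weakly augmented images,
    limiting the accumulation of pseudo-label noise."""
    teacher.eval()
    student.train()
    for images, _ in loader:
        with torch.no_grad():
            teacher_probs = F.softmax(teacher(weak_aug(images)), dim=1)
        loss = kd_loss(student(images), teacher_probs)
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()

if __name__ == "__main__":
    # Toy demo: 4-class segmentation on random 3x32x32 "images".
    net = lambda: nn.Conv2d(3, 4, kernel_size=3, padding=1)
    images = torch.randn(8, 3, 32, 32)
    source_probs = F.softmax(torch.randn(8, 4, 32, 32), dim=1)
    loader = [(images, source_probs)]

    target = net()
    stage1(target, loader, torch.optim.SGD(target.parameters(), lr=0.01))

    student = net()  # re-initialized, not a copy of the teacher
    weak_aug = lambda x: torch.flip(x, dims=[-1])  # e.g. a horizontal flip
    stage2(target, student, loader,
           torch.optim.SGD(student.parameters(), lr=0.01), weak_aug)
```

Note the design choice mirrored from the abstract: the Stage 2 student is re-initialized rather than copied from the teacher, so errors baked into the Stage 1 pseudo-labels are not inherited directly.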

