KD3A: Unsupervised Multi-Source Decentralized Domain Adaptation via Knowledge Distillation

11/19/2020
by Hao-Zhe Feng, et al.

Conventional unsupervised multi-source domain adaptation (UMDA) methods assume all source domains can be accessed directly. This assumption neglects the privacy-preserving policy, under which all data and computations must be kept decentralized. Three problems arise in this scenario: (1) Minimizing the domain distance requires pairwise computation on data from the source and target domains, which is not accessible. (2) The communication cost and privacy security limit the application of existing UMDA methods (e.g., domain adversarial training). (3) Since users have no authority to check the data quality, irrelevant or malicious source domains are more likely to appear, which causes negative transfer. In this study, we propose a privacy-preserving UMDA paradigm named Knowledge Distillation based Decentralized Domain Adaptation (KD3A), which performs domain adaptation through knowledge distillation on models from different source domains. KD3A solves the above problems with three components: (1) a multi-source knowledge distillation method named Knowledge Vote to learn high-quality domain consensus knowledge; (2) a dynamic weighting strategy named Consensus Focus to identify both malicious and irrelevant domains; (3) a decentralized optimization strategy for computing domain distance named BatchNorm MMD. Extensive experiments on DomainNet demonstrate that KD3A is robust to negative transfer and brings a 100x reduction in communication cost compared with other decentralized UMDA methods. Moreover, KD3A significantly outperforms state-of-the-art UMDA approaches.
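The Knowledge Vote component can be pictured as a confidence-gated ensemble over the source models' predictions on unlabeled target data: only teachers that are confident enough get to vote, and the consensus pseudo-label is weighted by how many teachers support it. The snippet below is a minimal sketch of that idea, assuming a fixed confidence threshold, a per-sample majority vote among confident teachers, and a fallback to a plain average when no teacher is confident; the function name knowledge_vote, the threshold value, and the weighting rule are illustrative assumptions, not the paper's exact formulation.

```python
import torch
import torch.nn.functional as F

def knowledge_vote(source_logits, conf_threshold=0.9):
    """Sketch of a Knowledge-Vote-style consensus step (assumed details).

    source_logits: tensor of shape (K, B, C) holding the logits that each of
    the K source models produces for a batch of B target samples.
    Returns a (B, C) soft pseudo-label and a (B,) per-sample confidence weight.
    """
    probs = F.softmax(source_logits, dim=-1)   # (K, B, C) teacher probabilities
    conf, pred = probs.max(dim=-1)             # (K, B) confidence and predicted class
    support = conf >= conf_threshold           # which teachers are confident enough to vote

    K, B, C = probs.shape
    consensus = torch.zeros(B, C)
    weight = torch.zeros(B)
    for b in range(B):
        voters = support[:, b].nonzero(as_tuple=True)[0]
        if voters.numel() == 0:
            # No confident teacher: fall back to the plain average with zero weight.
            consensus[b] = probs[:, b].mean(dim=0)
            continue
        # Majority class among the confident teachers for this sample.
        votes = pred[voters, b]
        majority = votes.mode().values
        supporters = voters[votes == majority]
        # Average only the supporting teachers; weight grows with their number.
        consensus[b] = probs[supporters, b].mean(dim=0)
        weight[b] = supporters.numel() / K
    return consensus, weight
```

In use, a training loop would feed each target batch through every source model, stack the resulting logits into the (K, B, C) tensor, and scale the distillation loss for each sample by the returned weight so that low-consensus samples contribute less.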


Related research:

03/07/2022 · Student Become Decathlon Master in Retinal Vessel Segmentation via Dual-teacher Multi-target Domain Adaptation
Unsupervised domain adaptation has been proposed recently to tackle the ...

07/21/2022 · Federated Semi-Supervised Domain Adaptation via Knowledge Transfer
Given the rapidly changing machine learning environments and expensive d...

10/22/2020 · Knowledge Distillation for BERT Unsupervised Domain Adaptation
A pre-trained language model, BERT, has brought significant performance ...

05/28/2021 · FReTAL: Generalizing Deepfake Detection using Knowledge Distillation and Representation Learning
As GAN-based video and image manipulation technologies become more sophi...

03/19/2023 · AdaptGuard: Defending Against Universal Attacks for Model Adaptation
Model adaptation aims at solving the domain transfer problem under the c...

03/26/2021 · Weakly-Supervised Domain Adaptation of Deep Regression Trackers via Reinforced Knowledge Distillation
Deep regression trackers are among the fastest tracking algorithms avail...

04/01/2016 · Adapting Models to Signal Degradation using Distillation
Model compression and knowledge distillation have been successfully appl...
