Distill and Fine-tune: Effective Adaptation from a Black-box Source Model

04/04/2021
by   Jian Liang, et al.

To alleviate the burden of labeling, unsupervised domain adaptation (UDA) aims to transfer knowledge from a related, previously labeled dataset (source) to a new unlabeled dataset (target). Despite impressive progress, prior methods typically require access to the raw source data and develop data-dependent alignment approaches that recognize the target samples in a transductive learning manner, which may raise privacy concerns for source individuals. Several recent studies resort to an alternative solution that exploits a well-trained white-box source model instead of the raw source data; however, the raw data may still be leaked through generative adversarial training. This paper studies a practical and interesting setting for UDA in which only a black-box source model (i.e., only network predictions are available) is provided during adaptation in the target domain. Moreover, different neural networks are allowed for different domains. For this new problem, we propose a novel two-step adaptation framework called Distill and Fine-tune (Dis-tune). Specifically, Dis-tune first structurally distills the knowledge from the source model into a customized target model, and then unsupervisedly fine-tunes the distilled model to fit the target domain. To verify its effectiveness, we consider two UDA scenarios (i.e., closed-set and partial-set), and find that Dis-tune achieves highly competitive performance against state-of-the-art approaches.
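For readers who want a concrete picture of the two steps, below is a minimal PyTorch-style sketch. It is an illustrative assumption, not the paper's exact recipe: the black-box source model is assumed to be reachable only through a hypothetical query_source(x) function returning class probabilities, the distillation objective is assumed to be a KL divergence to those probabilities, and the unsupervised fine-tuning objective is stood in for by entropy minimization on the unlabeled target data.

import torch
import torch.nn.functional as F

def distill_step(target_model, query_source, target_loader, optimizer):
    # Step 1 (sketch): distill black-box source predictions into a customized target model.
    target_model.train()
    for x in target_loader:                  # unlabeled target batches
        with torch.no_grad():
            p_src = query_source(x)          # only predictions are available (black-box API)
        log_p_tgt = F.log_softmax(target_model(x), dim=1)
        loss = F.kl_div(log_p_tgt, p_src, reduction="batchmean")
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()

def finetune_step(target_model, target_loader, optimizer):
    # Step 2 (sketch): unsupervised fine-tuning on the target domain; entropy minimization
    # is used here purely as a stand-in for the paper's actual fine-tuning objective.
    target_model.train()
    for x in target_loader:
        probs = F.softmax(target_model(x), dim=1)
        entropy = -(probs * probs.clamp_min(1e-8).log()).sum(dim=1).mean()
        optimizer.zero_grad()
        entropy.backward()
        optimizer.step()

In this sketch the target network can have a different architecture from the source network, since the source model is only ever queried for predictions.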


