Robust Target Training for Multi-Source Domain Adaptation

10/04/2022
by   Zhongying Deng, et al.

Given multiple labeled source domains and a single target domain, most existing multi-source domain adaptation (MSDA) models are trained jointly on data from all domains in one step. Such a one-step approach limits their ability to adapt to the target domain, because the training set is dominated by the far more numerous labeled source-domain data. This source-domain bias can potentially be alleviated by introducing a second training step, in which the model is fine-tuned on the unlabeled target-domain data alone, using pseudo-labels as supervision. However, the pseudo-labels are inevitably noisy and, when used unchecked, can harm model performance. To address this problem, we propose a novel Bi-level Optimization based Robust Target Training (BORT^2) method for MSDA. Given any existing fully-trained one-step MSDA model, BORT^2 turns it into a labeling function that generates pseudo-labels for the target data, and trains a target model on the pseudo-labeled target data only. Crucially, the target model is a stochastic CNN designed to be intrinsically robust to the label noise produced by the labeling function. This stochastic CNN models each target instance feature as a Gaussian distribution, with an entropy-maximization regularizer deployed to measure label uncertainty, which is in turn exploited to alleviate the negative impact of noisy pseudo-labels. Training the labeling function and the target model poses a nested bi-level optimization problem, for which we formulate an elegant solution based on implicit differentiation. Extensive experiments demonstrate that our proposed method achieves state-of-the-art performance on three MSDA benchmarks, including the large-scale DomainNet dataset. Our code will be available at <https://github.com/Zhongying-Deng/BORT2>
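To make the core idea concrete, the following is a minimal NumPy sketch (not the authors' implementation) of the two ingredients the abstract describes for the target model: sampling stochastic features via the reparameterization trick from a per-instance Gaussian, and using normalized prediction entropy as a label-uncertainty score that down-weights the cross-entropy loss on noisy pseudo-labels. All names (`feature_dim`, `n_classes`, the classifier weights `W`) are illustrative assumptions.

```python
# Hedged sketch of stochastic features + entropy-weighted pseudo-label loss.
# This is NOT the BORT^2 code; it only illustrates the mechanism described
# in the abstract under simplified, hypothetical choices.
import numpy as np

rng = np.random.default_rng(0)

def softmax(z):
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def stochastic_features(mu, log_sigma):
    """Reparameterized sample z = mu + sigma * eps, with eps ~ N(0, I).
    Each target instance feature is modeled as a Gaussian N(mu, sigma^2)."""
    eps = rng.standard_normal(mu.shape)
    return mu + np.exp(log_sigma) * eps

def entropy(p, eps=1e-12):
    """Per-sample prediction entropy, used here as a label-uncertainty proxy."""
    return -(p * np.log(p + eps)).sum(axis=-1)

def weighted_pseudo_label_loss(logits, pseudo_labels):
    """Cross-entropy on pseudo-labels, down-weighted for uncertain samples."""
    p = softmax(logits)
    n_classes = p.shape[-1]
    h = entropy(p) / np.log(n_classes)   # normalize entropy to [0, 1]
    w = 1.0 - h                          # high entropy -> low sample weight
    ce = -np.log(p[np.arange(len(pseudo_labels)), pseudo_labels] + 1e-12)
    return float((w * ce).mean())

# Toy usage: 4 target samples, 8-dim features, 3 classes.
mu = rng.standard_normal((4, 8))
log_sigma = np.full((4, 8), -1.0)        # hypothetical learned log-std
W = rng.standard_normal((8, 3))          # hypothetical classifier weights
z = stochastic_features(mu, log_sigma)
loss = weighted_pseudo_label_loss(z @ W, np.array([0, 2, 1, 0]))
```

In the actual method, the labeling function and this target model are trained jointly as a bi-level problem solved with implicit differentiation; the sketch covers only the inner, target-side robustness mechanism.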

