Blending-target Domain Adaptation by Adversarial Meta-Adaptation Networks

07/08/2019
by Ziliang Chen, et al.

(Unsupervised) Domain Adaptation (DA) seeks to classify target instances when provided only with labeled source examples and unlabeled target examples for training. Learning domain-invariant features helps to achieve this goal, but it typically presupposes that the unlabeled samples are drawn from a single explicit target domain or from multiple explicit target domains (Multi-target DA). In this paper, we consider a more realistic transfer scenario: our target domain comprises multiple sub-targets implicitly blended with one another, so that a learner cannot identify which sub-target each unlabeled sample belongs to. This Blending-target Domain Adaptation (BTDA) scenario commonly appears in practice and undermines the validity of most existing DA algorithms, due to the domain gaps and categorical misalignments among these hidden sub-targets. To recover transfer performance in this new scenario, we propose the Adversarial Meta-Adaptation Network (AMEAN). AMEAN entails two adversarial transfer learning processes. The first is a conventional adversarial transfer that bridges our source and mixed target domains. To circumvent the intra-target category misalignment, the second process operates as "learning to adapt": it deploys an unsupervised meta-learner that receives target data and their ongoing feature-learning feedback, and discovers target clusters to serve as our "meta-sub-target" domains. These meta-sub-targets automatically shape our meta-sub-target DA loss, which empirically eliminates the implicit category mismatching in the mixed target. We evaluate AMEAN and a variety of DA algorithms on three benchmarks under the BTDA setup. Empirical results show that BTDA is a quite challenging transfer setup for most existing DA algorithms, yet AMEAN significantly outperforms these state-of-the-art baselines and effectively restrains the negative transfer effects in BTDA.
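The "learning to adapt" step described above can be illustrated with a minimal sketch: cluster the unlabeled, blended target features into candidate meta-sub-targets, then form a per-cluster domain-confusion objective. This is not the paper's actual meta-learner or loss; plain k-means and a squared-error confusion term (pushing each cluster's predicted "source" probability toward 0.5) stand in as hypothetical placeholders, and all function names are assumptions for illustration.

```python
import numpy as np

def cluster_meta_sub_targets(feats, k, iters=20, seed=0):
    """Cluster unlabeled target features into k 'meta-sub-target' groups.
    Plain k-means here, as a stand-in for AMEAN's unsupervised meta-learner."""
    rng = np.random.default_rng(seed)
    centers = feats[rng.choice(len(feats), size=k, replace=False)].copy()
    for _ in range(iters):
        # Assign each target sample to its nearest cluster center.
        dists = np.linalg.norm(feats[:, None, :] - centers[None, :, :], axis=2)
        assign = dists.argmin(axis=1)
        # Update each center to the mean of its assigned samples.
        for j in range(k):
            members = feats[assign == j]
            if len(members):
                centers[j] = members.mean(axis=0)
    return assign, centers

def sub_target_confusion_losses(assign, domain_logits, k):
    """Per-meta-sub-target domain-confusion loss (illustrative): drive each
    cluster's predicted 'source' probability toward 0.5, i.e. maximal
    confusion between source and that sub-target."""
    probs = 1.0 / (1.0 + np.exp(-domain_logits))  # sigmoid of discriminator logits
    return np.array([np.mean((probs[assign == j] - 0.5) ** 2) for j in range(k)])
```

In a full adversarial pipeline, the feature extractor would be updated against the sum of these per-cluster losses while the clustering is periodically refreshed from the evolving features, mirroring the feedback loop the abstract describes.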

