
Blending-target Domain Adaptation by Adversarial Meta-Adaptation Networks

by Ziliang Chen, et al.

(Unsupervised) Domain Adaptation (DA) seeks to classify target instances when provided only with labeled source and unlabeled target examples for training. Learning domain-invariant features helps achieve this goal, but it presumes that the unlabeled samples are drawn from a single target domain or from multiple explicitly identified target domains (Multi-target DA). In this paper, we consider a more realistic transfer scenario: the target domain comprises multiple sub-targets implicitly blended with one another, so that learners cannot identify which sub-target each unlabeled sample belongs to. This Blending-target Domain Adaptation (BTDA) scenario commonly appears in practice and threatens the validity of most existing DA algorithms, owing to the domain gaps and categorical misalignments among the hidden sub-targets. To reap transfer-performance gains in this new scenario, we propose the Adversarial Meta-Adaptation Network (AMEAN). AMEAN entails two adversarial transfer-learning processes. The first is a conventional adversarial transfer that bridges the source and the mixed target domain. To circumvent intra-target category misalignment, the second process operates as "learning to adapt": it deploys an unsupervised meta-learner that receives the target data and ongoing feature-learning feedback to discover target clusters serving as "meta-sub-target" domains. These meta-sub-targets auto-design the meta-sub-target DA loss, which empirically eliminates the implicit category mismatch within the mixed target. We evaluate AMEAN and a variety of DA algorithms on three benchmarks under the BTDA setup. Empirical results show that BTDA is a challenging transfer setup for most existing DA algorithms, yet AMEAN significantly outperforms the state-of-the-art baselines and effectively restrains negative transfer effects in BTDA.
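The second adversarial process described above can be illustrated with a minimal sketch: an unsupervised clustering step stands in for the meta-learner that discovers "meta-sub-target" domains, and a per-cluster alignment term stands in for the auto-designed meta-sub-target DA loss. Everything here is an assumption for illustration — the function names, the plain k-means clusterer, and the mean-feature (MMD-like) distance used in place of the paper's adversarial terms are all hypothetical, not AMEAN's actual implementation.

```python
import numpy as np

def kmeans(X, k, iters=20, seed=0):
    # Plain k-means as a stand-in for the unsupervised meta-learner
    # that discovers target clusters ("meta-sub-target" domains).
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), size=k, replace=False)]
    for _ in range(iters):
        # Assign each target feature to its nearest cluster center.
        labels = np.argmin(((X[:, None] - centers[None]) ** 2).sum(-1), axis=1)
        for j in range(k):
            pts = X[labels == j]
            if len(pts):
                centers[j] = pts.mean(axis=0)
    return labels

def meta_sub_target_da_loss(src_feats, tgt_feats, k=2):
    # Hypothetical surrogate for the auto-designed meta-sub-target DA loss:
    # one domain-alignment term per discovered target cluster, so each hidden
    # sub-target is aligned to the source separately rather than as one blob.
    # A simple mean-feature distance replaces the adversarial terms.
    clusters = kmeans(tgt_feats, k)
    loss = 0.0
    for j in range(k):
        sub = tgt_feats[clusters == j]
        if len(sub):
            loss += np.linalg.norm(src_feats.mean(axis=0) - sub.mean(axis=0))
    return loss / k
```

Aligning each discovered cluster to the source separately is what lets this surrogate avoid the intra-target category misalignment that a single source-to-mixed-target alignment term would suffer from.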
