Contrastive Adaptation Network for Unsupervised Domain Adaptation

01/04/2019
by Guoliang Kang, et al.

Unsupervised Domain Adaptation (UDA) makes predictions for target-domain data while manual annotations are available only in the source domain. Previous methods minimize the domain discrepancy while neglecting class information, which may lead to misalignment and poor generalization performance. To address this issue, this paper proposes the Contrastive Adaptation Network (CAN), which optimizes a new metric that explicitly models the intra-class domain discrepancy and the inter-class domain discrepancy. We design an alternating update strategy to train CAN in an end-to-end manner. Experiments on two real-world benchmarks, Office-31 and VisDA-2017, demonstrate that CAN performs favorably against state-of-the-art methods and produces more discriminative features. We will release the code soon.
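The new metric can be read as a class-aware, kernel two-sample discrepancy: same-class source and target feature sets are pulled together (intra-class term) while different-class sets are pushed apart (inter-class term). Below is a minimal PyTorch sketch of such a contrastive, MMD-style discrepancy; it is not the authors' released implementation, and the Gaussian kernel with a fixed bandwidth, the function names, and the use of clustered target pseudo-labels are illustrative assumptions.

    import torch

    def gaussian_kernel(x, y, bandwidth=1.0):
        # k(a, b) = exp(-||a - b||^2 / (2 * bandwidth^2)) for every pair (a, b).
        d2 = torch.cdist(x, y) ** 2
        return torch.exp(-d2 / (2.0 * bandwidth ** 2))

    def mmd2(x, y, bandwidth=1.0):
        # Squared Maximum Mean Discrepancy between two feature sets.
        k_xx = gaussian_kernel(x, x, bandwidth).mean()
        k_yy = gaussian_kernel(y, y, bandwidth).mean()
        k_xy = gaussian_kernel(x, y, bandwidth).mean()
        return k_xx + k_yy - 2.0 * k_xy

    def contrastive_domain_discrepancy(feat_s, y_s, feat_t, y_t_pseudo, num_classes):
        # Intra-class discrepancy (same class across domains) minus
        # inter-class discrepancy (different classes across domains).
        # y_t_pseudo: hypothetical target pseudo-labels, e.g. from clustering.
        intra, inter, n_intra, n_inter = 0.0, 0.0, 0, 0
        for c_s in range(num_classes):
            s_c = feat_s[y_s == c_s]
            if len(s_c) < 2:
                continue  # skip classes missing from this source batch
            for c_t in range(num_classes):
                t_c = feat_t[y_t_pseudo == c_t]
                if len(t_c) < 2:
                    continue  # skip classes missing from this target batch
                d = mmd2(s_c, t_c)
                if c_s == c_t:
                    intra, n_intra = intra + d, n_intra + 1
                else:
                    inter, n_inter = inter + d, n_inter + 1
        intra = intra / max(n_intra, 1)
        inter = inter / max(n_inter, 1)
        # Minimizing this value aligns same-class features across domains
        # while separating different-class features.
        return intra - inter

In the alternating update mentioned in the abstract, one step would assign target pseudo-labels (for example, by clustering target features) and the other would update the network by minimizing this discrepancy together with the source cross-entropy loss; the clustering choice and loss weighting here are assumptions rather than details stated in the abstract.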

Related research

09/11/2019 - Contrastively Smoothed Class Alignment for Unsupervised Domain Adaptation
Recent unsupervised approaches to domain adaptation primarily focus on m...

11/26/2021 - Contrastive Vicinal Space for Unsupervised Domain Adaptation
Utilizing vicinal space between the source and target domains is one of ...

05/26/2019 - Learning Smooth Representation for Unsupervised Domain Adaptation
In unsupervised domain adaptation, existing methods utilizing the bounda...

05/24/2022 - MetaSID: Singer Identification with Domain Adaptation for Metaverse
Metaverse has stretched the real world into unlimited space. There will ...

05/03/2022 - Disentangled and Side-aware Unsupervised Domain Adaptation for Cross-dataset Subjective Tinnitus Diagnosis
EEG-based tinnitus classification is a valuable tool for tinnitus diagno...

08/14/2020 - On Localized Discrepancy for Domain Adaptation
We propose the discrepancy-based generalization theories for unsupervise...

07/01/2020 - Rethink Maximum Mean Discrepancy for Domain Adaptation
Existing domain adaptation methods aim to reduce the distributional diff...
