Category Contrast for Unsupervised Domain Adaptation in Visual Tasks

06/05/2021
by Jiaxing Huang, et al.

Instance contrast for unsupervised representation learning has achieved great success in recent years. In this work, we explore the idea of instance contrastive learning in unsupervised domain adaptation (UDA) and propose a novel Category Contrast technique (CaCo) that introduces semantic priors on top of instance discrimination for visual UDA tasks. By viewing instance contrastive learning as a dictionary look-up operation, we construct a semantics-aware dictionary with samples from both the source and target domains, where each target sample is assigned a (pseudo) category label based on the category priors of the source samples. This enables category contrastive learning between target queries and the category-level dictionary, yielding category-discriminative yet domain-invariant feature representations: samples of the same category (from either domain) are pulled closer while samples of different categories are pushed apart. Extensive UDA experiments on multiple visual tasks (e.g., segmentation, classification, and detection) show that a simple implementation of CaCo outperforms highly optimized state-of-the-art methods. The analyses and experiments also show that CaCo is complementary to existing UDA methods and generalizes to other learning setups such as semi-supervised learning and unsupervised model adaptation.
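
To make the dictionary look-up view concrete, below is a minimal PyTorch-style sketch of a category contrast loss. It assumes the category-level dictionary is built from per-class prototypes and that target pseudo-labels come from nearest-prototype assignment; the function and variable names are hypothetical illustrations, not the authors' released implementation.

    import torch
    import torch.nn.functional as F

    def category_contrast_loss(queries, category_dict, pseudo_labels, temperature=0.07):
        """Hypothetical sketch of a category-level InfoNCE-style loss.

        queries:       (N, D) target-domain features.
        category_dict: (C, D) one key per category, e.g. class prototypes aggregated
                       from source features and pseudo-labeled target features.
        pseudo_labels: (N,) category index assigned to each target query from the
                       source-derived category priors.
        """
        q = F.normalize(queries, dim=1)        # (N, D) unit-length queries
        k = F.normalize(category_dict, dim=1)  # (C, D) unit-length category keys
        logits = q @ k.t() / temperature       # (N, C) similarity to every category key
        # The query's (pseudo) category key is the single positive; every other
        # category key is a negative, so same-category samples are pulled together
        # and different categories are pushed apart.
        return F.cross_entropy(logits, pseudo_labels)

    # Usage sketch with random data: prototypes from labeled source features,
    # pseudo-labels for target queries by nearest prototype.
    D, C, N = 128, 10, 32
    source_feats = torch.randn(200, D)
    source_labels = torch.randint(0, C, (200,))
    prototypes = torch.stack([source_feats[source_labels == c].mean(0) for c in range(C)])
    target_feats = torch.randn(N, D)
    pseudo = (F.normalize(target_feats, dim=1) @ F.normalize(prototypes, dim=1).t()).argmax(1)
    loss = category_contrast_loss(target_feats, prototypes, pseudo)

In this reading, the cross-entropy over category keys plays the role that InfoNCE plays in instance contrast, but with one positive key per category rather than per instance.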


Model Adaptation: Historical Contrastive Learning for Unsupervised Domain Adaptation without Source Data (10/07/2021)
Unsupervised domain adaptation aims to align a labeled source domain and...

Unsupervised Domain Adaptive Fundus Image Segmentation with Category-level Regularization (07/08/2022)
Existing unsupervised domain adaptation methods based on adversarial lea...

Domain Confused Contrastive Learning for Unsupervised Domain Adaptation (07/10/2022)
In this work, we study Unsupervised Domain Adaptation (UDA) in a challen...

Contrast and Mix: Temporal Contrastive Video Domain Adaptation with Background Mixing (10/28/2021)
Unsupervised domain adaptation which aims to adapt models trained on a l...

Integrating Categorical Semantics into Unsupervised Domain Translation (10/03/2020)
While unsupervised domain translation (UDT) has seen a lot of success re...

Transferrable Contrastive Learning for Visual Domain Adaptation (12/14/2021)
Self-supervised learning (SSL) has recently become the favorite among fe...

Semantic-aware Representation Learning Via Probability Contrastive Loss (11/11/2021)
Recent feature contrastive learning (FCL) has shown promising performanc...