Transferrable Contrastive Learning for Visual Domain Adaptation

12/14/2021
by Yang Chen, et al.

Self-supervised learning (SSL) has recently become a favorite among feature learning methodologies, so it is appealing for domain adaptation approaches to incorporate SSL. The intuition is to enforce instance-level feature consistency so that the predictor becomes invariant across domains. However, most existing SSL methods in the domain adaptation regime are treated as standalone auxiliary components, leaving the signatures of domain adaptation unattended. In fact, the optimal region where the domain gap vanishes and the one that the instance-level SSL constraint pursues may not coincide at all. From this perspective, we present a paradigm of self-supervised learning tailored for domain adaptation, i.e., Transferrable Contrastive Learning (TCL), which links SSL and the desired cross-domain transferability congruently. We find contrastive learning intrinsically a suitable candidate for domain adaptation, as its instance invariance assumption can be conveniently promoted to the cross-domain class-level invariance favored by domain adaptation tasks. Based on particular memory bank constructions and pseudo-label strategies, TCL penalizes the cross-domain intra-class discrepancy between source and target through a clean and novel contrastive loss. The free lunch is that, thanks to the incorporation of contrastive learning, TCL relies on a moving-averaged key encoder that naturally yields a temporally ensembled version of pseudo labels for target data, which avoids pseudo-label error propagation at no extra cost. TCL therefore efficiently reduces cross-domain gaps. Through extensive experiments on benchmarks (Office-Home, VisDA-2017, Digits-five, PACS and DomainNet) for both single-source and multi-source domain adaptation tasks, TCL has demonstrated state-of-the-art performance.
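To make the two mechanisms in the abstract concrete, below is a minimal PyTorch-style sketch of (1) a moving-averaged (momentum) key encoder, as popularized by MoCo, and (2) a class-level cross-domain contrastive loss over a source memory bank, where positives are bank entries sharing a target query's pseudo label. All names and values here (encoder_q, encoder_k, bank, bank_labels, tau=0.07, m=0.999) are illustrative assumptions, not the authors' released implementation.

```python
# Sketch of TCL-style ingredients (assumed names, not the paper's code):
# (1) EMA update of a key encoder; (2) cross-domain class-level contrastive loss.
import torch
import torch.nn.functional as F

@torch.no_grad()
def momentum_update(encoder_q, encoder_k, m=0.999):
    """EMA update: the key encoder slowly trails the query encoder."""
    for p_q, p_k in zip(encoder_q.parameters(), encoder_k.parameters()):
        p_k.data.mul_(m).add_(p_q.data, alpha=1.0 - m)

def cross_domain_contrastive_loss(q, pseudo_labels, bank, bank_labels, tau=0.07):
    """
    q:             (B, D) L2-normalized target queries from the query encoder.
    pseudo_labels: (B,)   pseudo labels assigned to the target queries.
    bank:          (K, D) L2-normalized source features from the key encoder.
    bank_labels:   (K,)   ground-truth labels of the banked source features.
    Positives: bank entries whose source label matches the query's pseudo
    label; all other bank entries act as negatives.
    """
    logits = q @ bank.t() / tau                                         # (B, K)
    pos_mask = pseudo_labels[:, None].eq(bank_labels[None, :]).float()  # (B, K)
    log_prob = logits - torch.logsumexp(logits, dim=1, keepdim=True)
    # Average log-likelihood over all positives for each query.
    num_pos = pos_mask.sum(dim=1).clamp(min=1.0)
    loss = -(pos_mask * log_prob).sum(dim=1) / num_pos
    return loss.mean()

# Illustrative usage per training step:
#   loss = cross_domain_contrastive_loss(
#       F.normalize(encoder_q(x_target), dim=1), y_target_pseudo, bank, bank_labels)
#   loss.backward(); optimizer.step()
#   momentum_update(encoder_q, encoder_k)
```

Because the key encoder's weights change slowly under the EMA update, pseudo labels computed from its features are effectively a temporal ensemble over recent training iterations, which is the "free lunch" against pseudo-label error propagation that the abstract describes.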


research · 03/18/2020
Cross-domain Self-supervised Learning for Domain Adaptation with Few Source Labels
Existing unsupervised domain adaptation methods aim to transfer knowledg...

research · 11/12/2022
Divide and Contrast: Source-free Domain Adaptation via Adaptive Contrastive Learning
We investigate a practical domain adaptation task, called source-free do...

research · 06/16/2023
UTOPIA: Unconstrained Tracking Objects without Preliminary Examination via Cross-Domain Adaptation
Multiple Object Tracking (MOT) aims to find bounding boxes and identitie...

research · 02/28/2023
Self-Supervised Interest Transfer Network via Prototypical Contrastive Learning for Recommendation
Cross-domain recommendation has attracted increasing attention from indu...

research · 05/10/2023
Inclusive FinTech Lending via Contrastive Learning and Domain Adaptation
FinTech lending (e.g., micro-lending) has played a significant role in f...

research · 04/21/2022
Contrastive Test-Time Adaptation
Test-time adaptation is a special setting of unsupervised domain adaptat...

research · 07/05/2021
Distance-based Hyperspherical Classification for Multi-source Open-Set Domain Adaptation
Vision systems trained in closed-world scenarios will inevitably fail wh...
