Contrastive Learning and Self-Training for Unsupervised Domain Adaptation in Semantic Segmentation

05/05/2021
by   Robert A. Marsden, et al.

Deep convolutional neural networks have considerably improved state-of-the-art results for semantic segmentation. Nevertheless, even modern architectures struggle to generalize to a test dataset that originates from a different domain. To avoid the costly annotation of training data for unseen domains, unsupervised domain adaptation (UDA) attempts to transfer knowledge efficiently from a labeled source domain to an unlabeled target domain. Previous work has mainly focused on minimizing the discrepancy between the two domains through adversarial training or self-training. While adversarial training may fail to align the correct semantic categories, since it only minimizes the discrepancy between the global distributions, self-training raises the question of how to provide reliable pseudo-labels. To align the correct semantic categories across domains, we propose a contrastive learning approach that adapts category-wise centroids across domains. Furthermore, we extend our method with self-training, using a memory-efficient temporal ensemble to generate consistent and reliable pseudo-labels. While contrastive learning and self-training through temporal ensembling each enable knowledge transfer between two domains on their own, it is their combination (CLST) that leads to a symbiotic structure. We validate our approach on two domain adaptation benchmarks: GTA5 → Cityscapes and SYNTHIA → Cityscapes. Our method achieves results that are better than or comparable to the state of the art. We will make the code publicly available.
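The two ingredients named in the abstract, category-wise centroid alignment and a memory-efficient temporal ensemble for pseudo-labels, can be illustrated with a minimal NumPy sketch. This is not the paper's implementation: the function names are hypothetical, the squared-distance loss is a simplified stand-in for the actual contrastive objective, and the EMA update and confidence threshold are generic choices.

```python
import numpy as np

def class_centroids(features, labels, num_classes):
    """Mean feature vector per semantic class (absent classes stay zero)."""
    centroids = np.zeros((num_classes, features.shape[1]))
    for c in range(num_classes):
        mask = labels == c
        if mask.any():
            centroids[c] = features[mask].mean(axis=0)
    return centroids

def centroid_alignment_loss(src_centroids, tgt_centroids):
    """Pull matching class centroids together across domains.
    Squared L2 distance here; a simplified stand-in for the paper's
    contrastive objective, which also pushes non-matching classes apart."""
    return float(((src_centroids - tgt_centroids) ** 2).sum(axis=1).mean())

def update_ensemble(ensemble, probs, alpha=0.9):
    """Memory-efficient temporal ensemble: exponential moving average of
    softmax predictions, so only one running tensor is stored per image."""
    return alpha * ensemble + (1.0 - alpha) * probs

def pseudo_labels(ensemble, threshold=0.8):
    """Argmax of the ensembled predictions; -1 marks pixels whose
    confidence is below the threshold and should be ignored in training."""
    conf = ensemble.max(axis=-1)
    labels = ensemble.argmax(axis=-1)
    labels[conf < threshold] = -1
    return labels
```

In a full pipeline, source centroids would come from ground-truth labels and target centroids from the pseudo-labels, which is where the symbiosis arises: better alignment yields more reliable pseudo-labels, and vice versa.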


Related research

- CLUDA: Contrastive Learning in Unsupervised Domain Adaptation for Semantic Segmentation (08/27/2022). "In this work, we propose CLUDA, a simple, yet novel method for performin..."
- Unsupervised Adaptive Semantic Segmentation with Local Lipschitz Constraint (05/27/2021). "Recent advances in unsupervised domain adaptation have seen considerable..."
- Prototypical Pseudo Label Denoising and Target Structure Learning for Domain Adaptive Semantic Segmentation (01/26/2021). "Self-training is a competitive approach in domain adaptive segmentation,..."
- Unsupervised Domain Adaptation for Video Semantic Segmentation (07/23/2021). "Unsupervised Domain Adaptation for semantic segmentation has gained imme..."
- Contrastive Vicinal Space for Unsupervised Domain Adaptation (11/26/2021). "Utilizing vicinal space between the source and target domains is one of ..."
- Regularizing Proxies with Multi-Adversarial Training for Unsupervised Domain-Adaptive Semantic Segmentation (07/29/2019). "Training a semantic segmentation model requires a large amount of pixel-..."
- Regularizing Self-training for Unsupervised Domain Adaptation via Structural Constraints (04/29/2023). "Self-training based on pseudo-labels has emerged as a dominant approach ..."
