Continual Unsupervised Domain Adaptation for Semantic Segmentation using a Class-Specific Transfer

08/12/2022
by   Robert A. Marsden, et al.

In recent years, there has been tremendous progress in the field of semantic segmentation. However, one remaining challenge is that segmentation models do not generalize to unseen domains. To overcome this problem, one either has to label large amounts of data covering the full variety of domains, which is often infeasible in practice, or apply unsupervised domain adaptation (UDA), which requires labels only for the source data. In this work, we focus on UDA and additionally address the case of adapting not just to a single domain, but to a sequence of target domains. This requires mechanisms that prevent the model from forgetting its previously learned knowledge. To adapt a segmentation model to a target domain, we follow the idea of using light-weight style transfer to convert the style of labeled source images into the style of the target domain while retaining the source content. To mitigate the distributional shift between the source and the target domain, the model is fine-tuned on the transferred source images in a second step. Existing light-weight style transfer approaches relying on adaptive instance normalization (AdaIN) or the Fourier transform still fall short in performance and do not substantially improve upon common data augmentations such as color jittering. The reason is that these methods do not focus on region- or class-specific differences, but mainly capture the most salient style. Therefore, we propose a simple and light-weight framework that incorporates two class-conditional AdaIN layers. To extract the class-specific target moments needed for the transfer layers, we use unfiltered pseudo-labels, which we show to be an effective approximation of the real labels. We extensively validate our approach (CACE) on a synthetic sequence and further propose a challenging sequence consisting of real domains. CACE outperforms existing methods both visually and quantitatively.
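To illustrate the core mechanism, here is a minimal NumPy sketch of class-conditional AdaIN: per-class channel statistics are estimated from target features under (pseudo-)labels, and source features are then whitened with their own class-wise moments and re-colored with the target's. The function names, the dict-based moment store, and the single-image setting are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def estimate_class_moments(feat, label, num_classes):
    """Per-class channel-wise (mean, std) of a feature map.

    feat:  (C, H, W) feature map.
    label: (H, W) integer class map (e.g. unfiltered pseudo-labels).
    Returns a dict: class_id -> (mean (C,), std (C,)).
    """
    moments = {}
    for cls in range(num_classes):
        mask = label == cls
        if not mask.any():
            continue                      # class absent in this image
        region = feat[:, mask]            # (C, N) pixels of this class
        moments[cls] = (region.mean(axis=1), region.std(axis=1))
    return moments

def class_conditional_adain(src_feat, src_seg, tgt_moments, eps=1e-5):
    """Re-normalize each class region of the source features with
    the corresponding target moments (class-conditional AdaIN)."""
    out = src_feat.copy()
    for cls, (t_mean, t_std) in tgt_moments.items():
        mask = src_seg == cls
        if not mask.any():
            continue
        region = src_feat[:, mask]        # (C, N)
        s_mean = region.mean(axis=1, keepdims=True)
        s_std = region.std(axis=1, keepdims=True) + eps
        # whiten with source stats, re-color with target stats
        out[:, mask] = (region - s_mean) / s_std * t_std[:, None] + t_mean[:, None]
    return out
```

After the transfer, each class region of the source feature map carries the target's class-specific first and second moments, rather than one global style for the whole image, which is the distinction the abstract draws against plain AdaIN.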

Related research

- 12/09/2021 — Style Mixing and Patchwise Prototypical Matching for One-Shot Unsupervised Domain Adaptive Semantic Segmentation
  In this paper, we tackle the problem of one-shot unsupervised domain ada...

- 04/25/2022 — ProCST: Boosting Semantic Segmentation using Progressive Cyclic Style-Transfer
  Using synthetic data for training neural networks that achieve good perf...

- 03/29/2021 — Get away from Style: Category-Guided Domain Adaptation for Semantic Segmentation
  Unsupervised domain adaptation (UDA) becomes more and more popular in ta...

- 05/31/2021 — Closer Look at the Uncertainty Estimation in Semantic Segmentation under Distributional Shift
  While recent computer vision algorithms achieve impressive performance o...

- 07/22/2022 — Prototype-Guided Continual Adaptation for Class-Incremental Unsupervised Domain Adaptation
  This paper studies a new, practical but challenging problem, called Clas...

- 01/18/2022 — Continual Coarse-to-Fine Domain Adaptation in Semantic Segmentation
  Deep neural networks are typically trained in a single shot for a specif...
