Semi-Supervised Domain Adaptation with Prototypical Alignment and Consistency Learning

by Kai Li et al.

Domain adaptation enhances the generalizability of a model across domains that exhibit domain shifts. Most research effort has been spent on Unsupervised Domain Adaptation (UDA), which trains a model jointly on labeled source data and unlabeled target data. This paper studies how much labeling a few target samples (e.g., one sample per class) can further help address domain shifts. This is the so-called semi-supervised domain adaptation (SSDA) problem, and the few labeled target samples are termed "landmarks". To explore the full potential of the landmarks, we incorporate a prototypical alignment (PA) module which calculates a target prototype for each class from the landmarks; source samples are then aligned with the target prototype of the same class. To further alleviate label scarcity, we propose a data-augmentation-based solution. Specifically, we severely perturb the labeled images, making PA non-trivial to achieve and thus promoting model generalizability. Moreover, we apply consistency learning on unlabeled target images by perturbing each image with both light and strong transformations. The strongly perturbed image can then enjoy "supervised-like" training using the pseudo label inferred from the lightly perturbed one. Experiments show that the proposed method, though simple, achieves significant performance gains over state-of-the-art methods, and enjoys the flexibility of serving as a plug-and-play component for various existing UDA methods, improving their adaptation performance when landmarks are provided. Our code is available at <>.
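The two losses described above can be sketched compactly. The snippet below is a minimal illustration, not the paper's implementation: it assumes features and softmax probabilities are already computed, uses a squared-Euclidean alignment loss, and adds a FixMatch-style confidence threshold (the `threshold` parameter and all function names are our own assumptions) for the weak-to-strong consistency term.

```python
import numpy as np

def class_prototypes(feats, labels, num_classes):
    # Mean feature (prototype) per class, computed from the few
    # labeled target "landmarks".
    return np.stack([feats[labels == c].mean(axis=0)
                     for c in range(num_classes)])

def prototypical_alignment_loss(src_feats, src_labels, prototypes):
    # Pull each source feature toward the target prototype of its
    # own class (mean squared Euclidean distance).
    diffs = src_feats - prototypes[src_labels]
    return float((diffs ** 2).sum(axis=1).mean())

def consistency_loss(p_weak, p_strong, threshold=0.95):
    # Pseudo-label from the lightly perturbed view supervises the
    # strongly perturbed view; low-confidence predictions are masked out.
    pseudo = p_weak.argmax(axis=1)
    conf = p_weak.max(axis=1)
    mask = conf >= threshold
    if not mask.any():
        return 0.0
    ce = -np.log(p_strong[mask, pseudo[mask]] + 1e-12)
    return float(ce.mean())
```

In a full training loop these terms would be weighted and added to the usual supervised cross-entropy on source data and landmarks; here the confidence mask simply drops unreliable pseudo labels rather than reweighting them.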


Related papers:

- Prototype-Guided Continual Adaptation for Class-Incremental Unsupervised Domain Adaptation
- Multi-Source Domain Adaptation and Semi-Supervised Domain Adaptation with Focus on Visual Domain Adaptation Challenge 2019
- ACT: Semi-supervised Domain-adaptive Medical Image Segmentation with Asymmetric Co-training
- Semi-supervised Models are Strong Unsupervised Domain Adaptation Learners
- Learning Invariant Representation with Consistency and Diversity for Semi-supervised Source Hypothesis Transfer
- Semi-Supervised Domain Adaptation via Selective Pseudo Labeling and Progressive Self-Training
- Randomized Histogram Matching: A Simple Augmentation for Unsupervised Domain Adaptation in Overhead Imagery