Cycle Self-Training for Domain Adaptation

03/05/2021
by Hong Liu et al.

Mainstream approaches to unsupervised domain adaptation (UDA) learn domain-invariant representations to bridge the domain gap. More recently, self-training has been gaining momentum in UDA. Originating in semi-supervised learning, self-training uses unlabeled data efficiently by training on pseudo-labels. However, as corroborated in this work, under the distributional shift of UDA the pseudo-labels can be unreliable, exhibiting large discrepancies from the ground-truth labels. We therefore propose Cycle Self-Training (CST), a principled self-training algorithm that enforces pseudo-labels to generalize across domains. In the forward step, CST generates target pseudo-labels with a source-trained classifier. In the reverse step, CST trains a target classifier on the target pseudo-labels and then updates the shared representations so that the target classifier performs well on the source data. We introduce the Tsallis entropy, a novel regularizer that improves the quality of the target pseudo-labels. For quadratic neural networks, we prove that CST recovers the target ground truth while both invariant feature learning and vanilla self-training fail. Empirical results show that CST significantly improves over the prior state of the art on standard UDA benchmarks spanning visual recognition and sentiment analysis.
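To make the Tsallis regularizer mentioned in the abstract concrete, the sketch below computes the standard Tsallis entropy of a predicted class distribution in plain NumPy. This follows the textbook definition S_q(p) = (1 - Σ p_i^q) / (q - 1), which reduces to the Shannon entropy as q → 1; the function name and the toy distributions are illustrative, and the paper's exact parameterization of the regularization term may differ.

```python
import numpy as np

def tsallis_entropy(p, q):
    """Tsallis entropy S_q(p) = (1 - sum_i p_i^q) / (q - 1).

    p : 1-D array of class probabilities (sums to 1).
    q : entropic index; as q -> 1 this reduces to the
        Shannon entropy -sum_i p_i * log(p_i).
    """
    if abs(q - 1.0) < 1e-8:
        # Limit q -> 1: Shannon entropy (small epsilon guards log(0)).
        return -np.sum(p * np.log(p + 1e-12))
    return (1.0 - np.sum(p ** q)) / (q - 1.0)

# A confident (peaked) prediction has lower Tsallis entropy than a
# uniform one, so minimizing it over target predictions encourages
# sharper, more informative pseudo-labels.
peaked = np.array([0.9, 0.05, 0.05])
uniform = np.array([1/3, 1/3, 1/3])
assert tsallis_entropy(peaked, 2.0) < tsallis_entropy(uniform, 2.0)
```

In the context of CST, a term of this form over the classifier's target-domain predictions can serve as the pseudo-label quality regularizer the abstract describes, with q controlling how aggressively uncertain predictions are penalized.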

Related research

- 07/17/2019 · Multi-Purposing Domain Adaptation Discriminators for Pseudo Labeling Confidence. Often domain adaptation is performed using a discriminator (domain class...
- 07/31/2023 · Domain Adaptation for Medical Image Segmentation using Transformation-Invariant Self-Training. Models capable of leveraging unlabelled data are crucial in overcoming l...
- 03/08/2020 · Pseudo Labeling and Negative Feedback Learning for Large-scale Multi-label Domain Classification. In large-scale domain classification, an utterance can be handled by mul...
- 10/07/2020 · Theoretical Analysis of Self-Training with Deep Networks on Unlabeled Data. Self-training algorithms, which train a model to fit pseudolabels predic...
- 07/08/2020 · Combating Domain Shift with Self-Taught Labeling. We present a novel method to combat domain shift when adapting classific...
- 11/23/2021 · A self-training framework for glaucoma grading in OCT B-scans. In this paper, we present a self-training-based framework for glaucoma g...
- 12/01/2020 · Sim2Real for Self-Supervised Monocular Depth and Segmentation. Image-based learning methods for autonomous vehicle perception tasks req...
