
Mind the Gap: Enlarging the Domain Gap in Open Set Domain Adaptation

03/08/2020
by Dongliang Chang, et al.

Unsupervised domain adaptation aims to leverage labeled data from a source domain to learn a classifier for an unlabeled target domain. Among its many variants, open set domain adaptation (OSDA) is perhaps the most challenging, as it further assumes the presence of unknown classes in the target domain. In this paper, we study OSDA with a particular focus on enriching its ability to traverse across larger domain gaps. Firstly, we show that existing state-of-the-art methods suffer a considerable performance drop in the presence of larger domain gaps, especially on a new dataset (PACS) that we re-purposed for OSDA. We then propose a novel framework to specifically address the larger domain gaps. The key insight lies in how we exploit the mutually beneficial information between two networks: (a) to separate samples of known and unknown classes, and (b) to maximize the domain confusion between the source and target domains without the influence of unknown samples. It follows that (a) and (b) mutually supervise each other and alternate until convergence. Extensive experiments are conducted on the Office-31, Office-Home, and PACS datasets, demonstrating the superiority of our method in comparison to other state-of-the-art approaches. Code has been provided as part of the supplementary material and will be publicly released upon acceptance.
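The alternating scheme sketched in the abstract can be illustrated with a toy loop. The names below (`separate_known`, `domain_confusion_step`, the confidence threshold) are hypothetical stand-ins, not the authors' implementation: step (a) splits target samples into known/unknown by a "knownness" score, and step (b) measures a source-target gap that an adversarial alignment update would shrink, using only the samples step (a) kept.

```python
import numpy as np

rng = np.random.default_rng(0)

def separate_known(scores, threshold=0.5):
    """Step (a): split target samples into known / unknown by score.
    (Toy stand-in for the first network's known/unknown separation.)"""
    return scores >= threshold

def domain_confusion_step(source_feats, target_feats_known):
    """Step (b): stand-in for an adversarial domain-confusion update --
    here we only report the mean-feature gap such an update would reduce."""
    return abs(source_feats.mean() - target_feats_known.mean())

# Toy 1-D "features" and per-sample knownness scores for the target domain.
scores = rng.uniform(size=20)           # knownness scores from network (a)
source = rng.normal(0.0, 1.0, size=50)  # source-domain features
target = rng.normal(0.5, 1.0, size=20)  # target-domain features

# Alternate (a) and (b); in the paper the two steps supervise each
# other and this alternation continues until convergence.
for round_idx in range(3):
    known_mask = separate_known(scores)                      # step (a)
    gap = domain_confusion_step(source, target[known_mask])  # step (b)
    print(f"round {round_idx}: kept {known_mask.sum()} known samples, "
          f"gap {gap:.3f}")
```

The point of the sketch is the structure, not the numbers: unknown-class target samples are excluded before domain confusion is applied, so alignment is not contaminated by classes the source domain never saw.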

07/19/2019

Open Set Domain Adaptation: Theoretical Bound and Algorithm

Unsupervised domain adaptation for classification tasks has achieved gre...
07/24/2020

On the Effectiveness of Image Rotation for Open Set Domain Adaptation

Open Set Domain Adaptation (OSDA) bridges the domain gap between a label...
12/16/2021

UMAD: Universal Model Adaptation under Domain and Category Shift

Learning to reject unknown samples (not present in the source classes) i...
04/29/2022

Controlled Generation of Unseen Faults for Partial and OpenSet Partial Domain Adaptation

New operating conditions can result in a performance drop of fault diagn...
11/08/2022

Unsupervised Domain Adaptation for Sparse Retrieval by Filling Vocabulary and Word Frequency Gaps

IR models using a pretrained language model significantly outperform lex...
04/16/2022

Safe Self-Refinement for Transformer-based Domain Adaptation

Unsupervised Domain Adaptation (UDA) aims to leverage a label-rich sourc...
06/11/2020

Exploring Category-Agnostic Clusters for Open-Set Domain Adaptation

Unsupervised domain adaptation has received significant attention in rec...