Simplified Neural Unsupervised Domain Adaptation

05/22/2019
by Timothy A. Miller, et al.

Unsupervised domain adaptation (UDA) is the task of modifying a statistical model trained on labeled data from a source domain so that it performs better on data from a target domain, given access to only unlabeled data in the target domain. Existing state-of-the-art UDA approaches use neural networks to learn representations that can predict the values of a subset of important features called "pivot features." In this work, we show that it is possible to improve on these methods by jointly training the representation learner with the task learner, and we examine the importance of existing pivot selection methods.
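The joint training idea described above can be sketched as a shared encoder with two heads: a task head trained only on labeled source examples, and a pivot-prediction head trained on unlabeled data from both domains, optimized under a single combined loss. The sketch below is a minimal NumPy illustration under assumed toy dimensions and a simple ReLU encoder; `joint_loss`, the architecture sizes, and the weighting `lam` are illustrative assumptions, not the paper's exact setup.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical tiny setup (illustrative dimensions, not the paper's architecture).
d_in, d_hid, n_pivots = 20, 8, 5
W_enc = rng.normal(scale=0.1, size=(d_in, d_hid))      # shared representation learner
w_task = rng.normal(scale=0.1, size=d_hid)             # task head: sees source labels only
W_piv = rng.normal(scale=0.1, size=(d_hid, n_pivots))  # pivot head: sees unlabeled source+target

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def joint_loss(X_src, y_src, X_all, pivots_all, lam=1.0):
    """Combined objective: supervised task loss on labeled source data
    plus pivot-feature prediction loss on unlabeled source+target data.
    Both terms flow through the same encoder, so minimizing the sum
    trains the representation and the task jointly."""
    h_src = np.maximum(X_src @ W_enc, 0.0)             # shared ReLU encoder
    task_p = sigmoid(h_src @ w_task)                   # binary task prediction
    task_loss = -np.mean(y_src * np.log(task_p + 1e-9)
                         + (1 - y_src) * np.log(1 - task_p + 1e-9))

    h_all = np.maximum(X_all @ W_enc, 0.0)             # same encoder on unlabeled data
    piv_p = sigmoid(h_all @ W_piv)                     # predict presence of each pivot feature
    pivot_loss = -np.mean(pivots_all * np.log(piv_p + 1e-9)
                          + (1 - pivots_all) * np.log(1 - piv_p + 1e-9))
    return task_loss + lam * pivot_loss

# Toy data: 16 labeled source examples, 32 unlabeled examples from both domains.
X_src = rng.normal(size=(16, d_in))
y_src = rng.integers(0, 2, 16)
X_all = rng.normal(size=(32, d_in))
pivots = rng.integers(0, 2, (32, n_pivots))            # binary pivot-feature indicators
print(joint_loss(X_src, y_src, X_all, pivots))
```

In a real implementation the two losses would be minimized together by backpropagation through the shared encoder, which is what distinguishes joint training from the earlier pipeline approaches that first learn the pivot-predicting representation and only then fit the task model.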


Related research

05/19/2019  Butterfly: Robust One-step Approach towards Wildly-unsupervised Domain Adaptation
            Unsupervised domain adaptation (UDA) trains with clean labeled data in s...

10/23/2021  Domain Adaptation via Maximizing Surrogate Mutual Information
            Unsupervised domain adaptation (UDA), which is an important topic in tra...

11/20/2017  Parameter Reference Loss for Unsupervised Domain Adaptation
            The success of deep learning in computer vision is mainly attributed to ...

05/02/2023  Addressing Parameter Choice Issues in Unsupervised Domain Adaptation by Aggregation
            We study the problem of choosing algorithm hyper-parameters in unsupervi...

09/11/2023  Feature-based Transferable Disruption Prediction for future tokamaks using domain adaptation
            The high acquisition cost and the significant demand for disruptive disc...

02/19/2020  Enlarging Discriminative Power by Adding an Extra Class in Unsupervised Domain Adaptation
            In this paper, we study the problem of unsupervised domain adaptation th...

09/26/2014  Unsupervised Domain Adaptation by Backpropagation
            Top-performing deep architectures are trained on massive amounts of labe...
