Sequential Unsupervised Domain Adaptation through Prototypical Distributions

07/01/2020
by Mohammad Rostami, et al.

We develop an algorithm for unsupervised domain adaptation (UDA) of a classifier from a labeled source domain to an unlabeled target domain in a sequential learning setting. UDA has been studied extensively in recent years, but the vast majority of existing methods assume a joint learning setting in which the model is trained on the source domain and target domain data simultaneously. We consider a more practical setting in which the model has already been trained on the labeled source domain data and must then be adapted to the unlabeled target domain without access to the source domain training data. We tackle this problem by aligning the source and target distributions in a discriminative embedding space. To overcome the challenges of learning in a sequential setting, we learn an intermediate prototypical distribution from the labeled source data and then use this distribution for knowledge transfer to the target domain. We provide theoretical justification for the proposed algorithm by showing that it optimizes an upper bound on the expected risk in the target domain. We also conduct extensive experiments on several standard benchmarks and demonstrate that the proposed method is competitive with existing joint learning UDA algorithms.
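To make the sequential-adaptation idea concrete, the following is a minimal PyTorch sketch, not the paper's exact formulation. It assumes the prototypical distribution is modeled as one Gaussian per class fitted in the embedding space of the source-trained network, and that alignment between the target embeddings and samples drawn from that distribution uses a sliced Wasserstein-style distance; these modeling choices, and all names such as Net, fit_prototypes, sample_prototypical, and adapt, are illustrative assumptions.

```python
import torch
import torch.nn as nn


class Net(nn.Module):
    """Shared encoder plus linear classifier; the encoder's output is the
    embedding space where the prototypical distribution is estimated."""
    def __init__(self, in_dim=20, emb_dim=8, n_classes=3):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Linear(in_dim, 64), nn.ReLU(), nn.Linear(64, emb_dim))
        self.classifier = nn.Linear(emb_dim, n_classes)

    def forward(self, x):
        z = self.encoder(x)
        return z, self.classifier(z)


def fit_prototypes(model, x_src, y_src, n_classes):
    """Estimate one Gaussian (mean, diagonal variance) per class in the
    embedding space of the source-trained model. Done once, while the
    labeled source data is still available."""
    with torch.no_grad():
        z, _ = model(x_src)
    return [(z[y_src == c].mean(0), z[y_src == c].var(0) + 1e-4)
            for c in range(n_classes)]


def sample_prototypical(protos, n_total):
    """Draw labeled pseudo-source embeddings from the prototypical
    distribution; these stand in for the inaccessible source data."""
    ys = torch.randint(len(protos), (n_total,))
    mus = torch.stack([protos[c][0] for c in ys.tolist()])
    sds = torch.stack([protos[c][1].sqrt() for c in ys.tolist()])
    return mus + sds * torch.randn_like(mus), ys


def sliced_wasserstein(z1, z2, n_proj=50):
    """Monte-Carlo sliced Wasserstein distance between two equally sized
    batches of embeddings (sort the 1-D projections and compare)."""
    theta = torch.randn(z1.size(1), n_proj)
    theta = theta / theta.norm(dim=0, keepdim=True)
    p1, _ = torch.sort(z1 @ theta, dim=0)
    p2, _ = torch.sort(z2 @ theta, dim=0)
    return ((p1 - p2) ** 2).mean()


def adapt(model, x_tgt, protos, epochs=200, lr=1e-3):
    """Source-free adaptation: pull target embeddings toward the
    prototypical distribution while keeping the classifier consistent
    on samples drawn from it."""
    opt = torch.optim.Adam(model.parameters(), lr=lr)
    ce = nn.CrossEntropyLoss()
    for _ in range(epochs):
        z_t, _ = model(x_tgt)                      # unlabeled target embeddings
        z_p, y_p = sample_prototypical(protos, x_tgt.size(0))
        align = sliced_wasserstein(z_t, z_p)       # distribution alignment term
        keep = ce(model.classifier(z_p), y_p)      # classifier consistency term
        loss = align + keep
        opt.zero_grad()
        loss.backward()
        opt.step()
    return model
```

In this sketch, one would first train Net on the labeled source data, call fit_prototypes once while that data is still accessible, discard the source data, and later call adapt with only unlabeled target batches, which mirrors the sequential setting described above.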

