Connecting sufficient conditions for domain adaptation: source-guided uncertainty, relaxed divergences and discrepancy localization

03/09/2022
by Sofien Dhouib, et al.

Recent advances in domain adaptation establish that requiring a low risk on the source domain together with equal feature marginals can degrade adaptation performance. At the same time, empirical evidence shows that incorporating an unsupervised target-domain term that pushes decision boundaries away from high-density regions, along with a relaxed alignment, improves adaptation. In this paper, we theoretically justify these observations via a new bound on the target risk, and we connect two notions of relaxation for divergences, namely β-relaxed divergences and localization. This connection allows us to incorporate the source domain's categorical structure into the relaxation of the considered divergence, provably resulting in better handling of the label shift case in particular.
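The ingredients named in the abstract can be illustrated with a minimal numerical sketch. The code below is a hypothetical instantiation, not the paper's method: the unsupervised target term is realized as conditional entropy minimization (one standard way to push decision boundaries away from high-density regions), and the relaxed alignment is a toy mean-feature gap with a slack parameter `beta`, standing in loosely for the idea of a β-relaxed divergence. All function names and the weighting scheme are illustrative assumptions.

```python
import numpy as np

def softmax(z):
    # numerically stable softmax over the class axis
    e = np.exp(z - z.max(axis=1, keepdims=True))
    return e / e.sum(axis=1, keepdims=True)

def source_cross_entropy(logits, labels):
    # supervised source-risk term: standard cross-entropy on labeled source data
    p = softmax(logits)
    return -np.mean(np.log(p[np.arange(len(labels)), labels] + 1e-12))

def target_entropy(logits):
    # unsupervised target term: low prediction entropy keeps decision
    # boundaries away from high-density regions of the target domain
    p = softmax(logits)
    return -np.mean(np.sum(p * np.log(p + 1e-12), axis=1))

def relaxed_alignment_gap(src_feats, tgt_feats, beta=0.1):
    # toy relaxed alignment: penalize the mean-feature mismatch only
    # beyond a slack beta, instead of forcing exactly equal marginals
    gap = np.linalg.norm(src_feats.mean(axis=0) - tgt_feats.mean(axis=0))
    return max(gap - beta, 0.0)

def adaptation_objective(src_logits, src_labels, tgt_logits,
                         src_feats, tgt_feats,
                         lam_ent=0.1, lam_align=1.0, beta=0.1):
    # weighted sum of the three terms discussed in the abstract
    return (source_cross_entropy(src_logits, src_labels)
            + lam_ent * target_entropy(tgt_logits)
            + lam_align * relaxed_alignment_gap(src_feats, tgt_feats, beta))
```

With `beta = 0`, the alignment term reduces to strict marginal matching, which is exactly the requirement the paper argues can degrade performance; a positive slack tolerates a bounded residual discrepancy.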
