Importance Weight Estimation and Generalization in Domain Adaptation under Label Shift

11/29/2020
by Kamyar Azizzadenesheli, et al.

We study generalization under label shift in domain adaptation, where the learner has access to labeled samples from the source domain but only unlabeled samples from the target domain. Prior works deploy label classifiers and introduce various methods to estimate the importance weights from the source to the target domain. They use these estimates in importance weighted empirical risk minimization to learn classifiers. In this work, we theoretically compare the prior approaches, relax their strong assumptions, and generalize them from requiring label classifiers to general functions. This latter generalization improves the conditioning of the induced inverse problems by allowing broader exploitation of the spectrum of the forward operator. Prior work on label shift is limited to categorical label spaces; here, we propose a series of methods to estimate the importance weight functions for arbitrary normed label spaces. We introduce a new operator learning approach between Hilbert spaces defined on labels (rather than covariates) and show that it induces a perturbed inverse problem of compact operators. We propose a novel approach to solve the inverse problem in the presence of perturbation. This analysis is of independent interest, since such problems commonly arise in partial differential equations and reinforcement learning. For both categorical and general normed spaces, we provide concentration bounds for the proposed estimators. Using the existing generalization analysis based on Rademacher complexity, Rényi divergence, and the MDFR lemma of Azizzadenesheli et al. [2019], we establish generalization guarantees for importance weighted empirical risk minimization on the unseen target domain.
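For the categorical setting, the pipeline the abstract describes (estimate importance weights from a classifier's predictions, then reweight the empirical risk) can be sketched as follows. This is a minimal illustration in the spirit of the confusion-matrix estimators of Azizzadenesheli et al. [2019], not the paper's exact method; the ridge regularizer `reg`, the nonnegativity clipping, and the function names are assumptions made for the sketch.

```python
import numpy as np

def estimate_importance_weights(src_preds, src_labels, tgt_preds, n_classes, reg=1e-3):
    """Estimate w[j] ~ P_target(y=j) / P_source(y=j) under label shift.

    Moment equation: mu = C @ w, where C[i, j] = P_source(f(x)=i, y=j) is
    the classifier's joint confusion matrix on held-out source data and
    mu[i] = P_target(f(x)=i) is the predicted-label frequency on unlabeled
    target data. Solved here by ridge-regularized least squares (an
    assumed, illustrative regularization scheme).
    """
    n = len(src_labels)
    C = np.zeros((n_classes, n_classes))
    for p, y in zip(src_preds, src_labels):
        C[p, y] += 1.0 / n                       # empirical joint confusion matrix
    mu = np.bincount(tgt_preds, minlength=n_classes) / len(tgt_preds)
    # Regularized normal equations: (C^T C + reg * I) w = C^T mu.
    w = np.linalg.solve(C.T @ C + reg * np.eye(n_classes), C.T @ mu)
    return np.clip(w, 0.0, None)                 # importance weights are nonnegative

def importance_weighted_risk(losses, labels, w):
    """Importance weighted empirical risk: (1/n) * sum_i w[y_i] * loss_i."""
    losses, labels = np.asarray(losses), np.asarray(labels)
    return float(np.mean(w[labels] * losses))
```

The ridge term reflects the conditioning issue raised above: the empirical confusion matrix and target prediction frequencies are perturbed versions of the population quantities, so the solve must be regularized to keep estimation noise on the small singular values of the forward operator from being amplified.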
