A One-step Approach to Covariate Shift Adaptation

07/08/2020
by Tianyi Zhang, et al.

A default assumption in many machine learning scenarios is that the training and test samples are drawn from the same probability distribution. However, this assumption is often violated in the real world due to non-stationarity of the environment or bias in sample selection. In this work, we consider a prevalent setting called covariate shift, where the input distribution differs between the training and test stages while the conditional distribution of the output given the input remains unchanged. Most existing methods for covariate shift adaptation are two-step approaches: they first estimate the importance weights and then conduct importance-weighted empirical risk minimization. In this paper, we propose a novel one-step approach that jointly learns the predictive model and the associated weights in a single optimization by minimizing an upper bound of the test risk. We theoretically analyze the proposed method and provide a generalization error bound. We also empirically demonstrate the effectiveness of the proposed method.
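To make the two-step baseline that the abstract contrasts against concrete, here is a minimal sketch of importance-weighted empirical risk minimization on synthetic data. Everything in it is a hypothetical illustration, not the paper's method: the Gaussian training/test input distributions, the linear model, and the closed-form density ratio are all assumptions chosen so that step 1 (weight computation) is exact. In practice the weights must be estimated, which is precisely the step the proposed one-step approach folds into the learning objective.

```python
import numpy as np

rng = np.random.default_rng(0)

# Shared conditional p(y|x): y = 1 + 2x + noise (hypothetical synthetic setup).
x_tr = rng.normal(0.0, 1.0, 200)  # training inputs ~ N(0, 1)
y_tr = 1.0 + 2.0 * x_tr + 0.1 * rng.normal(size=200)
# Test inputs would follow N(1, 0.5^2): the input distribution shifts
# while p(y|x) stays fixed -- the covariate shift setting.

def gauss_pdf(x, mu, sigma):
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))

# Step 1: importance weights w(x) = p_test(x) / p_train(x).  With known
# Gaussians the density ratio is available in closed form; in practice it
# must be estimated from samples, which introduces the error that a
# one-step joint formulation aims to avoid.
w = gauss_pdf(x_tr, 1.0, 0.5) / gauss_pdf(x_tr, 0.0, 1.0)

# Step 2: importance-weighted empirical risk minimization -- here weighted
# least squares on features [1, x], i.e. solve (X^T W X) theta = X^T W y.
X = np.stack([np.ones_like(x_tr), x_tr], axis=1)
theta = np.linalg.solve(X.T @ (w[:, None] * X), X.T @ (w * y_tr))
print(theta)  # close to the true coefficients [1.0, 2.0]
```

Because the linear model is correctly specified here, weighting does not change the target of the estimation, only its variance; under model misspecification the weights determine which region of input space the fitted model prioritizes, which is why their quality matters.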


Related research

04/19/2023 - Information Geometrically Generalized Covariate Shift Adaptation
Many machine learning methods assume that the training and test data fol...

02/06/2023 - Adapting to Continuous Covariate Shift via Online Density Ratio Estimation
Dealing with distribution shifts is one of the central challenges for mo...

11/19/2021 - Maximum Mean Discrepancy for Generalization in the Presence of Distribution and Missingness Shift
Covariate shifts are a common problem in predictive modeling on real-wor...

07/21/2022 - JAWS: Predictive Inference Under Covariate Shift
We propose JAWS, a series of wrapper methods for distribution-free uncer...

09/19/2022 - UMIX: Improving Importance Weighting for Subpopulation Shift via Uncertainty-Aware Mixup
Subpopulation shift widely exists in many real-world machine learning ap...

09/05/2022 - Learning from a Biased Sample
The empirical risk minimization approach to data-driven decision making ...

12/19/2021 - Rethinking Importance Weighting for Transfer Learning
A key assumption in supervised learning is that training and test data f...
