Importance Weighting Correction of Regularized Least-Squares for Covariate and Target Shifts

by Davit Gogolashvili, et al.

In many real-world problems, the training data and test data have different distributions, a situation commonly referred to as dataset shift. The settings most often considered in the literature are covariate shift and target shift. Importance weighting (IW) correction is a universal method for correcting the bias present in learning scenarios under dataset shift. The question one may ask is: does IW correction work equally well for different dataset shift scenarios? By investigating the generalization properties of weighted kernel ridge regression (W-KRR) under covariate and target shifts, we show that the answer is negative, except when the IW is bounded and the model is well-specified. In that case, minimax optimal rates are achieved by importance-weighted kernel ridge regression (IW-KRR) in both the covariate and target shift scenarios. Slightly relaxing the boundedness condition on the IW, we show that IW-KRR still achieves the optimal rates under target shift while leading to slower rates under covariate shift. In the case of model misspecification, we show that the performance of W-KRR under covariate shift can be substantially improved by designing an alternative reweighting function. The distinction between misspecified and well-specified scenarios does not seem to be crucial in learning problems under target shift.
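The estimator studied in the abstract can be sketched in a few lines: weighted kernel ridge regression minimizes the importance-weighted squared loss plus a ridge penalty, which in the kernel setting yields the closed-form coefficients alpha = (W K + lambda I)^{-1} W y. Below is a minimal, self-contained NumPy sketch, not the authors' implementation; the Gaussian kernel, the weight-clipping threshold (mirroring the paper's bounded-IW condition), and all function names are illustrative choices.

```python
import numpy as np

def gaussian_kernel(A, B, gamma=1.0):
    # Pairwise Gaussian kernel between rows of A and rows of B.
    d2 = (np.sum(A**2, axis=1)[:, None]
          + np.sum(B**2, axis=1)[None, :]
          - 2.0 * A @ B.T)
    return np.exp(-gamma * d2)

def weighted_krr_fit(X, y, w, lam=1e-2, gamma=1.0):
    """Importance-weighted kernel ridge regression (sketch).

    Minimizes sum_i w_i (f(x_i) - y_i)^2 + lam * ||f||_H^2,
    whose solution in the RKHS has coefficients
        alpha = (W K + lam I)^{-1} W y,
    where K is the train kernel matrix and W = diag(w).
    """
    n = len(y)
    K = gaussian_kernel(X, X, gamma)
    W = np.diag(w)
    alpha = np.linalg.solve(W @ K + lam * np.eye(n), W @ y)
    return alpha

def krr_predict(alpha, X_train, X_new, gamma=1.0):
    # f(x) = sum_i alpha_i k(x, x_i)
    return gaussian_kernel(X_new, X_train, gamma) @ alpha

# Illustrative covariate-shift setup: train covariates ~ N(0, 1),
# test covariates ~ N(1, 0.5); the IW is the density ratio, clipped
# to keep it bounded (cf. the boundedness condition in the abstract).
rng = np.random.default_rng(0)
X_tr = rng.normal(0.0, 1.0, size=(100, 1))
y_tr = np.sin(X_tr).ravel() + 0.1 * rng.normal(size=100)

def density_ratio(x, clip=10.0):
    p_test = np.exp(-0.5 * ((x - 1.0) / 0.5) ** 2) / 0.5
    p_train = np.exp(-0.5 * x**2)
    return np.minimum(p_test / p_train, clip)

w = density_ratio(X_tr.ravel())
alpha = weighted_krr_fit(X_tr, y_tr, w, lam=1e-2, gamma=1.0)
preds = krr_predict(alpha, X_tr, np.linspace(-1, 2, 5)[:, None])
```

With uniform weights (w_i = 1) the estimator reduces to ordinary kernel ridge regression, which makes a convenient sanity check when experimenting with reweighting functions.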


When is Importance Weighting Correction Needed for Covariate Shift Adaptation?

This paper investigates when the importance weighting (IW) correction is...

Pseudo-Labeling for Kernel Ridge Regression under Covariate Shift

We develop and analyze a principled approach to kernel ridge regression ...

Dimension Reduction for Robust Covariate Shift Correction

In the covariate shift learning scenario, the training and test covariat...

Stratified Learning: a general-purpose statistical method for improved learning under Covariate Shift

Covariate shift arises when the labelled training (source) data is not r...

Intractable Likelihood Regression for Covariate Shift by Kernel Mean Embedding

Simulation plays an essential role in comprehending a target system in m...

Semiparametric correction for endogenous truncation bias with Vox Populi based participation decision

We synthesize the knowledge present in various scientific disciplines fo...

Robust Importance Weighting for Covariate Shift

In many learning problems, the training and testing data follow differen...