Dimension Reduction for Robust Covariate Shift Correction

11/29/2017
by Fulton Wang, et al.

In the covariate shift learning scenario, the training and test covariate distributions differ, so a predictor's average loss under the training distribution differs from its average loss under the test distribution. The importance weighting approach handles this shift by minimizing, over predictors, an estimate of the test loss obtained as a weighted sum of training sample losses. However, as the dimension of the covariates increases, the variance of this test loss estimator grows. In this work, we adapt the importance weighting approach to handle higher dimensional covariates more robustly by incorporating dimension reduction into the learning process.
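The importance weighting idea described in the abstract can be sketched in a few lines. The example below is illustrative, not the paper's method: it assumes Gaussian train and test covariate distributions that differ only in mean, so the density ratio w(x) = p_test(x) / p_train(x) is available in closed form, and uses a made-up per-sample loss. The names (`importance_weighted_loss`, `mu_train`, `mu_test`) are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

def importance_weighted_loss(losses, w):
    """Importance-weighted estimate of the test loss from training-sample losses."""
    return np.mean(w * losses)

# Toy setup (assumption, for illustration): unit-variance Gaussian covariates
# whose mean shifts between training and test.
d = 5                                         # covariate dimension
n = 1000
mu_train, mu_test = np.zeros(d), 0.5 * np.ones(d)
X = rng.normal(mu_train, 1.0, size=(n, d))    # training covariates

# Closed-form log density ratio for two unit-variance Gaussians:
# log w(x) = x^T (mu_test - mu_train) - (||mu_test||^2 - ||mu_train||^2) / 2
log_w = X @ (mu_test - mu_train) - 0.5 * (mu_test @ mu_test - mu_train @ mu_train)
w = np.exp(log_w)

losses = np.sum(X**2, axis=1)                 # stand-in per-sample losses
est = importance_weighted_loss(losses, w)
```

As `d` grows, the weights `w` become increasingly heavy-tailed (their log is a sum over coordinates), which is the variance blow-up the abstract refers to and the motivation for reducing the covariate dimension before weighting.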


Related research

12/28/2017  Robust Covariate Shift Prediction with General Losses and Feature Views
            Covariate shift relaxes the widely-employed independent and identically ...

10/18/2022  Importance Weighting Correction of Regularized Least-Squares for Covariate and Target Shifts
            In many real world problems, the training data and test data have differ...

03/07/2023  When is Importance Weighting Correction Needed for Covariate Shift Adaptation?
            This paper investigates when the importance weighting (IW) correction is...

10/17/2017  On reducing sampling variance in covariate shift using control variates
            Covariate shift classification problems can in principle be tackled by i...

12/19/2021  Rethinking Importance Weighting for Transfer Learning
            A key assumption in supervised learning is that training and test data f...

06/24/2014  Combining predictions from linear models when training and test inputs differ
            Methods for combining predictions from different models in a supervised ...

04/12/2019  Conformal Prediction Under Covariate Shift
            We extend conformal prediction methodology beyond the case of exchangeab...
