Doubly Robust Calibration of Prediction Sets under Covariate Shift

03/03/2022
by Yachong Yang, et al.

Conformal prediction has received tremendous attention in recent years and has offered new solutions to problems in missing data and causal inference; yet these advances have not leveraged modern semiparametric efficiency theory for more robust and efficient uncertainty quantification. In this paper, we consider the problem of obtaining distribution-free prediction regions that account for a shift in the distribution of the covariates between the training and test data. Under an explainable covariate shift assumption analogous to the standard missing-at-random assumption, we propose three variants of a general framework to construct well-calibrated prediction regions for the unobserved outcome in the test sample. Our approach is based on the efficient influence function for the quantile of the unobserved outcome in the test population, combined with an arbitrary machine learning prediction algorithm, without compromising asymptotic coverage. Next, we extend our approach to account for departures from the explainable covariate shift assumption through a semiparametric sensitivity analysis for potential latent covariate shift. In all cases, we establish that the resulting prediction sets eventually attain nominal average coverage in large samples. This guarantee is a consequence of the product bias form of our proposal, which implies correct coverage if either the propensity score or the conditional distribution of the response is estimated sufficiently well. Our results also provide a framework for the construction of doubly robust prediction sets for individual treatment effects, both under unconfoundedness and allowing for some degree of unmeasured confounding. Finally, we discuss aggregation of prediction sets from different machine learning algorithms for optimal prediction, and illustrate the performance of our methods on both synthetic and real data.
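
To make the flavor of this approach concrete, below is a minimal Python sketch of one way a doubly robust, influence-function-style calibration of a split-conformal prediction interval could be assembled under explainable covariate shift. It is not the authors' implementation: the helper names (fit_density_ratio, dr_score_quantile), the absolute-residual conformity score, the three-fold sample split, and the logistic-regression / gradient-boosting nuisance models are all illustrative assumptions made here for exposition.

```python
# Hypothetical sketch: doubly robust calibration of a conformal prediction set
# under covariate shift (not the paper's exact procedure). The conformity score
# is the absolute residual of an arbitrary ML point predictor.

import numpy as np
from sklearn.ensemble import GradientBoostingRegressor, GradientBoostingClassifier
from sklearn.linear_model import LogisticRegression


def fit_density_ratio(X_train, X_test):
    """Estimate w(x) = p_test(x) / p_train(x) with a probabilistic classifier
    that separates test from training covariates (a 'propensity'-type model)."""
    X = np.vstack([X_train, X_test])
    z = np.concatenate([np.zeros(len(X_train)), np.ones(len(X_test))])
    clf = LogisticRegression(max_iter=1000).fit(X, z)
    n_tr, n_te = len(X_train), len(X_test)

    def w(Xnew):
        p = clf.predict_proba(Xnew)[:, 1]
        return (p / (1 - p)) * (n_tr / n_te)  # odds times sampling-ratio correction
    return w


def dr_score_quantile(X_nuis, S_nuis, X_cal, S_cal, X_test, w, alpha=0.1):
    """Doubly robust (AIPW-style) estimate of the (1 - alpha)-quantile of the
    conformity score in the test covariate population.

    For each candidate cutoff q, the conditional CDF F(q | x) = P(S <= q | X = x)
    is modelled on a nuisance fold, and we return the smallest q satisfying the
    augmented estimating equation
        mean_test[ F(q|X) ] + mean_cal[ w(X) * (1{S <= q} - F(q|X)) ] >= 1 - alpha.
    The augmentation term is intended to make the estimate consistent if either
    w or the conditional CDF model is estimated well enough.
    """
    candidates = np.quantile(S_cal, np.linspace(0.5, 1.0, 26))
    w_cal = w(X_cal)
    for q in candidates:
        labels = (S_nuis <= q).astype(int)
        if labels.min() == labels.max():
            # Degenerate cutoff: the conditional CDF estimate is constant 0 or 1.
            F_test = np.full(len(X_test), float(labels[0]))
            F_cal = np.full(len(X_cal), float(labels[0]))
        else:
            cdf_model = GradientBoostingClassifier().fit(X_nuis, labels)
            F_test = cdf_model.predict_proba(X_test)[:, 1]
            F_cal = cdf_model.predict_proba(X_cal)[:, 1]
        dr_coverage = F_test.mean() + np.mean(w_cal * ((S_cal <= q) - F_cal))
        if dr_coverage >= 1 - alpha:
            return q
    return candidates[-1]


# --- usage sketch on synthetic data -----------------------------------------
rng = np.random.default_rng(0)
X_tr = rng.normal(size=(3000, 3))
Y_tr = X_tr[:, 0] + rng.normal(size=3000)
X_te = rng.normal(loc=0.5, size=(1000, 3))      # shifted covariates, same Y | X

i1, i2 = 1000, 2000                              # folds: predictor / CDF nuisance / calibration
mu = GradientBoostingRegressor().fit(X_tr[:i1], Y_tr[:i1])   # any ML point predictor
S_nuis = np.abs(Y_tr[i1:i2] - mu.predict(X_tr[i1:i2]))       # absolute-residual scores
S_cal = np.abs(Y_tr[i2:] - mu.predict(X_tr[i2:]))

w = fit_density_ratio(X_tr, X_te)                # all training folds share p_train
q_hat = dr_score_quantile(X_tr[i1:i2], S_nuis, X_tr[i2:], S_cal, X_te, w, alpha=0.1)
print("prediction set at a test point x: mu(x) +/- %.3f" % q_hat)
```

In this sketch, the augmentation term is what produces the product-bias structure described in the abstract: its error involves the product of the density-ratio (propensity-type) estimation error and the conditional score distribution estimation error, so the calibrated cutoff remains valid in large samples if either nuisance is estimated sufficiently well.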

Related research

10/14/2020 · A Distribution-Free Test of Covariate Shift Using Conformal Prediction
Covariate shift is a common and important assumption in transfer learnin...

04/12/2019 · Conformal Prediction Under Covariate Shift
We extend conformal prediction methodology beyond the case of exchangeab...

11/23/2021 · Sensitivity Analysis of Individual Treatment Effects: A Robust Conformal Inference Approach
We propose a model-free framework for sensitivity analysis of individual...

03/11/2022 · Distribution-free Prediction Sets Adaptive to Unknown Covariate Shift
Predicting sets of outcomes – instead of unique outcomes – is a promisin...

07/21/2022 · JAWS: Predictive Inference Under Covariate Shift
We propose JAWS, a series of wrapper methods for distribution-free uncer...

07/18/2023 · Model-free selective inference under covariate shift via weighted conformal p-values
This paper introduces weighted conformal p-values for model-free selecti...

02/08/2023 · Prediction approaches for partly missing multi-omics covariate data: A literature review and an empirical comparison study
As the availability of omics data has increased in the last few years, m...
