Detecting and Correcting for Label Shift with Black Box Predictors

02/12/2018
by Zachary C. Lipton, et al.

Faced with distribution shift between training and test sets, we wish to detect and quantify the shift, and to correct our classifiers without test set labels. Motivated by medical diagnosis, where diseases (targets) cause symptoms (observations), we focus on label shift, where the label marginal p(y) changes but the conditional p(x|y) does not. We propose Black Box Shift Estimation (BBSE) to estimate the test distribution p(y). BBSE exploits arbitrary black box predictors to reduce dimensionality prior to shift correction. While better predictors give tighter estimates, BBSE works even when predictors are biased, inaccurate, or uncalibrated, so long as their confusion matrices are invertible. We prove BBSE's consistency, bound its error, and introduce a statistical test that uses BBSE to detect shift. We also leverage BBSE to correct classifiers. Experiments demonstrate accurate estimates and improved prediction, even on high-dimensional datasets of natural images.
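The estimation step described above can be sketched in a few lines of NumPy. This is a minimal illustration, not the authors' reference implementation: it estimates the joint confusion matrix C[i, j] = P(f(x)=i, y=j) on labeled validation data, estimates the marginal of predictions on unlabeled test data, and solves the resulting linear system for the importance weights w(y) = q(y)/p(y). The function name and signature are hypothetical.

```python
import numpy as np

def bbse_estimate_weights(val_preds, val_labels, test_preds, n_classes):
    """Estimate label-shift importance weights w(y) = q(y)/p(y) via BBSE.

    val_preds, val_labels: black-box predictions and true labels on a
    labeled validation set drawn from the training distribution.
    test_preds: predictions on unlabeled test data.
    """
    # Joint confusion matrix C[i, j] ~= P(f(x) = i, y = j) on validation data.
    C = np.zeros((n_classes, n_classes))
    for pred, label in zip(val_preds, val_labels):
        C[pred, label] += 1
    C /= len(val_preds)

    # Marginal distribution of predictions on the test set: mu[i] ~= q(f(x) = i).
    mu = np.bincount(test_preds, minlength=n_classes) / len(test_preds)

    # Under label shift, C w = mu; this requires C to be invertible,
    # which is the key assumption in the paper.
    w = np.linalg.solve(C, mu)

    # Finite-sample noise can produce negative entries; clip to keep
    # the weights valid for importance-weighted retraining.
    return np.clip(w, 0.0, None)
```

The recovered weights can then be used to reweight the training loss, or multiplied by the training label marginal to obtain an estimate of the test marginal q(y).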


