Minimising quantifier variance under prior probability shift

07/17/2021
by Dirk Tasche, et al.

For the binary prevalence quantification problem under prior probability shift, we determine the asymptotic variance of the maximum likelihood estimator. We find that it is a function of the Brier score for the regression of the class label against the features under the test data set distribution. This observation suggests that optimising the accuracy of a base classifier on the training data set helps to reduce the variance of the related quantifier on the test data set. We therefore also point out training criteria for the base classifier that imply optimisation of the Brier scores on both the training and the test data sets.
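The maximum likelihood quantifier discussed in the abstract can be sketched as follows. This is an illustrative reconstruction, not the paper's code: under prior probability shift the class-conditional feature distributions are unchanged, so the likelihood ratio can be recovered from the base classifier's training posteriors, and the test-set prevalence is estimated by maximising the marginal log-likelihood. The function names and the grid-search optimiser are my own choices.

```python
import numpy as np

def ml_prevalence(scores, train_prior, grid_size=1001):
    """Maximum-likelihood estimate of the positive-class prevalence on a
    test sample under prior probability shift (class-conditional feature
    distributions assumed unchanged between training and test).

    scores      : posterior probabilities p_train(Y=1 | X=x_i) from the
                  base classifier, evaluated on the test sample
    train_prior : positive-class prevalence in the training data
    """
    scores = np.clip(np.asarray(scores, dtype=float), 1e-12, 1 - 1e-12)
    # Likelihood ratio f1(x)/f0(x), recovered from the training posterior
    # via Bayes' formula: f1/f0 = s/(1-s) * (1-pi)/pi.
    lr = scores * (1 - train_prior) / ((1 - scores) * train_prior)
    # Marginal log-likelihood l(theta) = sum_i log(theta*lr_i + 1-theta),
    # which is concave in theta; a fine grid search suffices for a sketch.
    thetas = np.linspace(1e-6, 1 - 1e-6, grid_size)
    loglik = np.log(thetas[:, None] * lr[None, :]
                    + (1 - thetas[:, None])).sum(axis=1)
    return thetas[np.argmax(loglik)]

def brier_score(scores, labels):
    """Brier score of posterior probabilities against 0/1 labels."""
    scores = np.asarray(scores, dtype=float)
    labels = np.asarray(labels, dtype=float)
    return np.mean((scores - labels) ** 2)
```

As a quick sanity check, one can simulate two Gaussian class-conditional densities, draw a test sample with a prevalence different from the training prior, feed the Bayes-optimal training posteriors into `ml_prevalence`, and verify that the estimate lands near the true test prevalence.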


