Model-free posterior inference on the area under the receiver operating characteristic curve

06/19/2019
by Zhe Wang, et al.

The area under the receiver operating characteristic curve (AUC) serves as a summary of a binary classifier's performance. Methods for estimating the AUC have been developed under a binormality assumption, which restricts the distribution of the scores produced by the classifier. However, this assumption introduces an infinite-dimensional nuisance parameter and can be inappropriate, especially in the context of machine learning. This motivates us to adopt a model-free Gibbs posterior distribution for the AUC. We present the asymptotic Gibbs posterior concentration rate and a strategy for tuning the learning rate so that the corresponding credible intervals achieve the nominal frequentist coverage probability. Simulation experiments and a real data analysis demonstrate the Gibbs posterior's strong performance compared to existing methods based on a rank likelihood.
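
To make the construction concrete, below is a minimal sketch (in Python, not the authors' code) of a Gibbs posterior for the AUC. The simulated scores, pairwise squared-error loss, flat prior, and fixed learning rate omega = 1 are all illustrative assumptions; the paper's loss and its procedure for calibrating the learning rate so that credible intervals attain nominal frequentist coverage may differ in the details.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated classifier scores: s0 for class-0 cases, s1 for class-1 cases
# (placeholder data; the paper's experiments use other settings).
s0 = rng.normal(0.0, 1.0, size=100)
s1 = rng.normal(1.0, 1.0, size=100)

# Mann-Whitney kernel over all pairs: 1 if a class-1 score exceeds a class-0 score.
h = (s1[:, None] > s0[None, :]).astype(float).ravel()
auc_hat = h.mean()                 # empirical AUC (two-sample U-statistic)
n = min(len(s0), len(s1))          # crude effective sample size used for scaling

def empirical_risk(theta):
    """Pairwise squared-error risk; its minimizer is the true AUC."""
    return np.mean((h - theta) ** 2)

# Gibbs posterior on a grid: pi_n(theta) proportional to
# exp(-omega * n * R_n(theta)) * prior(theta).
omega = 1.0                        # learning rate; the paper tunes this so that
                                   # credible intervals attain nominal coverage
grid = np.linspace(0.001, 0.999, 999)
log_post = -omega * n * np.array([empirical_risk(t) for t in grid])  # flat prior
post = np.exp(log_post - log_post.max())
post /= post.sum()

# 95% credible interval from the discretized posterior
cdf = np.cumsum(post)
lo, hi = grid[np.searchsorted(cdf, 0.025)], grid[np.searchsorted(cdf, 0.975)]
print(f"empirical AUC = {auc_hat:.3f}, 95% credible interval = ({lo:.3f}, {hi:.3f})")
```

Larger values of the learning rate concentrate the posterior more tightly around the empirical AUC, which is why its calibration governs the coverage of the resulting credible intervals.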

