Calibrating Predictions to Decisions: A Novel Approach to Multi-Class Calibration

07/12/2021
by   Shengjia Zhao, et al.

When facing uncertainty, decision-makers want predictions they can trust. A machine learning provider can convey confidence to decision-makers by guaranteeing that its predictions are distribution calibrated – among the inputs that receive a predicted class-probability vector q, the actual distribution over classes is q. For multi-class prediction problems, however, achieving distribution calibration tends to be infeasible, requiring sample complexity exponential in the number of classes C. In this work, we introduce a new notion – decision calibration – that requires the predicted distribution and the true distribution to be “indistinguishable” to a set of downstream decision-makers. When all possible decision-makers are under consideration, decision calibration is the same as distribution calibration. However, when we only consider decision-makers choosing between a bounded number of actions (e.g. polynomial in C), our main result shows that decision calibration becomes feasible – we design a recalibration algorithm whose sample complexity is polynomial in the number of actions and the number of classes. We validate our recalibration algorithm empirically: compared to existing methods, decision calibration improves decision-making on skin lesion and ImageNet classification with modern neural network predictors.
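The abstract's notion of decision calibration can be illustrated with a simple empirical check. The sketch below (my own illustration, not the authors' algorithm; the function name and loss-matrix formulation are assumptions) estimates the gap, for one decision-maker with a finite action set, between the loss the decision-maker *expects* under the predicted probabilities and the loss actually incurred on the true labels. Decision calibration requires this gap to vanish for every decision-maker in the considered class.

```python
import numpy as np

def decision_calibration_gap(probs, labels, loss_matrix):
    """Estimate the decision-calibration gap for one decision-maker.

    probs:       (N, C) predicted class-probability vectors
    labels:      (N,)   true class indices
    loss_matrix: (A, C) loss of taking action a when the true class is c
    """
    # Bayes-optimal action under each predicted distribution
    actions = np.argmin(probs @ loss_matrix.T, axis=1)           # (N,)
    # Loss the decision-maker expects, computed from the predictions
    expected = np.mean(np.sum(probs * loss_matrix[actions], axis=1))
    # Loss actually incurred on the true labels
    actual = np.mean(loss_matrix[actions, labels])
    return abs(expected - actual)

# Example with 0-1 loss over two classes: a perfectly sharp, correct
# predictor has zero gap; an overconfident one does not.
zero_one = 1.0 - np.eye(2)                                       # (A=2, C=2)
perfect = decision_calibration_gap(np.eye(2), np.array([0, 1]), zero_one)
overconfident = decision_calibration_gap(
    np.array([[0.9, 0.1]]), np.array([1]), zero_one)
```

A recalibration procedure in this spirit would adjust the predictions until the gap is small simultaneously for all loss matrices with at most A rows, which is why the sample complexity in the paper scales with the number of actions rather than exponentially in C.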


