Refinement revisited with connections to Bayes error, conditional entropy and calibrated classifiers

03/11/2013
by   Hamed Masnadi-Shirazi, et al.

The concept of refinement from probability elicitation is considered for proper scoring rules. Taking direction from the axioms of probability, refinement is further clarified through a Hilbert space interpretation and reformulated in the underlying data distribution setting, where connections to maximal marginal diversity and conditional entropy are established and used to derive measures that provide arbitrarily tight bounds on the Bayes error. Refinement is also reformulated in the classifier output setting, and its connections to calibrated classifiers and proper margin losses are established.
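The refinement idea the abstract invokes can be illustrated with the classical calibration–refinement decomposition of a proper scoring rule. The sketch below, which is only an illustrative example using the Brier score and not the paper's own measures, groups examples by predicted probability and splits the score into a calibration term (how far predictions sit from the empirical frequency in their group) and a refinement term (the residual uncertainty that remains even for a perfectly calibrated predictor):

```python
import numpy as np

def brier_calibration_refinement(p, y):
    """Split the Brier score E[(p - y)^2] into
    calibration = E[(p - E[y|p])^2] and refinement = E[Var(y|p)]
    by grouping examples that share the same predicted probability.
    Illustrative sketch only; assumes binary labels y in {0, 1}."""
    p, y = np.asarray(p, float), np.asarray(y, float)
    brier = np.mean((p - y) ** 2)
    calibration, refinement = 0.0, 0.0
    for v in np.unique(p):
        mask = p == v
        w = mask.mean()          # empirical P(p = v)
        pi = y[mask].mean()      # empirical E[y | p = v]
        calibration += w * (v - pi) ** 2
        refinement += w * pi * (1 - pi)
    return brier, calibration, refinement

# For a perfectly calibrated predictor the calibration term vanishes,
# so the Brier score equals the refinement term alone.
p = np.array([0.2, 0.2, 0.2, 0.2, 0.2, 0.8, 0.8, 0.8, 0.8, 0.8])
y = np.array([0,   0,   0,   0,   1,   1,   1,   1,   1,   0  ])
b, c, r = brier_calibration_refinement(p, y)
```

Here the refinement term plays the role of a conditional-uncertainty measure: for calibrated outputs it is exactly the expected variance of the label given the prediction, which is the kind of quantity the paper relates to conditional entropy and to bounds on the Bayes error.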


