On the Consistency of Top-k Surrogate Losses

01/30/2019
by Forest Yang, et al.

The top-k error is often used to evaluate performance on challenging classification tasks in computer vision, as it is designed to compensate for ambiguity in ground-truth labels. This practical success motivates our theoretical analysis of consistent top-k classification. To this end, we define top-k calibration as a necessary and sufficient condition for consistency of loss functions that are bounded below. Unlike prior work, our analysis of top-k calibration handles non-uniqueness of the predictor scores and extends calibration to consistency, providing a theoretically sound basis for analysis of this topic. Building on this calibration analysis, we propose a rich class of top-k calibrated Bregman divergence surrogates. We then show that previously proposed hinge-like top-k surrogate losses are not top-k calibrated and are therefore inconsistent. Finally, we propose two new hinge-like losses, one similarly inconsistent and one consistent. Our empirical results corroborate these theoretical claims about the consistency of these losses.
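To make the evaluation metric concrete, here is a minimal sketch (not from the paper) of how the top-k error is typically computed: an example counts as correct whenever its true label appears anywhere among the k highest-scoring classes.

```python
import numpy as np

def top_k_error(scores, labels, k):
    """Fraction of examples whose true label is NOT among the k highest-scoring classes.

    scores: (n, c) array of real-valued predictor scores.
    labels: (n,) array of true class indices.
    """
    n, c = scores.shape
    # argpartition places the k largest scores in the last k columns (order within them is irrelevant)
    top_k = np.argpartition(scores, c - k, axis=1)[:, -k:]
    hits = (top_k == labels[:, None]).any(axis=1)
    return 1.0 - hits.mean()

# Toy example: 3 examples, 4 classes.
scores = np.array([[0.1, 0.5, 0.25, 0.15],
                   [0.7, 0.12, 0.1, 0.08],
                   [0.2, 0.1, 0.3, 0.4]])
labels = np.array([1, 2, 0])
# top-1 error is 2/3 (only the first example is correct),
# while the top-3 error is 0.0 -- larger k forgives label ambiguity.
```

Note that for k = 1 this reduces to the usual misclassification error; the metric only relies on the ordering of the scores, which is why calibration conditions for top-k consistency concern the induced ranking rather than the score values themselves.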

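As an illustration of the kind of surrogate in play (an assumption for exposition, not the paper's exact construction), softmax cross-entropy is the textbook Bregman-divergence-based surrogate: its softmax link is order-preserving, so minimizing it recovers the ranking of conditional class probabilities, which is exactly what a top-k calibrated loss must preserve.

```python
import numpy as np

def softmax_cross_entropy(scores, label):
    """Cross-entropy surrogate: -log softmax(scores)[label].

    Illustrative example of a Bregman-divergence surrogate (here the
    divergence is KL, with a softmax link); since softmax preserves the
    ordering of the scores, driving this loss down aligns the score
    ranking with the conditional class probabilities.
    """
    s = scores - scores.max()  # shift for numerical stability
    return np.log(np.exp(s).sum()) - s[label]

# With uniform scores the loss is log(c); raising the true class's score lowers it.
uniform = softmax_cross_entropy(np.array([0.0, 0.0, 0.0]), 0)   # = log 3
confident = softmax_cross_entropy(np.array([2.0, 0.0, 0.0]), 0)
```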