ℋ-Consistency Estimation Error of Surrogate Loss Minimizers

05/16/2022
by Pranjal Awasthi et al.

We present a detailed study of estimation errors for a target loss in terms of surrogate loss estimation errors. We refer to such guarantees as ℋ-consistency estimation error bounds, since they account for the hypothesis set ℋ adopted. These guarantees are significantly stronger than ℋ-calibration or ℋ-consistency. They are also more informative than the similar excess error bounds derived in the literature, which correspond to the special case where ℋ is the family of all measurable functions. We prove general theorems providing such guarantees in both the distribution-dependent and distribution-independent settings. We show that our bounds are tight, modulo a convexity assumption. We also show that previous excess error bounds can be recovered as special cases of our general results. We then present a series of explicit bounds for the zero-one loss, with multiple choices of the surrogate loss, for both the family of linear functions and neural networks with one hidden layer. We further prove more favorable distribution-dependent guarantees in that case. We also present a series of explicit bounds for the adversarial loss, with surrogate losses based on the supremum of the ρ-margin, hinge, or sigmoid loss, for the same two hypothesis sets. Here too, we prove several enhancements of these guarantees under natural distributional assumptions. Finally, we report the results of simulations illustrating our bounds and their tightness.
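To make the shape of these guarantees concrete, here is a minimal LaTeX sketch of the form an ℋ-consistency estimation error bound takes. The notation (the risks R_ℓ, the best-in-class risks R*_ℓ(ℋ), and the function Γ) is assumed here for illustration; the paper's exact statements may include additional terms.

```latex
% A minimal sketch of an H-consistency estimation error bound
% (assumed notation, not the paper's exact statement).
\documentclass{article}
\usepackage{amsmath,amssymb}
\begin{document}
Let $\mathcal{R}_{\ell}(h) = \mathbb{E}_{(x,y) \sim \mathcal{D}}[\ell(h, x, y)]$
denote the expected $\ell$-loss of $h$ and
$\mathcal{R}^{*}_{\ell}(\mathcal{H}) = \inf_{h \in \mathcal{H}} \mathcal{R}_{\ell}(h)$
the best-in-class $\ell$-loss. An $\mathcal{H}$-consistency estimation error
bound for a surrogate loss $\ell_{1}$ of a target loss $\ell_{2}$ relates the
two estimation errors: for some non-decreasing function $\Gamma$ and all
$h \in \mathcal{H}$,
\begin{equation*}
  \mathcal{R}_{\ell_{2}}(h) - \mathcal{R}^{*}_{\ell_{2}}(\mathcal{H})
  \leq
  \Gamma\bigl(\mathcal{R}_{\ell_{1}}(h) - \mathcal{R}^{*}_{\ell_{1}}(\mathcal{H})\bigr).
\end{equation*}
When $\mathcal{H}$ is the family of all measurable functions, this reduces to
a standard excess error bound.
\end{document}
```

The abstract also mentions simulations illustrating the bounds and their tightness. The following self-contained Python sketch mimics that kind of experiment on a synthetic distribution: it compares zero-one and hinge estimation errors over a bounded family of linear predictors and fits an empirical constant relating them. The distribution, the hypothesis family, and the fitted constant are illustrative assumptions, not the paper's setup.

```python
# Illustrative simulation (not from the paper): compare the zero-one
# estimation error of 1-D linear predictors h_w(x) = w * x with the
# corresponding hinge estimation error on a synthetic distribution.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic distribution: y uniform in {-1, +1}, x | y ~ N(y, 1).
n = 100_000
y = rng.choice([-1.0, 1.0], size=n)
x = y + rng.standard_normal(n)

def zero_one_risk(w):
    pred = np.where(w * x >= 0.0, 1.0, -1.0)
    return np.mean(pred != y)

def hinge_risk(w):
    return np.mean(np.maximum(0.0, 1.0 - y * (w * x)))

# Bounded linear family {h_w : x -> w x, w in [-2, 4]}.
ws = np.linspace(-2.0, 4.0, 201)
r01 = np.array([zero_one_risk(w) for w in ws])
rh = np.array([hinge_risk(w) for w in ws])

# Estimation errors relative to the best predictor in the family.
excess01 = r01 - r01.min()
excess_h = rh - rh.min()

# Fit the smallest constant C with excess01 <= C * excess_h over the
# grid (C is empirical, not the paper's constant; minimizability gap
# terms are ignored in this sketch).
mask = excess_h > 1e-6
C = float(np.max(excess01[mask] / excess_h[mask]))
print(f"fitted constant C ~= {C:.3f}")
assert np.all(excess01 <= C * excess_h + 1e-9)
```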


Related research

- Ranking with Abstention (07/05/2023)
- Relative Deviation Margin Bounds (06/26/2020)
- Calibration and Consistency of Adversarial Surrogate Losses (04/19/2021)
- Cross-Entropy Loss Functions: Theoretical Analysis and Applications (04/14/2023)
- Structured Prediction Theory Based on Factor Graph Complexity (05/20/2016)
- Multiclass Classification Calibration Functions (09/20/2016)
- Boosting with the Logistic Loss is Consistent (05/13/2013)
