Unifying Lower Bounds on Prediction Dimension of Consistent Convex Surrogates

02/16/2021
by   Jessie Finocchiaro, et al.

Given a prediction task, understanding when one can and cannot design a consistent convex surrogate loss, particularly a low-dimensional one, is an important and active area of machine learning research. The prediction task may be given as a target loss, as in classification and structured prediction, or simply as a (conditional) statistic of the data, as in risk measure estimation. These two scenarios typically involve different techniques for designing and analyzing surrogate losses. We unify these settings using tools from property elicitation, and give a general lower bound on prediction dimension. Our lower bound tightens existing results in the case of discrete predictions, showing that previous calibration-based bounds can largely be recovered via property elicitation. For continuous estimation, our lower bound resolves an open problem on estimating measures of risk and uncertainty.
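As illustrative background (not taken from the paper): the hinge loss is a classic one-dimensional consistent convex surrogate for the binary 0-1 target loss, so its prediction dimension is 1. The sketch below, a minimal numerical check rather than anything from the paper's framework, shows that the minimizer of the conditional hinge risk always shares the sign of the Bayes-optimal 0-1 prediction.

```python
import numpy as np

def hinge(u, y):
    # y in {-1, +1}; u is a real-valued (1-dimensional) surrogate prediction
    return np.maximum(0.0, 1.0 - y * u)

def conditional_hinge_risk(u, p):
    # Expected hinge loss when the conditional label probability P(Y = +1) is p
    return p * hinge(u, +1) + (1 - p) * hinge(u, -1)

# For each conditional probability p (away from 1/2), the minimizer of the
# conditional hinge risk has the same sign as the Bayes-optimal 0-1
# prediction sign(2p - 1), which is what consistency requires pointwise.
us = np.linspace(-2, 2, 4001)
for p in [0.1, 0.3, 0.7, 0.9]:
    u_star = us[np.argmin(conditional_hinge_risk(us, p))]
    assert np.sign(u_star) == np.sign(2 * p - 1)
```

The point of the example is only to make "low-dimensional surrogate" concrete: here a single real number suffices to elicit the binary decision, whereas the paper's lower bounds characterize when such dimension reduction is impossible.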

