The Convexity and Design of Composite Multiclass Losses

06/18/2012
by Mark Reid, et al.

We consider composite loss functions for multiclass prediction comprising a proper (i.e., Fisher-consistent) loss over probability distributions and an inverse link function. We establish conditions for their (strong) convexity and explore the implications. We also show how the separation of concerns afforded by using this composite representation allows for the design of families of losses with the same Bayes risk.
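The composite representation described in the abstract can be made concrete with a small sketch. The following Python snippet is an illustrative assumption, not code from the paper: the multiclass log loss plays the role of the proper loss, softmax plays the role of the inverse link, and their composition is the familiar cross-entropy over raw scores. Since the Bayes risk of a composite loss is determined by the proper loss alone, swapping in a different inverse link reparameterizes the predictions without changing the Bayes risk, which is the separation of concerns the abstract refers to.

import numpy as np

def softmax(v):
    # Hypothetical choice of inverse link: maps a real score vector to the probability simplex.
    z = np.exp(v - np.max(v))   # shift by the max for numerical stability
    return z / z.sum()

def log_loss(y, p):
    # Proper (Fisher-consistent) multiclass log loss: -log p_y.
    return -np.log(p[y])

def composite_loss(y, v, inverse_link=softmax, proper_loss=log_loss):
    # Composite loss: the proper loss evaluated at the inverse-link image of the scores.
    return proper_loss(y, inverse_link(v))

# Example: scores for a 3-class problem, true class index 1.
scores = np.array([0.2, 1.5, -0.3])
print(composite_loss(1, scores))   # cross-entropy of softmax(scores) at class 1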


research
12/17/2009
Composite Binary Losses
We study losses for binary classification and class probability estimati...

research
02/19/2019
Proper-Composite Loss Functions in Arbitrary Dimensions
The study of a machine learning problem is in many ways difficult to ...

research
05/20/2018
Exp-Concavity of Proper Composite Losses
The goal of online prediction with expert advice is to find a decision s...

research
06/08/2020
All your loss are belong to Bayes
Loss functions are a cornerstone of machine learning and the starting po...

research
01/27/2023
LegendreTron: Uprising Proper Multiclass Loss Learning
Loss functions serve as the foundation of supervised learning and are of...

research
10/17/2019
Using Bayes Linear Emulators to Analyse Networks of Simulators
The key dynamics of processes within physical systems are often represen...

research
01/25/2023
On the inconsistency of separable losses for structured prediction
In this paper, we prove that separable negative log-likelihood losses fo...
