Learning Classifiers with Fenchel-Young Losses: Generalized Entropies, Margins, and Algorithms

05/24/2018
by Mathieu Blondel, et al.

In this paper, we study Fenchel-Young losses, a generic way to construct convex loss functions from a convex regularizer. We provide an in-depth study of their properties in a broad setting and show that they unify many well-known loss functions. When constructed from a generalized entropy, which includes well-known entropies such as the Shannon and Tsallis entropies, we show that Fenchel-Young losses induce a predictive probability distribution, and we develop an efficient algorithm to compute that distribution for separable entropies. We derive conditions under which generalized entropies yield a distribution with sparse support and losses with a separation margin. Finally, we present both primal and dual algorithms to learn predictive models with generic Fenchel-Young losses.
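As a rough illustration of the construction described above, the sketch below evaluates a Fenchel-Young loss for one concrete regularizer, assuming the standard definition L_Ω(θ; y) = Ω*(θ) + Ω(y) − ⟨θ, y⟩ with Ω the negative Shannon entropy restricted to the probability simplex (so Ω* is the log-sum-exp). The function name is illustrative, not part of any library; for a one-hot target this choice of Ω recovers the usual multinomial logistic loss.

```python
import numpy as np
from scipy.special import logsumexp

def fy_loss_shannon(theta, y):
    """Sketch of a Fenchel-Young loss generated by the negative Shannon
    entropy restricted to the simplex (hypothetical helper, not from the paper's code).

    theta : real-valued score vector
    y     : target distribution on the simplex (e.g. a one-hot vector)

    L(theta; y) = Omega*(theta) + Omega(y) - <theta, y>,
    with Omega(p) = sum_i p_i log p_i and Omega*(theta) = logsumexp(theta).
    """
    omega_y = np.sum(np.where(y > 0, y * np.log(y), 0.0))  # negative entropy of y
    return logsumexp(theta) + omega_y - theta @ y

# Example: one-hot target, so the loss reduces to logsumexp(theta) - theta[0],
# i.e. the standard multinomial logistic (cross-entropy) loss.
theta = np.array([2.0, 0.5, -1.0])
y = np.array([1.0, 0.0, 0.0])
print(fy_loss_shannon(theta, y))
```

Swapping in a different generalized entropy for Ω (for instance a Tsallis entropy) would, per the abstract, yield losses whose induced distributions can have sparse support; the sketch only covers the Shannon case.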

