On the inconsistency of separable losses for structured prediction

01/25/2023
by Caio Corro, et al.

In this paper, we prove that separable negative log-likelihood losses for structured prediction are not necessarily Bayes consistent: minimizing these losses may not yield a model that predicts the most probable structure in the data distribution for a given input. This raises the question of whether these losses are well suited to structured prediction and, if so, why.
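To make the failure mode concrete, here is a minimal sketch (not from the paper; the toy joint distribution below is an assumption chosen purely for illustration). A separable loss decomposes over the parts of a structure, so at its minimum the model can only recover per-part marginals, and decoding each part independently can disagree with the mode of the joint distribution, no matter how much data is available:

```python
# Hypothetical illustration of Bayes inconsistency under a separable loss.
# Structures are pairs y = (y1, y2) with y_i in {0, 1}; the joint
# distribution below is an assumed toy example, not taken from the paper.
joint = {(0, 0): 0.40, (1, 1): 0.35, (1, 0): 0.25}

# The Bayes prediction: the single most probable structure.
mode = max(joint, key=joint.get)

# Per-part marginals, which a separable negative log-likelihood loss
# recovers at its minimum (the model factors over parts).
marg1 = {v: sum(p for (y1, _), p in joint.items() if y1 == v) for v in (0, 1)}
marg2 = {v: sum(p for (_, y2), p in joint.items() if y2 == v) for v in (0, 1)}

# Decoding each part independently, as a separably trained model does.
separable_pred = (max(marg1, key=marg1.get), max(marg2, key=marg2.get))

print("joint mode:     ", mode)             # (0, 0), probability 0.40
print("separable pred: ", separable_pred)   # (1, 0), probability 0.25
```

Here the marginal-based decoder outputs (1, 0), a structure with joint probability 0.25, while the most probable structure is (0, 0) with probability 0.40; the mismatch persists even with the true distribution in hand, which is exactly what Bayes inconsistency means.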

Related research

05/24/2018
Learning Classifiers with Fenchel-Young Losses: Generalized Entropies, Margins, and Algorithms
We study in this paper Fenchel-Young losses, a generic way to construct ...

06/22/2022
Diagnostic Tool for Out-of-Sample Model Evaluation
Assessment of model fitness is an important step in many problems. Model...

06/14/2017
SEARNN: Training RNNs with Global-Local Losses
We propose SEARNN, a novel training algorithm for recurrent neural netwo...

06/27/2012
Consistent Multilabel Ranking through Univariate Losses
We consider the problem of rank loss minimization in the setting of mult...

11/21/2018
Marginal Weighted Maximum Log-likelihood for Efficient Learning of Perturb-and-Map models
We consider the structured-output prediction problem through probabilist...

06/18/2012
The Convexity and Design of Composite Multiclass Losses
We consider composite loss functions for multiclass prediction comprisin...

11/14/2017
Classical Structured Prediction Losses for Sequence to Sequence Learning
There has been much recent work on training neural attention models at t...
