Sharp Analysis of Learning with Discrete Losses

10/16/2018
by Alex Nowak-Vila et al.

The problem of devising learning strategies for discrete losses (e.g., multilabeling, ranking) is currently addressed with methods and theoretical analyses that are ad hoc to each loss. In this paper we study a least-squares framework for systematically designing learning algorithms for discrete losses, with quantitative characterizations of their statistical and computational complexity. In particular, we improve on existing results by providing an explicit dependence on the number of labels for a wide class of losses, and faster learning rates under low-noise conditions. The theoretical results are complemented by experiments on real datasets, showing the effectiveness of the proposed general approach.
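
To make the least-squares framework concrete, below is a minimal sketch of the plug-in strategy it builds on: represent the discrete loss by its loss matrix, estimate the conditional label probabilities by least squares, and decode by minimizing the estimated expected loss. The kernel ridge regressor, the 0-1 toy loss, and all function names here are illustrative assumptions, not the paper's exact construction.

```python
# Hedged sketch of a least-squares (plug-in) approach to a discrete loss.
# The regressor choice and the toy 0-1 loss are assumptions for illustration.
import numpy as np
from sklearn.kernel_ridge import KernelRidge

def fit_surrogate(X, y, n_classes, alpha=1e-2):
    """Least-squares estimate of the conditional label distribution.

    Encode each label y as a one-hot vector phi(y) and regress it on X;
    the fitted map g(x) then approximates E[phi(Y) | X = x].
    """
    Phi = np.eye(n_classes)[y]                 # one-hot embedding phi(y)
    model = KernelRidge(alpha=alpha, kernel="rbf")
    model.fit(X, Phi)                          # multi-output least squares
    return model

def decode(model, X, loss_matrix):
    """Plug-in decoding: predict argmin_z  sum_y L(z, y) * g_y(x)."""
    G = model.predict(X)                       # estimated conditional expectations
    scores = G @ loss_matrix.T                 # expected loss of each candidate z
    return scores.argmin(axis=1)

# Toy usage with the 0-1 loss L(z, y) = 1[z != y] (assumed example):
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2))
y = (X[:, 0] + 0.3 * rng.normal(size=200) > 0).astype(int)
L = 1.0 - np.eye(2)                            # 0-1 loss matrix
model = fit_surrogate(X, y, n_classes=2)
preds = decode(model, X, L)
```

Any least-squares-consistent regressor could stand in for the kernel ridge step; the structure of the discrete loss enters only through the decoding step, which is what lets one analysis cover many losses at once.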

Related research

05/10/2021
Rethinking and Reweighting the Univariate Losses for Multi-Label Ranking: Consistency and Generalization
(Partial) ranking loss is a commonly used evaluation measure for multi-l...

12/06/2018
Theoretical Guarantees of Deep Embedding Losses Under Label Noise
Collecting labeled data to train deep neural networks is costly and even...

06/10/2021
Leveraged Weighted Loss for Partial Label Learning
As an important branch of weakly supervised learning, partial label lear...

12/14/2020
A Perturbation Resilient Framework for Unsupervised Learning
Designing learning algorithms that are resistant to perturbations of the...

01/30/2022
Do We Need to Penalize Variance of Losses for Learning with Label Noise?
Algorithms which minimize the averaged loss have been widely designed fo...

01/27/2023
LegendreTron: Uprising Proper Multiclass Loss Learning
Loss functions serve as the foundation of supervised learning and are of...

02/20/2020
Learning with Differentiable Perturbed Optimizers
Machine learning pipelines often rely on optimization procedures to make...
