Unbiased Loss Functions for Multilabel Classification with Missing Labels

09/23/2021
by   Erik Schultheis, et al.

This paper considers binary and multilabel classification problems in a setting where labels are missing independently and with a known rate. Missing labels are a ubiquitous phenomenon in extreme multi-label classification (XMC) tasks, such as matching Wikipedia articles to a small subset of the hundreds of thousands of possible tags, where no human annotator can possibly check the validity of all the negative samples. For this reason, propensity-scored precision (an unbiased estimate of precision-at-k under a known noise model) has become one of the standard metrics in XMC. Few methods take this problem into account already during the training phase, and all of them are limited to loss functions that decompose into a sum of contributions from each individual label. A typical approach to training is to reduce the multilabel problem to a series of binary or multiclass problems, and it has been shown that, for the surrogate task to be consistent for optimizing recall, the resulting loss function cannot be decomposed over labels. Therefore, this paper derives the unique unbiased estimators for the different multilabel reductions, including the non-decomposable ones. These estimators suffer from increased variance and may lead to ill-posed optimization problems, which we address by switching to convex upper bounds. The theoretical considerations are further supplemented by an experimental study showing that the switch to unbiased estimators significantly alters the bias-variance trade-off and may thus require stronger regularization, which in some cases can negate the benefits of unbiased estimation.
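To illustrate the idea behind such estimators, here is a minimal sketch of the standard unbiased correction for a *decomposable* per-label loss under the missing-label noise model described above: a true positive is observed with known propensity `p`, while negatives are always observed as negative. The function names (`bce`, `unbiased_loss`) are hypothetical and not from the paper, which goes further by deriving estimators for the non-decomposable reductions as well.

```python
import numpy as np

def bce(score, label):
    """Binary cross-entropy of a sigmoid score against a 0/1 label."""
    prob = 1.0 / (1.0 + np.exp(-score))
    return -(label * np.log(prob) + (1 - label) * np.log(1 - prob))

def unbiased_loss(score, observed_label, propensity):
    """Unbiased estimate of the true per-label loss when a positive
    label is observed only with probability `propensity` and negatives
    are always observed as negative (missing-label noise model)."""
    if observed_label == 1:
        # Upweight the positive term and subtract the spurious negative
        # contribution so the expectation over the missingness process
        # equals the clean-label loss.
        return bce(score, 1) / propensity + (1 - 1 / propensity) * bce(score, 0)
    # An observed negative may be a true negative or a hidden positive;
    # the correction for that case is folded into the branch above.
    return bce(score, 0)
```

Taking the expectation over the missingness for a true positive recovers the clean loss: with probability `p` we get `L(1)/p + (1 - 1/p) L(0)`, with probability `1 - p` we get `L(0)`, and these sum to exactly `L(1)`. The sketch also makes the paper's variance concern visible: the negative weight `1 - 1/p` on `bce(score, 0)` grows in magnitude as the propensity shrinks, which is what can render the optimization ill-posed and motivates the convex upper bounds.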


