Stochastic Negative Mining for Learning with Large Output Spaces

10/16/2018
by Sashank J. Reddi, et al.

We consider the problem of retrieving the most relevant labels for a given input when the size of the output space is very large. Retrieval methods are modeled as set-valued classifiers, which output a small set of classes for each input; a mistake is made if the true label is not in the output set. Despite its practical importance, a statistically principled yet practical solution to this problem is largely missing. To this end, we first define a family of surrogate losses and show that they are calibrated and convex under certain conditions on the loss parameters and data distribution, thereby establishing a statistical and analytical basis for using these losses. Furthermore, we identify a particularly intuitive class of loss functions in this family and show that they are amenable to practical implementation in the large output space setting (i.e., computation is feasible without evaluating the scores of all labels) by developing a technique called Stochastic Negative Mining. We also provide generalization error bounds for the losses in the family. Finally, we conduct experiments demonstrating that Stochastic Negative Mining yields benefits over commonly used negative sampling approaches.
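The core idea of Stochastic Negative Mining, as described in the abstract, can be illustrated with a minimal sketch: sample a small set of candidate negative labels, score only those candidates plus the positive label, and average a surrogate loss over the hardest (highest-scoring) sampled negatives. The function name, signature, and the specific hinge surrogate below are illustrative assumptions, not the paper's reference implementation.

```python
import numpy as np

def snm_loss(score_fn, x, pos_label, num_classes,
             num_sampled=100, k=5, margin=1.0, rng=None):
    """Illustrative sketch of Stochastic Negative Mining.

    score_fn(x, labels) is assumed to return the model's scores for the
    given labels only, so we never evaluate all num_classes scores.
    """
    rng = rng or np.random.default_rng()
    # 1. Sample a small batch of candidate labels uniformly,
    #    then drop the positive label if it was drawn.
    candidates = rng.choice(num_classes, size=num_sampled, replace=False)
    negatives = candidates[candidates != pos_label]
    # 2. Score only the positive label and the sampled negatives.
    pos_score = score_fn(x, np.array([pos_label]))[0]
    neg_scores = score_fn(x, negatives)
    # 3. Mine the k hardest negatives: the highest-scoring sampled ones.
    hardest = np.sort(neg_scores)[-k:]
    # 4. Average a margin-based (hinge-style) surrogate over them.
    return float(np.mean(np.maximum(0.0, margin - pos_score + hardest)))
```

Restricting the surrogate to the top-k sampled negatives is what distinguishes this from plain uniform negative sampling: easy negatives that already score far below the positive contribute nothing, so the gradient signal concentrates on the confusable labels.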


