Theoretical Guarantees of Deep Embedding Losses Under Label Noise

12/06/2018
by   Nam Le, et al.

Collecting labeled data to train deep neural networks is costly and even impractical for many tasks. Thus, research effort has focused on automatically curated datasets and on unsupervised and weakly supervised learning. The common problem in these directions is learning with unreliable label information. In this paper, we address the tolerance of deep embedding learning losses against label noise, i.e. when the observed labels differ from the true labels. Specifically, we provide sufficient conditions to achieve theoretical guarantees for two common loss functions: the marginal loss and the triplet loss. From these theoretical results, we can estimate how sampling strategies and initialization affect the level of resistance against label noise. The analysis also helps provide more effective guidelines for unsupervised and weakly supervised deep embedding learning.
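For reference, the two losses named in the abstract can be sketched as follows. This is a generic illustration, not code from the paper: the triplet loss is the standard formulation over (anchor, positive, negative) embeddings, and the marginal loss is written in the common pairwise form with margin `alpha` and boundary `beta` (both hyperparameter names are illustrative). Under label noise, a mislabeled positive or negative silently swaps the pull/push roles of the terms, which is the failure mode the paper's guarantees bound.

```python
import numpy as np

def triplet_loss(anchor, positive, negative, margin=0.2):
    """Triplet loss on embedding vectors.

    Pulls the anchor toward the positive (same observed label) and
    pushes it away from the negative (different observed label).
    """
    d_pos = np.sum((anchor - positive) ** 2)  # squared distance to positive
    d_neg = np.sum((anchor - negative) ** 2)  # squared distance to negative
    return max(0.0, d_pos - d_neg + margin)

def marginal_loss(d, y, alpha=0.2, beta=1.2):
    """Pairwise marginal loss.

    d: distance between the two embeddings in the pair.
    y: +1 for same-label pairs, -1 for different-label pairs.
    Same-label pairs are penalized for being farther than beta - alpha;
    different-label pairs for being closer than beta + alpha.
    """
    return max(0.0, alpha + y * (d - beta))
```

A flipped label changes `y` (or the positive/negative assignment), so the gradient of either loss points in the wrong direction for that sample; the paper's conditions characterize when the aggregate loss still behaves as if labels were clean.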


