An Exploration into why Output Regularization Mitigates Label Noise

04/26/2021
by Neta Shoham, et al.

Label noise presents a real challenge for supervised learning algorithms. Consequently, mitigating label noise has attracted immense research attention in recent years. Noise-robust losses are one of the more promising approaches for dealing with label noise, as these methods only require changing the loss function and do not require redesigning the classifier itself, which can be expensive in terms of development time. In this work we focus on losses that use output regularization (such as label smoothing and entropy). Although these losses perform well in practice, their ability to mitigate label noise lacks mathematical rigor. We aim to close this gap by showing that losses which incorporate an output regularization term become symmetric as the regularization coefficient goes to infinity. We argue that the regularization coefficient can be seen as a hyper-parameter controlling the symmetry, and thus the noise robustness, of the loss function.
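The symmetry property referred to above is the standard one from the noise-robust-loss literature: a loss L is symmetric when the sum of L(p, k) over all labels k is a constant, independent of the prediction p. The sketch below is not from the paper itself; it is a minimal numeric check of that condition, using MAE (known to be symmetric) and cross-entropy (known not to be) as illustrative examples:

```python
import numpy as np

def mae_loss(p, k):
    # L1 distance between prediction p and the one-hot label e_k.
    e = np.zeros_like(p)
    e[k] = 1.0
    return np.abs(e - p).sum()

def ce_loss(p, k):
    # Standard cross-entropy with a one-hot label.
    return -np.log(p[k])

def label_sum(loss, p):
    # Symmetry condition: sum_k loss(p, k) should equal a constant
    # that does not depend on the prediction p.
    return sum(loss(p, k) for k in range(len(p)))

rng = np.random.default_rng(0)
K = 4
for _ in range(3):
    p = rng.dirichlet(np.ones(K))  # random point on the simplex
    # MAE sums to 2(K-1) for every p; the CE sum varies with p.
    print(f"MAE sum: {label_sum(mae_loss, p):.4f}  "
          f"CE sum: {label_sum(ce_loss, p):.4f}")
```

For MAE the per-label sum is 2(K-1) regardless of p, which is exactly why MAE tolerates symmetric label noise; cross-entropy fails the condition, and the paper's argument is that a growing output-regularization coefficient pushes such losses toward satisfying it.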


Related research

10/28/2022  The Fisher-Rao Loss for Learning under Label Noise
Choosing a suitable loss function is essential when learning by empirica...

12/08/2022  Logit Clipping for Robust Learning against Label Noise
In the presence of noisy labels, designing robust loss functions is crit...

08/24/2022  Self-Filtering: A Noise-Aware Sample Selection for Label Noise with Confidence Penalization
Sample selection is an effective strategy to mitigate the effect of labe...

08/23/2023  ACLS: Adaptive and Conditional Label Smoothing for Network Calibration
We address the problem of network calibration adjusting miscalibrated co...

07/13/2020  TrustNet: Learning from Trusted Data Against (A)symmetric Label Noise
Robustness to label noise is a critical property for weakly-supervised c...

01/30/2022  Do We Need to Penalize Variance of Losses for Learning with Label Noise?
Algorithms which minimize the averaged loss have been widely designed fo...

07/08/2022  A law of adversarial risk, interpolation, and label noise
In supervised learning, it has been shown that label noise in the data c...
