Instance-specific Label Distribution Regularization for Learning with Label Noise

12/16/2022
by Zehui Liao, et al.

Modeling the noise transition matrix is a promising approach to learning with label noise. Given an estimated noise transition matrix and the noisy posterior probabilities, the clean posterior probabilities, jointly referred to as the Label Distribution (LD) in this paper, can be computed and used as supervision. To estimate the noise transition matrix reliably, some methods assume that anchor points are available during training; if the anchor points are invalid, however, the transition matrix may be poorly learned, leading to poor performance. Other methods therefore treat reliable data points extracted from the training data as pseudo anchor points. Yet, from a statistical point of view, the noise transition matrix can be inferred from noisily labeled data alone under the clean-label-domination assumption, so we aim to estimate it without (pseudo) anchor points. There is evidence that samples are more likely to be mislabeled as similar classes, which suggests that the mislabeling probability is highly correlated with the inter-class correlation. Inspired by this observation, we propose instance-specific Label Distribution Regularization (LDR), which estimates an instance-specific LD as the supervision to prevent DCNNs from memorizing noisy labels. Specifically, we estimate the noisy posterior under the supervision of noisy labels, and approximate the batch-level noise transition matrix by estimating the inter-class correlation matrix, using neither anchor points nor pseudo anchor points. Experimental results on two synthetic noisy datasets and two real-world noisy datasets demonstrate that LDR outperforms existing methods.
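To make the transition-matrix machinery concrete, the NumPy sketch below shows the kind of computation the abstract alludes to: recovering clean posteriors (the Label Distribution) from noisy posteriors via an estimated transition matrix, and forming a rough batch-level inter-class correlation estimate without anchor points. The function names, the assumed relation noisy_post = clean_post @ T, and the batch-level estimator are illustrative assumptions for exposition, not the authors' implementation.

import numpy as np

def clean_posterior_from_transition(noisy_post, T, eps=1e-8):
    # noisy_post: (N, C) rows estimate P(noisy label = j | x).
    # T: (C, C) noise transition matrix with T[i, j] = P(noisy = j | clean = i).
    # Assumed relation: noisy_post = clean_post @ T, hence clean_post = noisy_post @ inv(T).
    clean_post = noisy_post @ np.linalg.inv(T)
    clean_post = np.clip(clean_post, eps, None)  # the inverse can leave small negatives
    return clean_post / clean_post.sum(axis=1, keepdims=True)

def batch_interclass_correlation(noisy_post, noisy_labels, num_classes, eps=1e-8):
    # Illustrative batch-level stand-in: average the predicted posterior over the
    # samples carrying each noisy label, then row-normalize. This only mimics the
    # idea of replacing anchor points with an inter-class correlation estimate.
    M = np.full((num_classes, num_classes), eps)
    for j in range(num_classes):
        rows = noisy_post[noisy_labels == j]
        if len(rows) > 0:
            M[j] += rows.mean(axis=0)
    return M / M.sum(axis=1, keepdims=True)

# Toy usage with 3 classes and a mildly diagonal transition matrix.
T = np.array([[0.8, 0.1, 0.1],
              [0.1, 0.8, 0.1],
              [0.1, 0.1, 0.8]])
noisy_post = np.array([[0.7, 0.2, 0.1],
                       [0.2, 0.6, 0.2]])
print(clean_posterior_from_transition(noisy_post, T))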

