Learning Noise Transition Matrix from Only Noisy Labels via Total Variation Regularization

02/04/2021
by Yivan Zhang, et al.

Many weakly supervised classification methods employ a noise transition matrix to capture the class-conditional label corruption. To estimate the transition matrix from noisy data, existing methods often need to estimate the noisy class-posterior, which could be unreliable due to the overconfidence of neural networks. In this work, we propose a theoretically grounded method that can estimate the noise transition matrix and learn a classifier simultaneously, without relying on the error-prone noisy class-posterior estimation. Concretely, inspired by the characteristics of the stochastic label corruption process, we propose total variation regularization, which encourages the predicted probabilities to be more distinguishable from each other. Under mild assumptions, the proposed method yields a consistent estimator of the transition matrix. We show the effectiveness of the proposed method through experiments on benchmark and real-world datasets.
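
To make the idea concrete, below is a minimal PyTorch-style sketch of what such a training objective could look like, based only on the description above: a learnable row-stochastic transition matrix maps predicted clean class-posteriors to noisy ones for the likelihood term, and a pairwise total-variation term pushes the predicted clean posteriors apart. The names (`TransitionMatrix`, `pairwise_total_variation`, `lambda_tv`) and the exact form of the regularizer are illustrative assumptions, not the authors' implementation.

```python
# Hedged sketch of joint transition-matrix and classifier training with a
# total-variation-style regularizer, following the abstract's description.
import torch
import torch.nn as nn
import torch.nn.functional as F


class TransitionMatrix(nn.Module):
    """Learnable row-stochastic matrix T, with T[i, j] ~ p(noisy = j | clean = i)."""

    def __init__(self, num_classes: int):
        super().__init__()
        # Initialise near the identity so training starts close to "no noise".
        self.logits = nn.Parameter(torch.eye(num_classes) * 4.0)

    def forward(self) -> torch.Tensor:
        return torch.softmax(self.logits, dim=1)  # each row sums to 1


def pairwise_total_variation(p: torch.Tensor) -> torch.Tensor:
    """Mean pairwise total-variation distance between predicted clean
    class-posteriors in a mini-batch; p has shape [batch, num_classes]."""
    diff = p.unsqueeze(0) - p.unsqueeze(1)      # [B, B, C]
    return 0.5 * diff.abs().sum(dim=-1).mean()


def loss_fn(clean_probs, noisy_labels, T, lambda_tv=0.1):
    # Forward correction: predicted noisy posterior = clean posterior @ T.
    noisy_probs = clean_probs @ T()             # [B, C]
    nll = F.nll_loss(torch.log(noisy_probs + 1e-12), noisy_labels)
    # Encourage predictions to be distinguishable from each other by
    # maximising the pairwise TV term, hence the minus sign.
    return nll - lambda_tv * pairwise_total_variation(clean_probs)


if __name__ == "__main__":
    torch.manual_seed(0)
    num_classes, batch = 4, 16
    model = nn.Sequential(nn.Linear(8, 32), nn.ReLU(), nn.Linear(32, num_classes))
    T = TransitionMatrix(num_classes)
    opt = torch.optim.Adam(list(model.parameters()) + list(T.parameters()), lr=1e-3)

    x = torch.randn(batch, 8)                   # toy features
    y_noisy = torch.randint(0, num_classes, (batch,))  # toy noisy labels
    opt.zero_grad()
    clean_probs = torch.softmax(model(x), dim=1)
    loss = loss_fn(clean_probs, y_noisy, T)
    loss.backward()
    opt.step()
    print(float(loss))
```

The weight lambda_tv is a hypothetical hyperparameter; in the paper's framing the regularizer is what lets the transition matrix be identified from noisy labels alone, without a separate noisy class-posterior estimation step.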

Related research

06/14/2020
Dual T: Reducing Estimation Error for Transition Matrix in Label-noise Learning
The transition matrix, denoting the transition relationship from clean l...

05/02/2022
From Noisy Prediction to True Label: Noisy Prediction Calibration via Generative Model
Noisy labels are inevitable yet problematic in machine learning society....

03/26/2020
Matrix Smoothing: A Regularization for DNN with Transition Matrix under Noisy Labels
Training deep neural networks (DNNs) in the presence of noisy labels is ...

02/19/2023
Latent Class-Conditional Noise Model
Learning with noisy labels has become imperative in the Big Data era, wh...

02/04/2021
Provably End-to-end Label-Noise Learning without Anchor Points
In label-noise learning, the transition matrix plays a key role in build...

03/06/2019
Safeguarded Dynamic Label Regression for Generalized Noisy Supervision
Learning with noisy labels, which aims to reduce expensive labors on acc...

02/25/2023
Complementary to Multiple Labels: A Correlation-Aware Correction Approach
Complementary label learning (CLL) requires annotators to give irrelevan...
