Analysis of classifiers robust to noisy labels

06/01/2021
by Alex Díaz, et al.

We explore contemporary robust classification algorithms for overcoming class-dependent labelling noise: Forward correction, Importance Re-weighting and T-revision. The classifiers are trained and evaluated on data with class-conditional random label noise, while the final test data is clean. We demonstrate methods for estimating the transition matrix in order to obtain better classifier performance when working with noisy data. We apply deep learning to three datasets and carry out an end-to-end analysis, from scratch, on the CIFAR dataset with unknown noise. The effectiveness and robustness of the classifiers are analysed, and we compare and contrast the results of each experiment using top-1 accuracy as our criterion.
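
Below is a minimal sketch of the forward loss-correction idea referred to in the abstract, assuming a PyTorch-style classifier that outputs raw logits and a known (or previously estimated) row-stochastic transition matrix T with T[i, j] = P(noisy label = j | clean label = i). The function name and the example noise rate are illustrative and not taken from the paper.

    import torch
    import torch.nn.functional as F

    def forward_corrected_loss(logits, noisy_labels, T):
        # p(clean class | x) from the model's raw logits
        clean_posterior = F.softmax(logits, dim=1)
        # p(noisy class | x) = p(clean | x) @ T, since T[i, j] = P(noisy = j | clean = i)
        noisy_posterior = clean_posterior @ T
        # Cross-entropy against the observed noisy labels
        return F.nll_loss(torch.log(noisy_posterior.clamp(min=1e-12)), noisy_labels)

    # Illustrative 3-class example with symmetric noise of rate 0.2
    T = torch.tensor([[0.8, 0.1, 0.1],
                      [0.1, 0.8, 0.1],
                      [0.1, 0.1, 0.8]])
    logits = torch.randn(4, 3)               # e.g. outputs of a CNN on a batch of 4 images
    noisy_labels = torch.randint(0, 3, (4,))
    loss = forward_corrected_loss(logits, noisy_labels, T)

Training against the noisy posterior in this way lets the minimiser of the corrected loss coincide with the classifier that would be learned from clean labels, which is why a good estimate of T matters.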
