Skeptical Deep Learning with Distribution Correction

11/09/2018
by Mingxiao An, et al.

Deep neural networks have recently been used with great success for various classification tasks, especially when massive, perfectly labeled training data are available. However, large-scale credible labels are often costly to obtain in real-world applications. One solution is to make supervised learning robust to imperfectly labeled input. In this paper, we develop a distribution-correction approach that allows deep neural networks to avoid overfitting imperfect training data. Specifically, we treat the noisy input as samples from an incorrect distribution, which is automatically corrected during training. We test our approach on several classification datasets with carefully generated noisy labels. The results show significantly higher prediction and recovery accuracy with our approach than with alternative methods.
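The abstract does not spell out the correction mechanism, but a common way to treat noisy labels as samples from an incorrect distribution is forward loss correction: pass the model's clean-class probabilities through an assumed label-noise transition matrix before computing cross-entropy against the noisy labels. The sketch below illustrates that general idea; the matrix `T` and the function name are assumptions for illustration, not the paper's exact method.

```python
import numpy as np

def forward_corrected_loss(probs, noisy_labels, T):
    """Cross-entropy against noisy labels after mapping clean-class
    probabilities through an assumed noise-transition matrix T,
    where T[i, j] = P(observed label = j | true label = i).

    probs:        (n, k) array of predicted clean-class probabilities
    noisy_labels: (n,) array of observed (possibly corrupted) labels
    T:            (k, k) row-stochastic transition matrix (assumed known
                  or estimated; here it is an illustrative input)
    """
    corrected = probs @ T  # predicted distribution over *noisy* labels
    n = len(noisy_labels)
    picked = corrected[np.arange(n), noisy_labels]
    return -np.mean(np.log(picked + 1e-12))

# Illustrative use: 20% symmetric label noise between two classes.
probs = np.array([[0.9, 0.1],
                  [0.2, 0.8]])
noisy_labels = np.array([0, 1])
T_noisy = np.array([[0.8, 0.2],
                    [0.2, 0.8]])
loss = forward_corrected_loss(probs, noisy_labels, T_noisy)
```

With `T` equal to the identity matrix this reduces to ordinary cross-entropy; a nontrivial `T` lets the network fit the corrupted label distribution without forcing its clean-class predictions to memorize the noise.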


Related research

- Error-Bounded Correction of Noisy Labels (11/19/2020): To collect large scale annotated data, it is inevitable to introduce lab...
- A Non-Intrusive Correction Algorithm for Classification Problems with Corrupted Data (02/11/2020): A novel correction algorithm is proposed for multi-class classification ...
- A Simple yet Effective Baseline for Robust Deep Learning with Noisy Labels (09/20/2019): Recently deep neural networks have shown their capacity to memorize trai...
- Deep learning from crowds (09/06/2017): Over the last few years, deep learning has revolutionized the field of m...
- Overfitting Mechanism and Avoidance in Deep Neural Networks (01/19/2019): Assisted by the availability of data and high performance computing, dee...
- Robust Feature Learning Against Noisy Labels (07/10/2023): Supervised learning of deep neural networks heavily relies on large-scal...
- Friends and Foes in Learning from Noisy Labels (03/28/2021): Learning from examples with noisy labels has attracted increasing attent...
