Tripartite: Tackle Noisy Labels by a More Precise Partition

02/19/2022
by Xuefeng Liang et al.

Samples in large-scale datasets may be mislabeled for various reasons, and deep neural networks can easily overfit to these noisy labels. The key to tackling this problem is to alleviate the harm of the noisy labels. Many existing methods divide the training data into clean and noisy subsets according to loss values and then process the noisy-label data differently. One factor hindering better performance is hard samples: because hard samples always have relatively large losses whether their labels are clean or noisy, these methods cannot divide them precisely. Instead, we propose a Tripartite solution that partitions the training data more precisely into three subsets: hard, noisy, and clean. The partition criteria are based on the inconsistency between the predictions of two networks, and the inconsistency between the prediction of a network and the given label. To minimize the harm of noisy labels while maximizing the value of noisy-label data, we apply low-weight learning to the hard data and self-supervised learning to the noisy-label data without using the given labels. Extensive experiments demonstrate that Tripartite filters out noisy-label data more precisely and outperforms most state-of-the-art methods on five benchmark datasets, especially on real-world datasets.
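The partition rule described in the abstract can be sketched roughly as follows. This is a minimal illustration of one plausible reading of the stated criteria, not the paper's exact method; the function name tripartite_partition and the two-network setup with logits logits_a and logits_b are assumptions made for the example.

```python
import torch


def tripartite_partition(logits_a: torch.Tensor,
                         logits_b: torch.Tensor,
                         labels: torch.Tensor):
    """Split a batch into hard / clean / noisy boolean masks.

    Assumed reading of the abstract (the paper's exact rule may differ):
      - hard:  the two networks disagree with each other
      - clean: the networks agree with each other and with the given label
      - noisy: the networks agree with each other but not with the given label
    """
    pred_a = logits_a.argmax(dim=1)
    pred_b = logits_b.argmax(dim=1)

    nets_agree = pred_a.eq(pred_b)        # consistency between the two networks
    matches_label = pred_a.eq(labels)     # consistency between a prediction and the given label

    hard_mask = ~nets_agree
    clean_mask = nets_agree & matches_label
    noisy_mask = nets_agree & ~matches_label
    return hard_mask, clean_mask, noisy_mask
```

Under this reading, the clean subset would be trained normally with the given labels, the hard subset with a down-weighted supervised loss, and the noisy subset with a self-supervised objective that ignores the given labels, matching the treatment described in the abstract.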

Related research

06/14/2021 · Over-Fit: Noisy-Label Detection based on the Overfitted Model Property
Due to the increasing need to handle the noisy label problem in a massiv...

07/31/2023 · LaplaceConfidence: a Graph-based Approach for Learning with Noisy Labels
In real-world applications, perfect labels are rarely available, making ...

08/23/2022 · Learning from Noisy Labels with Coarse-to-Fine Sample Credibility Modeling
Training deep neural network (DNN) with noisy labels is practically chal...

12/09/2020 · A Topological Filter for Learning with Label Noise
Noisy labels can impair the performance of deep neural networks. To tack...

03/13/2021 · Ensemble Learning with Manifold-Based Data Splitting for Noisy Label Correction
Label noise in training data can significantly degrade a model's general...

03/07/2017 · Learning from Noisy Labels with Distillation
The ability of learning from noisy labels is very useful in many visual ...

11/19/2022 · Robust AUC Optimization under the Supervision of Clean Data
AUC (area under the ROC curve) optimization algorithms have drawn much a...
