Cooperative Learning for Noisy Supervision

08/11/2021
by Hao Wu et al.

Learning with noisy labels has gained enormous interest in the area of robust deep learning. Recent studies have empirically shown that using dual networks can improve on the performance of a single network, but without theoretical proof. In this paper, we propose the Cooperative Learning (CooL) framework for noisy supervision, which analytically explains the effect of leveraging dual or multiple networks. Specifically, the simple but efficient combination in CooL yields more reliable risk minimization for unseen clean data. A range of experiments has been conducted on several benchmarks under both synthetic and real-world noise settings; the results indicate that CooL outperforms several state-of-the-art methods.
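The abstract does not spell out CooL's combination rule, so the following is only a minimal sketch of the general dual-network idea it builds on: two independently initialized networks are trained on the same noisy labels, and their softmax outputs are combined at prediction time. The averaging rule, the SmallNet backbone, and the cooperative_predict helper are illustrative assumptions, not the paper's actual method.

```python
# Hedged sketch of dual-network learning under label noise (PyTorch).
# The mean-of-softmax combination below is an assumption; the paper only
# describes a "simple but efficient combination" of dual networks.
import torch
import torch.nn as nn
import torch.nn.functional as F

class SmallNet(nn.Module):
    """Toy classifier standing in for any backbone."""
    def __init__(self, in_dim=32, num_classes=10):
        super().__init__()
        self.body = nn.Sequential(
            nn.Linear(in_dim, 64), nn.ReLU(), nn.Linear(64, num_classes)
        )

    def forward(self, x):
        return self.body(x)

def train_step(net, opt, x, noisy_y):
    """One SGD step for a single network on (possibly mislabeled) data."""
    opt.zero_grad()
    loss = F.cross_entropy(net(x), noisy_y)
    loss.backward()
    opt.step()
    return loss.item()

@torch.no_grad()
def cooperative_predict(nets, x):
    """Average the networks' class posteriors and take the argmax.
    Averaging is a placeholder combination rule, not CooL's."""
    probs = torch.stack([F.softmax(net(x), dim=1) for net in nets])
    return probs.mean(dim=0).argmax(dim=1)

if __name__ == "__main__":
    torch.manual_seed(0)
    nets = [SmallNet(), SmallNet()]  # dual networks, different initializations
    opts = [torch.optim.SGD(n.parameters(), lr=0.1) for n in nets]
    x = torch.randn(128, 32)
    noisy_y = torch.randint(0, 10, (128,))  # stand-in for corrupted labels
    for _ in range(50):
        for net, opt in zip(nets, opts):
            train_step(net, opt, x, noisy_y)
    print("combined predictions:", cooperative_predict(nets, x)[:10].tolist())
```

With independent initializations the two networks tend to disagree on mislabeled examples, which is the intuition behind most dual-network methods; a combined posterior is correspondingly less sensitive to either network memorizing the noise.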

Related research

- Collaborative Label Correction via Entropy Thresholding (03/31/2021): Deep neural networks (DNNs) have the capacity to fit extremely noisy lab...
- Deep Self-Learning From Noisy Labels (08/06/2019): ConvNets achieve good results when training from clean data, but learnin...
- LaplaceConfidence: a Graph-based Approach for Learning with Noisy Labels (07/31/2023): In real-world applications, perfect labels are rarely available, making ...
- PENCIL: Deep Learning with Noisy Labels (02/17/2022): Deep learning has achieved excellent performance in various computer vis...
- Agreement or Disagreement in Noise-tolerant Mutual Learning? (03/29/2022): Deep learning has made many remarkable achievements in many fields but s...
- Suppressing Mislabeled Data via Grouping and Self-Attention (10/29/2020): Deep networks achieve excellent results on large-scale clean data but de...
- Learning from Data with Noisy Labels Using Temporal Self-Ensemble (07/21/2022): There are inevitably many mislabeled data in real-world datasets. Becaus...
