
Cooperative Learning for Noisy Supervision

by Hao Wu, et al.
Shanghai Jiao Tong University

Learning with noisy labels has gained enormous interest in the robust deep learning area. Recent studies have empirically shown that using dual networks can improve on the performance of a single network, but without theoretical justification. In this paper, we propose the Cooperative Learning (CooL) framework for noisy supervision, which analytically explains the effect of leveraging dual or multiple networks. Specifically, the simple but efficient combination in CooL yields more reliable risk minimization on unseen clean data. A range of experiments have been conducted on several benchmarks with both synthetic and real-world noise settings. Extensive results indicate that CooL outperforms several state-of-the-art methods.
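The abstract does not spell out CooL's exact combination rule, but the dual-network idea it builds on can be sketched in a few lines. Below is a minimal, illustrative NumPy sketch (not the authors' method): two networks' class-probability outputs are combined by simple averaging, and training samples are filtered with the small-loss heuristic commonly used by dual-network methods such as Co-teaching. The function names and the 0.7 keep ratio are assumptions for illustration only.

```python
import numpy as np

def combine_predictions(p1, p2):
    """Average two networks' class-probability outputs.

    Illustrative only: the paper's actual CooL combination is not
    given in the abstract, so a simple mean is used here.
    """
    return (p1 + p2) / 2.0

def select_small_loss(probs, noisy_labels, keep_ratio=0.7):
    """Return indices of the samples with the smallest cross-entropy
    against the (possibly noisy) labels -- the standard small-loss
    selection heuristic from the dual-network literature."""
    n = len(noisy_labels)
    # Per-sample cross-entropy of the predicted probability for the
    # given (noisy) label; small loss ~ likely clean sample.
    losses = -np.log(probs[np.arange(n), noisy_labels] + 1e-12)
    keep = max(1, int(keep_ratio * n))
    return np.argsort(losses)[:keep]
```

For example, with four samples where two are confidently fit and two are not, `select_small_loss` keeps the two low-loss samples; the other network would then be trained only on those, which is how dual-network schemes avoid memorizing noisy labels.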


