Cooperative Learning for Noisy Supervision

08/11/2021
by Hao Wu, et al.
Shanghai Jiao Tong University

Learning with noisy labels has gained enormous interest in the area of robust deep learning. Recent studies have empirically shown that using dual networks can improve on the performance of a single network, but without theoretical justification. In this paper, we propose the Cooperative Learning (CooL) framework for noisy supervision, which analytically explains the effect of leveraging dual or multiple networks. Specifically, the simple yet efficient combination in CooL yields a more reliable risk minimization for unseen clean data. Experiments have been conducted on several benchmarks under both synthetic and real-world noise settings, and extensive results indicate that CooL outperforms several state-of-the-art methods.
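
The abstract does not spell out the combination rule, so the following is only a minimal PyTorch sketch of the general dual-network idea it describes: two peer networks whose softmax outputs are averaged into a combined prediction, which is then trained against the (possibly noisy) labels. The class DualNetCombiner, the function combined_risk, and the averaging rule are illustrative assumptions, not the paper's actual CooL method.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class DualNetCombiner(nn.Module):
    """Two peer networks whose softmax outputs are averaged.

    The averaging rule is a hypothetical stand-in; the paper's exact
    CooL combination is not given on this page.
    """

    def __init__(self, net_a: nn.Module, net_b: nn.Module):
        super().__init__()
        self.net_a = net_a
        self.net_b = net_b

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        p_a = F.softmax(self.net_a(x), dim=1)
        p_b = F.softmax(self.net_b(x), dim=1)
        return 0.5 * (p_a + p_b)  # simple average of the two predictions

def combined_risk(probs: torch.Tensor, noisy_labels: torch.Tensor) -> torch.Tensor:
    # Negative log-likelihood of the combined prediction against the
    # (possibly noisy) labels; averaging can reduce variance when the
    # two networks make uncorrelated mistakes.
    return F.nll_loss(torch.log(probs.clamp_min(1e-8)), noisy_labels)

# Toy usage: two small classifiers updated jointly on one batch.
net_a = nn.Sequential(nn.Flatten(), nn.Linear(28 * 28, 10))
net_b = nn.Sequential(nn.Flatten(), nn.Linear(28 * 28, 10))
model = DualNetCombiner(net_a, net_b)
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)

x = torch.randn(32, 1, 28, 28)   # fake image batch
y = torch.randint(0, 10, (32,))  # possibly noisy labels
loss = combined_risk(model(x), y)
optimizer.zero_grad()
loss.backward()
optimizer.step()
```

One design point worth noting: training both networks through a single combined loss couples them, so gradients from the shared prediction flow into each peer, which is one plausible reading of "cooperative" training; the full paper should be consulted for the actual formulation.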


Related research

03/31/2021
Collaborative Label Correction via Entropy Thresholding
Deep neural networks (DNNs) have the capacity to fit extremely noisy lab...

08/06/2019
Deep Self-Learning From Noisy Labels
ConvNets achieve good results when training from clean data, but learnin...

08/14/2020
Which Strategies Matter for Noisy Label Classification? Insight into Loss and Uncertainty
Label noise is a critical factor that degrades the generalization perfor...

06/14/2021
Over-Fit: Noisy-Label Detection based on the Overfitted Model Property
Due to the increasing need to handle the noisy label problem in a massiv...

04/06/2022
Style-Hallucinated Dual Consistency Learning for Domain Generalized Semantic Segmentation
In this paper, we study the task of synthetic-to-real domain generalized...

11/19/2022
Robust AUC Optimization under the Supervision of Clean Data
AUC (area under the ROC curve) optimization algorithms have drawn much a...

10/29/2020
Suppressing Mislabeled Data via Grouping and Self-Attention
Deep networks achieve excellent results on large-scale clean data but de...