
Noise-Robust Bidirectional Learning with Dynamic Sample Reweighting

by Chen-Chen Zong, et al.
Nanjing University of Aeronautics and Astronautics

Deep neural networks trained with the standard cross-entropy loss are prone to memorizing noisy labels, which degrades their performance. Negative learning with complementary labels is more robust to label noise but suffers from extremely slow model convergence. In this paper, we first introduce a bidirectional learning scheme, in which positive learning ensures fast convergence while negative learning robustly copes with label noise. We then propose a dynamic sample reweighting strategy that globally weakens the effect of noise-labeled samples by exploiting negative learning's strong ability to discriminate clean from noisy samples through their probability distributions. In addition, we combine self-distillation to further improve model performance. The code is available at <>.
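To make the two learning directions concrete, here is a minimal NumPy sketch of the loss functions involved. Positive learning applies ordinary cross-entropy to the given (possibly noisy) label, while negative learning penalizes confidence in a complementary label, i.e. a class the sample is asserted *not* to belong to. The function names and the uniform complementary-label sampling are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def positive_learning_loss(probs, labels):
    # Positive learning: standard cross-entropy on the given label,
    # L_PL = -log p_y. Fast to converge, but memorizes noisy labels.
    # probs: (N, C) softmax outputs; labels: (N,) class indices.
    return -np.log(probs[np.arange(len(labels)), labels])

def negative_learning_loss(probs, comp_labels):
    # Negative learning: the complementary label ȳ says "this sample
    # is NOT class ȳ", so the loss pushes p_ȳ toward zero,
    # L_NL = -log(1 - p_ȳ). Robust to noise, but converges slowly.
    return -np.log(1.0 - probs[np.arange(len(comp_labels)), comp_labels])

def sample_complementary_labels(labels, num_classes, rng):
    # One common choice (an assumption here): draw ȳ uniformly from
    # the classes other than the given label.
    offsets = rng.integers(1, num_classes, size=len(labels))
    return (labels + offsets) % num_classes
```

For example, for a sample with softmax output `[0.7, 0.2, 0.1]`, given label 0, and complementary label 2, the positive loss is `-log(0.7)` and the negative loss is `-log(1 - 0.1)`. A bidirectional scheme optimizes a combination of both terms so that positive learning drives convergence while negative learning limits the damage from mislabeled samples.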



