MentorNet: Regularizing Very Deep Neural Networks on Corrupted Labels

12/14/2017
by   Lu Jiang, et al.

Recent studies have discovered that deep networks are capable of memorizing an entire dataset even when its labels are completely random. Since deep models are trained on big data whose labels are often noisy, this ability to overfit noise can lead to poor performance. To overcome overfitting on corrupted training data, we propose a novel technique to regularize deep networks in the data dimension. This is achieved by learning a neural network called MentorNet to supervise the training of the base network, namely the StudentNet. Our work is inspired by curriculum learning and advances it by learning a data-driven curriculum with a neural network. We demonstrate the efficacy of MentorNet on several benchmarks. Comprehensive experiments show that it significantly improves the generalization performance of state-of-the-art deep networks trained on corrupted data.
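
The core idea, reweighting each training example according to how a mentor judges its current loss, can be illustrated with a minimal sketch. This is not the paper's implementation: the Mentor and Student classes below, their architectures, and the choice of feeding only the per-example loss and training progress to the mentor are assumptions made for the illustration, and the mentor is kept fixed here rather than learned.

```python
# Minimal sketch (illustrative only, not the authors' MentorNet): a small
# "mentor" network maps each example's current loss to a weight in [0, 1],
# and the student is trained on the weighted loss.
import torch
import torch.nn as nn
import torch.nn.functional as F

class Student(nn.Module):
    """Base classifier (the StudentNet) whose training is supervised."""
    def __init__(self, in_dim=20, num_classes=2):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(in_dim, 64), nn.ReLU(), nn.Linear(64, num_classes))

    def forward(self, x):
        return self.net(x)

class Mentor(nn.Module):
    """Maps (per-example loss, training progress) to a weight in [0, 1]."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(2, 16), nn.Tanh(), nn.Linear(16, 1))

    def forward(self, loss, progress):
        feats = torch.stack([loss, progress.expand_as(loss)], dim=1)
        return torch.sigmoid(self.net(feats)).squeeze(1)

student, mentor = Student(), Mentor()
opt = torch.optim.SGD(student.parameters(), lr=0.1)

# Toy data standing in for a dataset with (possibly) corrupted labels.
x = torch.randn(256, 20)
y = torch.randint(0, 2, (256,))

num_epochs = 10
for epoch in range(num_epochs):
    logits = student(x)
    per_example_loss = F.cross_entropy(logits, y, reduction="none")
    progress = torch.tensor(epoch / num_epochs)
    with torch.no_grad():  # mentor weights are treated as constants here
        w = mentor(per_example_loss, progress)
    weighted_loss = (w * per_example_loss).mean()
    opt.zero_grad()
    weighted_loss.backward()
    opt.step()
```

In the paper the mentor is itself learned and conditions on richer training signals; the sketch only shows where such a learned weighting would plug into the student's training loop.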


Related research

05/07/2021  Self-paced Resistance Learning against Overfitting on Noisy Labels
Noisy labels composed of correct and corrupted ones are pervasive in pra...

11/06/2019  Searching to Exploit Memorization Effect in Learning from Corrupted Labels
Sample-selection approaches, which attempt to pick up clean instances fr...

09/06/2019  Mass Personalization of Deep Learning
We discuss training techniques, objectives and metrics toward mass perso...

12/25/2019  SketchTransfer: A Challenging New Task for Exploring Detail-Invariance and the Abstractions Learned by Deep Networks
Deep networks have achieved excellent results in perceptual tasks, yet t...

04/16/2021  Deep Stable Learning for Out-Of-Distribution Generalization
Approaches based on deep neural networks have achieved striking performa...

01/05/2022  Corrupting Data to Remove Deceptive Perturbation: Using Preprocessing Method to Improve System Robustness
Although deep neural networks have achieved great performance on classif...
