Understanding Memorization from the Perspective of Optimization via Efficient Influence Estimation

12/16/2021
by Futong Liu, et al.

Over-parameterized deep neural networks are able to achieve excellent training accuracy while maintaining a small generalization error. It has also been found that they can fit arbitrary labels, a behaviour referred to as the phenomenon of memorization. In this work, we study memorization with turn-over dropout, an efficient method for estimating influence and memorization, on data with true labels (real data) and data with random labels (random data). Our main findings are: (i) For both real data and random data, the network optimizes easy examples (e.g., real data) and difficult examples (e.g., random data) simultaneously, fitting the easy ones at a higher speed; (ii) For real data, a correctly labeled difficult example in the training dataset is more informative than an easy one. By showing that memorization occurs on both random data and real data, we highlight their consistency with respect to optimization and emphasize the implications of memorization during optimization.
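To make the influence-estimation idea concrete, below is a minimal PyTorch sketch of turn-over dropout. The mask-derivation scheme, the toy MaskedMLP architecture, and the helper names (instance_masks, self_influence) are illustrative assumptions for exposition, not the authors' implementation: each training instance is assigned a deterministic dropout mask, so it only ever updates one sub-network, and its influence on an example is estimated as the loss gap between the complementary sub-network (which never saw the instance) and its own sub-network.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

def instance_masks(idx: int, width: int, p: float = 0.5):
    # Deterministically derive a dropout mask from the instance index, so the
    # same example always trains the same sub-network. The complementary
    # ("flipped") sub-network never sees the example. With p = 0.5 the two
    # sub-networks have the same expected size.
    g = torch.Generator().manual_seed(idx)
    keep = torch.rand(width, generator=g) > p
    mask = keep.float() / (1.0 - p)   # sub-network trained on this instance
    flipped = (~keep).float() / p     # sub-network that never saw it
    return mask, flipped

class MaskedMLP(nn.Module):
    # Toy classifier; the per-instance mask replaces ordinary random dropout.
    def __init__(self, d_in: int, width: int, n_classes: int):
        super().__init__()
        self.fc1 = nn.Linear(d_in, width)
        self.fc2 = nn.Linear(width, n_classes)

    def forward(self, x: torch.Tensor, mask: torch.Tensor) -> torch.Tensor:
        h = F.relu(self.fc1(x)) * mask
        return self.fc2(h)

def self_influence(model: MaskedMLP, x, y, idx: int, width: int) -> float:
    # Memorization (self-influence) of instance idx: how much worse the
    # sub-network that never trained on the instance does on it, compared
    # with the sub-network that did.
    mask, flipped = instance_masks(idx, width)
    with torch.no_grad():
        loss_unseen = F.cross_entropy(model(x, flipped), y)
        loss_seen = F.cross_entropy(model(x, mask), y)
    return (loss_unseen - loss_seen).item()

# Hypothetical usage: during training, apply instance_masks(idx, 512)[0] to
# each example's hidden layer; after training, score memorization per example.
model = MaskedMLP(d_in=784, width=512, n_classes=10)
x, y = torch.randn(1, 784), torch.tensor([3])
score = self_influence(model, x, y, idx=17, width=512)
```

Under this scheme, an example with large self-influence is one that only its own sub-network can fit, which is the signature of memorization; randomly labeled examples would be expected to score high, since no other instance supplies evidence for their labels.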
