Double Descent of Discrepancy: A Task-, Data-, and Model-Agnostic Phenomenon

05/25/2023
by Yifan Luo, et al.

In this paper, we studied two identically trained neural networks (i.e., networks with the same architecture, trained on the same dataset using the same algorithm, but from different random initializations) and found that the discrepancy between their outputs on the training dataset exhibits a "double descent" phenomenon. Through extensive experiments across various tasks, datasets, and network architectures, we demonstrated that this phenomenon is prevalent. Leveraging it, we proposed a new early stopping criterion and developed a new method for data quality assessment. Our results show that a phenomenon-driven approach can benefit deep learning research in both theoretical understanding and practical applications.
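The abstract does not spell out how the output discrepancy is computed, but the setup it describes can be sketched in a few lines of PyTorch. In the minimal sketch below, two networks share architecture, data, and optimizer settings and differ only in their initialization seed; the disagreement rate on predicted training labels is one plausible instantiation of the discrepancy metric (the paper's precise definition may differ), and the names make_model, disagreement, and train_pair are hypothetical helpers, not the authors' code.

```python
# Sketch (not the authors' implementation): track the output discrepancy of
# two identically-configured networks over training. Assumes a classification
# task; the discrepancy metric here (label disagreement rate) is an assumption.
import torch
import torch.nn as nn

def make_model(seed: int) -> nn.Module:
    torch.manual_seed(seed)  # different seeds -> different initializations only
    return nn.Sequential(nn.Flatten(), nn.Linear(28 * 28, 256),
                         nn.ReLU(), nn.Linear(256, 10))

def disagreement(m1: nn.Module, m2: nn.Module, loader, device="cpu") -> float:
    """Fraction of training samples on which the two networks' predicted
    labels differ -- one possible 'output discrepancy on the training set'."""
    m1.eval(); m2.eval()
    mismatched, total = 0, 0
    with torch.no_grad():
        for x, _ in loader:
            x = x.to(device)
            p1 = m1(x).argmax(dim=1)
            p2 = m2(x).argmax(dim=1)
            mismatched += (p1 != p2).sum().item()
            total += x.size(0)
    return mismatched / total

def train_pair(train_loader, epochs=50, device="cpu"):
    # Same architecture, same data, same algorithm; only the seeds differ.
    nets = [make_model(seed).to(device) for seed in (0, 1)]
    opts = [torch.optim.SGD(n.parameters(), lr=0.1) for n in nets]
    loss_fn = nn.CrossEntropyLoss()
    history = []
    for _ in range(epochs):
        for net, opt in zip(nets, opts):
            net.train()
            for x, y in train_loader:
                x, y = x.to(device), y.to(device)
                opt.zero_grad()
                loss_fn(net(x), y).backward()
                opt.step()
        history.append(disagreement(nets[0], nets[1], train_loader, device))
    return history  # discrepancy per epoch
```

Plotting the returned history against epochs is where the double-descent curve would appear; per the abstract, an early stopping criterion could then be keyed to features of that curve, though the paper's specific rule is not given here.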
