Neural Complexity Measures

08/07/2020 ∙ by Yoonho Lee, et al.

While various complexity measures for diverse model classes have been proposed, specifying an appropriate measure capable of predicting and explaining generalization in deep networks has proven to be challenging. We propose Neural Complexity (NC), an alternative data-driven approach that meta-learns a scalar complexity measure through interactions with a large number of heterogeneous tasks. The trained NC model can be added to the standard training loss to regularize any task learner under standard learning frameworks. We contrast NC's approach against existing manually-designed complexity measures and also against other meta-learning models, and validate NC's performance on multiple regression and classification tasks.
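The core idea — adding a trained complexity model's scalar output to the ordinary task loss — can be illustrated with a minimal sketch. This is not the paper's implementation: the NC network here is a stand-in fixed-weight MLP, and the summary features (`nc_penalty`'s weight statistics), the penalty weight `lam`, and all function names are hypothetical choices for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical "trained" NC model: a tiny fixed-weight MLP standing in
# for the meta-learned complexity measure described in the abstract.
W1, b1 = rng.normal(size=(3, 8)), np.zeros(8)
W2, b2 = rng.normal(size=(8, 1)), np.zeros(1)

def nc_penalty(weights: np.ndarray) -> float:
    """Map simple summary statistics of a task learner's weights
    to a non-negative scalar complexity estimate."""
    feats = np.array([np.mean(weights ** 2),
                      np.max(np.abs(weights)),
                      np.linalg.norm(weights)])
    h = np.tanh(feats @ W1 + b1)
    return float(np.abs(h @ W2 + b2))

def regularized_loss(w, X, y, lam=0.1):
    """Standard squared-error task loss plus the NC penalty,
    mirroring 'added to the standard training loss'."""
    residual = X @ w - y
    task_loss = float(np.mean(residual ** 2))
    return task_loss + lam * nc_penalty(w)

# Toy linear-regression task to show the combined objective.
X = rng.normal(size=(20, 3))
w_true = np.array([1.0, -2.0, 0.5])
y = X @ w_true
w = np.zeros(3)
loss = regularized_loss(w, X, y)
```

Because the penalty is just an additive scalar term, any gradient-based learner can minimize `regularized_loss` in place of its usual loss, which is what makes the approach applicable "under standard learning frameworks."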






Code Repositories


Official repository, Neural Complexity Measures (NeurIPS 2020)
