Predicting Generalization in Deep Learning via Local Measures of Distortion

12/13/2020
by   Abhejit Rajagopal, et al.

We study generalization in deep learning by appealing to complexity measures originally developed in approximation and information theory. While these concepts are challenged by the high-dimensional and data-defined nature of deep learning, we show that simple vector quantization approaches such as PCA, GMMs, and SVMs capture their spirit when applied layer-wise to deep extracted features, giving rise to relatively inexpensive complexity measures that correlate well with generalization performance. We discuss our results in the context of the 2020 NeurIPS Predicting Generalization in Deep Learning (PGDL) challenge.
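To make the idea concrete, here is a minimal sketch (not the authors' implementation) of one such layer-wise distortion measure: the mean squared error of reconstructing a layer's features from their top-k principal components. The function name and the synthetic feature matrices are illustrative assumptions; the point is that features concentrated in a low-dimensional subspace incur low distortion at a fixed quantization budget.

```python
import numpy as np

def pca_distortion(features, k):
    """Mean squared reconstruction error when projecting a layer's
    features onto their top-k principal components -- a cheap,
    vector-quantization-style distortion measure (illustrative only)."""
    X = features - features.mean(axis=0)
    # SVD yields principal directions; Vt[:k] spans the top-k subspace.
    _, _, Vt = np.linalg.svd(X, full_matrices=False)
    X_hat = X @ Vt[:k].T @ Vt[:k]
    return float(np.mean((X - X_hat) ** 2))

# Toy features standing in for two hypothetical network layers.
rng = np.random.default_rng(0)
low_rank = rng.normal(size=(500, 4)) @ rng.normal(size=(4, 64))  # ~4-dim
full_rank = rng.normal(size=(500, 64))                           # isotropic

# A layer whose features concentrate in few directions distorts less.
assert pca_distortion(low_rank, 8) < pca_distortion(full_rank, 8)
```

In the spirit of the paper, such per-layer distortion scores could be aggregated across layers and compared across trained models as an inexpensive proxy for generalization.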


