
Uniform Generalization Bounds for Overparameterized Neural Networks

09/13/2021
by Sattar Vakili, et al.

An interesting observation about artificial neural networks is their favorable generalization error despite typically being extremely overparameterized. It is well known that classical statistical learning methods often yield vacuous generalization bounds in the case of overparameterized neural networks. Adopting the recently developed Neural Tangent (NT) kernel theory, we prove uniform generalization bounds for overparameterized neural networks in kernel regimes, when the true data-generating model belongs to the reproducing kernel Hilbert space (RKHS) corresponding to the NT kernel. Importantly, our bounds capture the exact error rates depending on the differentiability of the activation functions. To establish these bounds, we propose the information gain of the NT kernel as a measure of the complexity of the learning problem. Our analysis uses a Mercer decomposition of the NT kernel in the basis of spherical harmonics and the decay rate of the corresponding eigenvalues. As a byproduct of our results, we show the equivalence between the RKHS corresponding to the NT kernel and its counterpart corresponding to the Matérn family of kernels, which induces a very general class of models. We further discuss the implications of our analysis for some recent results on regret bounds for reinforcement learning algorithms that use overparameterized neural networks.
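As a rough illustration (not taken from the paper), the sketch below evaluates the widely used closed-form NT kernel of a two-layer ReLU network on unit-sphere inputs and computes the information gain gamma = 0.5 * log det(I + K / sigma^2), the quantity the abstract proposes as a complexity measure. The function names, the data dimension, the sample size, and the noise level are illustrative assumptions; the closed-form expressions are the standard ones for this simple architecture, not anything specific to this paper.

    import numpy as np

    def relu_ntk(X):
        # Standard closed-form NTK of a two-layer ReLU network for
        # unit-norm inputs: with u = <x, x'> and theta = arccos(u),
        #   kappa0(u) = (pi - theta) / pi
        #   kappa1(u) = (u * (pi - theta) + sqrt(1 - u^2)) / pi
        #   NTK(x, x') = u * kappa0(u) + kappa1(u)
        U = np.clip(X @ X.T, -1.0, 1.0)   # Gram matrix of inner products
        theta = np.arccos(U)
        kappa0 = (np.pi - theta) / np.pi
        kappa1 = (U * (np.pi - theta) + np.sqrt(1.0 - U ** 2)) / np.pi
        return U * kappa0 + kappa1

    def information_gain(K, noise_var):
        # gamma = 0.5 * log det(I + K / noise_var): the log-determinant
        # complexity measure used in kernel-based learning analyses.
        n = K.shape[0]
        _, logdet = np.linalg.slogdet(np.eye(n) + K / noise_var)
        return 0.5 * logdet

    rng = np.random.default_rng(0)
    d, n = 5, 200
    X = rng.standard_normal((n, d))
    X /= np.linalg.norm(X, axis=1, keepdims=True)  # project onto the sphere

    K = relu_ntk(X)
    print("information gain:", information_gain(K, noise_var=0.1))

Re-running with larger n gives a numerical feel for how slowly the information gain grows when the kernel's eigenvalues decay polynomially, the spectral property that underlies the abstract's connection between the NT kernel and the Matérn family.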

Related research
09/20/2021

Understanding neural networks with reproducing kernel Banach spaces

Characterizing the function spaces corresponding to neural networks can ...
09/22/2021

Robust Generalization of Quadratic Neural Networks via Function Identification

A key challenge facing deep learning is that neural networks are often n...
07/02/2020

A Revision of Neural Tangent Kernel-based Approaches for Neural Networks

Recent theoretical works based on the neural tangent kernel (NTK) have s...
05/15/2022

Generalization Bounds on Multi-Kernel Learning with Mixed Datasets

This paper presents novel generalization bounds for the multi-kernel lea...
06/09/2017

Group Invariance, Stability to Deformations, and Complexity of Deep Convolutional Representations

In this paper, we study deep signal representations that are invariant t...
04/07/2021

Spectral Analysis of the Neural Tangent Kernel for Deep Residual Networks

Deep residual network architectures have been shown to achieve superior ...
09/17/2020

A Principle of Least Action for the Training of Neural Networks

Neural networks have been achieving high generalization performance on m...