Bayes Complexity of Learners vs Overfitting

03/13/2023
by Grzegorz Głuch, et al.

We introduce a new notion of complexity of functions and show that it has the following properties: (i) it governs a PAC-Bayes-like generalization bound, (ii) for neural networks it relates to natural notions of complexity of functions (such as the variation), and (iii) it explains the generalization gap between neural networks and linear schemes. While there is a large body of work describing bounds that have each of these properties in isolation, and even some that have two of them, to the best of our knowledge this is the first notion that satisfies all three. Moreover, in contrast to previous works, our notion naturally generalizes to neural networks with several layers. Even though computing our complexity is nontrivial in general, an upper bound is often easy to derive, even for deeper networks and for structured functions such as periodic functions. The upper bound we derive allows us to show a separation, in the number of samples needed for good generalization, between 2-layer and 4-layer neural networks on periodic functions.
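For context, the "PAC-Bayes-like" bound in (i) follows the shape of the classical PAC-Bayes theorem. One common relaxed form (McAllester's bound) states: for any prior $P$ over hypotheses and any $\delta \in (0,1)$, with probability at least $1-\delta$ over an i.i.d. sample of size $m$, every posterior $Q$ satisfies

\[
\mathbb{E}_{h \sim Q}\big[L(h)\big] \;\le\; \mathbb{E}_{h \sim Q}\big[\hat{L}(h)\big] + \sqrt{\frac{\mathrm{KL}(Q \,\|\, P) + \ln\frac{2\sqrt{m}}{\delta}}{2m}},
\]

where $L$ and $\hat{L}$ denote the population and empirical losses and $\mathrm{KL}$ is the Kullback-Leibler divergence. The abstract suggests the authors' complexity notion controls the divergence-style term in a bound of this shape; the precise statement is in the full text.

Similarly, the claimed 2- versus 4-layer separation on periodic functions can be probed empirically. The following is a minimal, hypothetical PyTorch sketch, not the paper's experiment: the target frequency, widths, sample size, and training schedule are illustrative assumptions. It compares the generalization gap of 2-layer and 4-layer MLPs trained on a high-frequency sine:

import torch
import torch.nn as nn

torch.manual_seed(0)

def mlp(depth, width=64):
    # depth = number of weight layers (2 -> one hidden layer, 4 -> three).
    layers, d_in = [], 1
    for _ in range(depth - 1):
        layers += [nn.Linear(d_in, width), nn.ReLU()]
        d_in = width
    layers += [nn.Linear(d_in, 1)]
    return nn.Sequential(*layers)

def gap(model, n_train, steps=3000, lr=1e-3):
    # Train on n_train samples of a high-frequency sine; report the
    # generalization gap (test MSE minus train MSE) on fresh points.
    target = lambda x: torch.sin(8 * torch.pi * x)
    x_tr = torch.rand(n_train, 1); y_tr = target(x_tr)
    x_te = torch.rand(2000, 1);    y_te = target(x_te)
    opt = torch.optim.Adam(model.parameters(), lr=lr)
    for _ in range(steps):
        opt.zero_grad()
        nn.functional.mse_loss(model(x_tr), y_tr).backward()
        opt.step()
    with torch.no_grad():
        tr = nn.functional.mse_loss(model(x_tr), y_tr)
        te = nn.functional.mse_loss(model(x_te), y_te)
        return (te - tr).item()

for depth in (2, 4):
    print(depth, "layers, generalization gap:", gap(mlp(depth), n_train=200))

Note this sketch only measures an empirical gap at one sample size; the paper's result concerns how the number of samples needed for good generalization scales, so a faithful check would sweep n_train for each depth.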
