On Rademacher Complexity-based Generalization Bounds for Deep Learning

08/08/2022
by Lan V. Truong, et al.

In this paper, we develop novel bounds on the Rademacher complexity and the generalization error in deep learning with i.i.d. and Markov datasets. The new bounds are tight up to O(1/√(n)), where n is the size of the training set, and for some neural network architectures they decay exponentially in the depth L. A key technical contribution of this work is the development of Talagrand-type contraction lemmas for high-dimensional mappings between function spaces and for deep neural networks with general activation functions.
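As background for the quantity the abstract bounds: the empirical Rademacher complexity of a function class F on a sample S = (x_1, …, x_n) is E_σ[sup_{f∈F} (1/n) Σ_i σ_i f(x_i)], where the σ_i are independent ±1 signs. The sketch below (not from the paper; a standard textbook illustration) estimates it by Monte Carlo for the class of linear functions with unit-norm weights, where the supremum has a closed form, and shows the roughly O(1/√(n)) decay the abstract refers to:

```python
import numpy as np

def empirical_rademacher_linear(X, n_draws=2000, seed=None):
    """Monte Carlo estimate of the empirical Rademacher complexity of
    {x -> <w, x> : ||w||_2 <= 1} on the sample X of shape (n, d).

    For this class the supremum over f is available in closed form:
        sup_{||w||<=1} (1/n) sum_i sigma_i <w, x_i>
            = (1/n) || sum_i sigma_i x_i ||_2
    so we only need to average that norm over random sign vectors.
    """
    rng = np.random.default_rng(seed)
    n = X.shape[0]
    # n_draws independent Rademacher sign vectors, one per row
    sigma = rng.choice([-1.0, 1.0], size=(n_draws, n))
    signed_sums = sigma @ X                    # shape (n_draws, d)
    return np.linalg.norm(signed_sums, axis=1).mean() / n

rng = np.random.default_rng(0)
X_small = rng.normal(size=(100, 5))    # n = 100
X_large = rng.normal(size=(1600, 5))   # n = 1600 (16x more data)
r_small = empirical_rademacher_linear(X_small, seed=1)
r_large = empirical_rademacher_linear(X_large, seed=1)
print(r_small, r_large)  # the estimate shrinks roughly like 1/sqrt(n)
```

With 16 times more samples the estimate drops by roughly a factor of 4, consistent with the 1/√(n) rate; the paper's contribution is bounds of this flavor for deep networks, where no such closed form for the supremum exists.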


Related research

- PAC-Bayes Compression Bounds So Tight That They Can Explain Generalization (11/24/2022)
- Generalization Error Bounds on Deep Learning with Markov Datasets (12/23/2021)
- Fast generalization error bound of deep learning without scale invariance of activation functions (07/25/2019)
- A New Training Method for Feedforward Neural Networks Based on Geometric Contraction Property of Activation Functions (06/20/2016)
- Data-dependent Sample Complexity of Deep Neural Networks via Lipschitz Augmentation (05/09/2019)
- Predicting Generalization in Deep Learning via Local Measures of Distortion (12/13/2020)
- Benign Overfitting in Deep Neural Networks under Lazy Training (05/30/2023)
