Uniform Convergence of Deep Neural Networks with Lipschitz Continuous Activation Functions and Variable Widths

06/02/2023
by Yuesheng Xu, et al.

We consider deep neural networks with a Lipschitz continuous activation function and with weight matrices of variable widths. We establish a uniform convergence analysis framework in which sufficient conditions on the weight matrices and bias vectors, together with the Lipschitz constant, are provided to ensure uniform convergence of the deep neural networks to a meaningful function as the number of their layers tends to infinity. Within the framework, special results on uniform convergence of deep neural networks with a fixed width, bounded widths, and unbounded widths are presented. In particular, since convolutional neural networks are special deep neural networks with weight matrices of increasing widths, we put forward conditions on the mask sequence which lead to uniform convergence of the resulting convolutional neural networks. The Lipschitz continuity assumption on the activation functions allows us to include in our theory most of the activation functions commonly used in applications.
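The fixed-width special case mentioned above can be illustrated numerically. The sketch below is a toy construction under assumptions chosen for illustration only (ReLU as the Lipschitz activation with constant 1, weight matrices rescaled to spectral norm 1/2 so each layer map is contractive, and bias vectors with geometrically decaying norms); it is not the paper's actual sufficient conditions, but it exhibits the qualitative behavior the abstract describes: successive depth truncations form a Cauchy sequence in the sup norm, suggesting uniform convergence as the depth grows.

```python
import numpy as np

rng = np.random.default_rng(0)
WIDTH, DEPTH = 8, 30  # fixed width; finite stand-in for an infinite layer sequence

def relu(z):
    # ReLU is Lipschitz continuous with constant 1
    return np.maximum(z, 0.0)

# Pre-generate all layer parameters once, so that deeper truncations reuse
# the shallower layers: the depth-n network is a partial composition of one
# fixed sequence of layers, as in the abstract's limiting construction.
layers = []
for n in range(DEPTH):
    W = rng.standard_normal((WIDTH, WIDTH))
    W *= 0.5 / np.linalg.norm(W, 2)        # rescale to spectral norm 1/2
    b = rng.standard_normal(WIDTH) / 2**n  # bias norms decay geometrically
    layers.append((W, b))

def net(x, depth):
    """Evaluate the first `depth` layers on input vector x."""
    h = x
    for W, b in layers[:depth]:
        h = relu(W @ h + b)
    return h

# Sup-norm gap between consecutive depth truncations on sample inputs;
# under these assumptions the gaps shrink roughly geometrically with depth.
xs = [rng.standard_normal(WIDTH) for _ in range(20)]
gaps = [max(np.max(np.abs(net(x, d + 1) - net(x, d))) for x in xs)
        for d in range(5, 15)]
```

Here the contractive layers and summable bias norms play the role of the abstract's sufficient conditions on weight matrices and bias vectors; relaxing either (e.g. spectral norms above 1 with unit Lipschitz constant) breaks the Cauchy behavior.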


