An Analysis of the Expressiveness of Deep Neural Network Architectures Based on Their Lipschitz Constants

12/24/2019
by   Siqi Zhou, et al.

Deep neural networks (DNNs) have emerged as a popular mathematical tool for function approximation due to their capability of modelling highly nonlinear functions. Their applications range from image classification and natural language processing to learning-based control. Despite their empirical successes, there is still a lack of theoretical understanding of the representational power of such deep architectures. In this work, we provide a theoretical analysis of the expressiveness of fully-connected, feedforward DNNs with 1-Lipschitz activation functions. In particular, we characterize the expressiveness of a DNN by its Lipschitz constant. By leveraging random matrix theory, we show that, given sufficiently large and randomly distributed weights, the expected upper and lower bounds on the Lipschitz constant of a DNN, and hence its expressiveness, increase exponentially with depth and polynomially with width, which highlights the benefit of depth in DNN architectures for efficient function approximation. This observation is consistent with established results based on alternative expressiveness measures of DNNs. In contrast to most of the existing work, our analysis, based on the Lipschitz properties of DNNs, is applicable to a wider range of activation nonlinearities and potentially allows us to make sensible comparisons between the complexity of a DNN and that of the function it approximates. We consider this work to be a step towards understanding the expressive power of DNNs and towards designing appropriate deep architectures for practical applications such as system control.
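
As a rough illustration of the scaling described above, the following Python sketch computes the standard product-of-spectral-norms upper bound on the Lipschitz constant of a fully-connected network with 1-Lipschitz activations, using randomly drawn Gaussian weights. The widths, depths, and 1/sqrt(width) weight scaling are illustrative assumptions, not the exact setup analyzed in the paper.

```python
import numpy as np

def lipschitz_upper_bound(weights):
    # For a fully-connected network f(x) = W_L s(... s(W_1 x)) with
    # 1-Lipschitz activations s, the product of the spectral norms
    # (largest singular values) of the weight matrices is a standard
    # upper bound on the Lipschitz constant of f.
    bound = 1.0
    for W in weights:
        bound *= np.linalg.norm(W, 2)  # spectral norm of each layer
    return bound

rng = np.random.default_rng(seed=0)
width = 64  # illustrative layer width (assumption)
for depth in (2, 4, 8, 16):
    # i.i.d. Gaussian weights with a common 1/sqrt(width) scaling (assumption)
    weights = [rng.normal(0.0, 1.0 / np.sqrt(width), size=(width, width))
               for _ in range(depth)]
    print(f"depth={depth:2d}  upper bound ~ {lipschitz_upper_bound(weights):.2e}")
```

With this scaling, each layer's spectral norm concentrates around a constant larger than one, so the bound grows roughly exponentially in depth while varying only polynomially with width, matching the qualitative trend stated in the abstract.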

research

06/13/2022 · Analysis of function approximation and stability of general DNNs in directed acyclic graphs using un-rectifying analysis
A general lack of understanding pertaining to deep feedforward neural ne...

06/12/2019 · Efficient and Accurate Estimation of Lipschitz Constants for Deep Neural Networks
Tight estimation of the Lipschitz constant for deep neural networks (DNN...

01/08/2021 · On the Turnpike to Design of Deep Neural Nets: Explicit Depth Bounds
It is well-known that the training of Deep Neural Networks (DNN) can be ...

05/25/2023 · SING: A Plug-and-Play DNN Learning Technique
We propose SING (StabIlized and Normalized Gradient), a plug-and-play te...

10/01/2018 · Benchmark Analysis of Representative Deep Neural Network Architectures
This work presents an in-depth analysis of the majority of the deep neur...

03/08/2023 · Densely Connected G-invariant Deep Neural Networks with Signed Permutation Representations
We introduce and investigate, for finite groups G, G-invariant deep neur...

08/18/2023 · Noise Sensitivity and Stability of Deep Neural Networks for Binary Classification
A first step is taken towards understanding often observed non-robustnes...
