Estimating Full Lipschitz Constants of Deep Neural Networks

04/27/2020
by Calypso Herrera, et al.

We estimate the Lipschitz constants of the gradient of a deep neural network, and of the network itself, with respect to the full set of parameters. We first develop estimates for a deep, densely connected feed-forward network and then, in a more general framework, for all neural networks that can be represented as solutions of controlled ordinary differential equations, where time appears as continuous depth. These estimates can be used to set the step size of stochastic gradient descent methods, which we illustrate for one example method.
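To make the step-size idea concrete, here is a minimal, hypothetical sketch of how such an estimate could be used. The function names (rough_lipschitz_bound, step_size_from_lipschitz) and the crude product-of-operator-norms formula are illustrative assumptions, not the paper's actual estimates: the paper derives bounds with respect to the full parameter set, whereas the stand-in below is the classical input-space bound. The 1/L step-size rule is the standard heuristic for functions with an L-Lipschitz gradient.

```python
import numpy as np

# Hypothetical stand-in for a Lipschitz estimate of a dense feed-forward net.
# This crude product-of-spectral-norms bound is NOT the paper's estimate; it
# only illustrates how a numerical bound could be obtained from the weights.
def rough_lipschitz_bound(weights, input_bound=1.0, activation_lip=1.0):
    bound = input_bound
    for W in weights:
        bound *= activation_lip * np.linalg.norm(W, ord=2)  # spectral norm
    return bound

# Classical smoothness heuristic: if the gradient of the loss is Lipschitz
# with constant L, a step size of about 1/L guarantees a descent step.
def step_size_from_lipschitz(L, safety=1.0):
    return safety / max(L, 1e-12)

# Usage with random weights standing in for a trained two-layer network.
rng = np.random.default_rng(0)
weights = [rng.standard_normal((64, 32)), rng.standard_normal((32, 1))]
L_est = rough_lipschitz_bound(weights)
eta = step_size_from_lipschitz(L_est)
print(f"estimated Lipschitz bound: {L_est:.3g}, suggested step size: {eta:.3g}")
```

In practice one would replace the stand-in bound with the paper's parameter-dependent estimate and recompute the step size as training progresses.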
