Regularisation of Neural Networks by Enforcing Lipschitz Continuity

04/12/2018
by   Henry Gouk, et al.

We investigate the effect of explicitly enforcing the Lipschitz continuity of neural networks. Our main hypothesis is that constraining the Lipschitz constant of a network will have a regularising effect. To this end, we provide a simple technique for computing the Lipschitz constant of a feed-forward neural network composed of commonly used layer types. This technique is then utilised to formulate the training of a Lipschitz continuous neural network as a constrained optimisation problem, which can be solved using projected stochastic gradient methods. Our evaluation study shows that, in isolation, our method performs comparably to state-of-the-art regularisation techniques. Moreover, when combined with existing approaches to regularising neural networks, the performance gains are cumulative.
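The approach described in the abstract can be sketched in a few lines: for fully connected layers with 1-Lipschitz activations (e.g. ReLU), the product of the per-layer operator norms upper-bounds the network's Lipschitz constant, and the constraint can be enforced by a projection after each gradient step. This is a minimal NumPy illustration under those assumptions; the power-iteration helper and the constraint parameter `lam` are illustrative choices, not the authors' exact code.

```python
import numpy as np

def spectral_norm(W, n_iters=50):
    # Power iteration: estimate the largest singular value (the
    # operator norm of W with respect to the Euclidean norm).
    v = np.random.default_rng(0).normal(size=W.shape[1])
    u = W @ v
    for _ in range(n_iters):
        u = W @ v
        u /= np.linalg.norm(u)
        v = W.T @ u
        v /= np.linalg.norm(v)
    return float(u @ W @ v)

def lipschitz_upper_bound(weights):
    # With 1-Lipschitz activations, composing the layers multiplies
    # the per-layer bounds, giving an upper bound for the whole network.
    return float(np.prod([spectral_norm(W) for W in weights]))

def project(W, lam):
    # Projection step applied after each stochastic gradient update:
    # rescale W so its spectral norm does not exceed lam.
    s = spectral_norm(W)
    return W if s <= lam else W * (lam / s)
```

For example, a two-layer network with weight matrices of spectral norm 2 and 3 gets the bound 6; projecting the first matrix with `lam=1.0` rescales it to spectral norm 1.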


Related research

09/29/2020
Lipschitz neural networks are dense in the set of all Lipschitz functions
This note shows that, for a fixed Lipschitz constant L > 0, one layer ne...

05/07/2020
Lifted Regression/Reconstruction Networks
In this work we propose lifted regression/reconstruction networks (LRRNs...

04/16/2018
MaxGain: Regularisation of Neural Networks by Constraining Activation Magnitudes
Effective regularisation of neural networks is essential to combat overf...

04/12/2019
The coupling effect of Lipschitz regularization in deep neural networks
We investigate robustness of deep feed-forward neural networks when inpu...

02/21/2023
Some Fundamental Aspects about Lipschitz Continuity of Neural Network Functions
Lipschitz continuity is a simple yet pivotal functional property of any ...

04/27/2020
Estimating Full Lipschitz Constants of Deep Neural Networks
We estimate the Lipschitz constants of the gradient of a deep neural net...

03/08/2021
Lipschitz Normalization for Self-Attention Layers with Application to Graph Neural Networks
Attention based neural networks are state of the art in a large range of...
