CLIP: Cheap Lipschitz Training of Neural Networks

by   Leon Bungert, et al.

Despite the large success of deep neural networks (DNNs) in recent years, most neural networks still lack mathematical guarantees in terms of stability. For instance, DNNs are vulnerable to small or even imperceptible input perturbations, so-called adversarial examples, that can cause false predictions. This instability can have severe consequences in applications that influence the health and safety of humans, e.g., biomedical imaging or autonomous driving. While bounding the Lipschitz constant of a neural network improves stability, most methods rely on restricting the Lipschitz constant of each layer, which yields only a loose bound on the network's actual Lipschitz constant. In this paper we investigate a variational regularization method named CLIP for controlling the Lipschitz constant of a neural network, which can easily be integrated into the training procedure. We mathematically analyze the proposed model, in particular discussing the impact of the chosen regularization parameter on the output of the network. Finally, we numerically evaluate our method on both a nonlinear regression problem and the MNIST and Fashion-MNIST classification datasets, and compare our results with a weight regularization approach.
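The abstract's key observation, that the product of per-layer Lipschitz constants can badly overestimate a network's true Lipschitz constant, is easy to see numerically. The sketch below (a minimal NumPy illustration, not the authors' implementation; the toy network and sampling scheme are my own assumptions) compares the layer-wise spectral-norm bound of a random two-layer ReLU network against an empirical lower bound obtained from difference quotients on sampled input pairs, and then forms a CLIP-style regularized loss by adding that empirical Lipschitz estimate as a penalty term.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy two-layer ReLU network f(x) = W2 @ relu(W1 @ x) with random weights.
W1 = rng.standard_normal((64, 16))
W2 = rng.standard_normal((1, 64))

def f(x):
    return W2 @ np.maximum(W1 @ x, 0.0)

# Layer-wise upper bound: product of the layers' spectral norms
# (ReLU is 1-Lipschitz, so it contributes a factor of 1).
layer_bound = np.linalg.norm(W1, 2) * np.linalg.norm(W2, 2)

# Empirical lower bound: largest difference quotient over random
# nearby input pairs (x, y).
xs = rng.standard_normal((1000, 16))
ys = xs + 1e-3 * rng.standard_normal((1000, 16))
empirical_lip = max(
    np.linalg.norm(f(x) - f(y)) / np.linalg.norm(x - y)
    for x, y in zip(xs, ys)
)

print(f"layer-wise bound:   {layer_bound:.2f}")
print(f"empirical quotient: {empirical_lip:.2f}")

# CLIP-style regularized objective (schematic): data loss plus a
# penalty on the estimated Lipschitz constant, weighted by lam.
# Here a dummy mean-squared data loss stands in for the task loss.
lam = 0.1
data_loss = np.mean([np.sum(f(x) ** 2) for x in xs])
regularized_loss = data_loss + lam * empirical_lip
```

On typical random weights the empirical quotient is far below the layer-wise product, which is the gap that motivates regularizing the Lipschitz constant directly rather than layer by layer.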




Fast Accurate Method for Bounding the Singular Values of Convolutional Layers with Application to Lipschitz Regularization

This paper tackles the problem of Lipschitz regularization of Convolutio...

Limitations of the Lipschitz constant as a defense against adversarial examples

Several recent papers have discussed utilizing Lipschitz constants to li...

Efficient and Accurate Estimation of Lipschitz Constants for Deep Neural Networks

Tight estimation of the Lipschitz constant for deep neural networks (DNN...

Deep Neural Networks with Trainable Activations and Controlled Lipschitz Constant

We introduce a variational framework to learn the activation functions o...

Lipschitz regularity of deep neural networks: analysis and efficient estimation

Deep neural networks are notorious for being sensitive to small well-cho...

The coupling effect of Lipschitz regularization in deep neural networks

We investigate robustness of deep feed-forward neural networks when inpu...

Lipschitz Bound Analysis of Neural Networks

Lipschitz Bound Estimation is an effective method of regularizing deep n...

Code Repositories


Implementation of the CLIP Algorithm
