The Many Faces of 1-Lipschitz Neural Networks

04/11/2021
by Louis Béthune, et al.

Lipschitz-constrained models have been used to solve specific deep learning problems, such as estimating the Wasserstein distance in GANs or training neural networks that are robust to adversarial attacks. Despite the novel and effective algorithms available to build such 1-Lipschitz networks, their usage remains marginal, and they are commonly considered less expressive and less able to fit the data properly than their unconstrained counterparts. The goal of the paper is to demonstrate that, despite being empirically harder to train, 1-Lipschitz neural networks are theoretically better grounded than unconstrained ones when it comes to classification. To achieve this, we recall some results about 1-Lipschitz functions in the scope of deep learning, and we extend and illustrate them to derive general properties for classification. First, we show that 1-Lipschitz neural networks can fit arbitrarily difficult decision frontiers, making them as expressive as classical ones. When minimizing the log loss, we prove that the optimization problem under a Lipschitz constraint is well posed and has a minimum, whereas regular neural networks can diverge even in remarkably simple situations. Then, we study the link between classification with 1-Lipschitz networks and optimal transport, thanks to regularized versions of the Kantorovich-Rubinstein duality. Last, we derive preliminary bounds on their VC dimension.
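
As a concrete illustration (not taken from the paper), below is a minimal NumPy sketch of one standard way to build a 1-Lipschitz network: divide each weight matrix by its spectral norm (estimated by power iteration) and use the GroupSort activation, as in the "Sorting out Lipschitz function approximation" line of work listed below. All function names, shapes, and hyperparameters here are illustrative assumptions, not the authors' code.

```python
import numpy as np

def spectral_norm(W, n_iter=100):
    """Estimate the largest singular value of W by power iteration."""
    rng = np.random.default_rng(0)
    u = rng.standard_normal(W.shape[0])
    for _ in range(n_iter):
        v = W.T @ u
        v /= np.linalg.norm(v) + 1e-12
        u = W @ v
        u /= np.linalg.norm(u) + 1e-12
    return float(u @ W @ v)

def groupsort(z, group_size=2):
    """GroupSort activation: sort coordinates inside each group of size `group_size`.
    It is 1-Lipschitz (it only permutes coordinates) and gradient-norm preserving,
    which is why it is favoured over ReLU in Lipschitz-constrained networks."""
    return np.sort(z.reshape(-1, group_size), axis=1).reshape(-1)

# Hypothetical two-layer 1-Lipschitz network f : R^8 -> R.
# Dividing each weight matrix by its spectral norm makes every affine layer
# 1-Lipschitz for the L2 norm; composing 1-Lipschitz maps stays 1-Lipschitz.
rng = np.random.default_rng(42)
W1, b1 = rng.standard_normal((16, 8)), np.zeros(16)
W2, b2 = rng.standard_normal((1, 16)), np.zeros(1)
W1 /= spectral_norm(W1)
W2 /= spectral_norm(W2)

def f(x):
    return W2 @ groupsort(W1 @ x + b1) + b2

# Empirical sanity check of the Lipschitz bound |f(x) - f(y)| <= ||x - y||.
x, y = rng.standard_normal(8), rng.standard_normal(8)
assert abs((f(x) - f(y))[0]) <= np.linalg.norm(x - y) + 1e-6
```

Rescaling by the spectral norm only upper-bounds the Lipschitz constant of each layer; more sophisticated constructions (e.g. orthogonality constraints) keep the constant exactly 1, which the paper relies on when connecting classification to optimal transport.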

Related research

10/28/2022 · Improving Lipschitz-Constrained Neural Networks by Learning Activation Functions
Lipschitz-constrained neural networks have several advantages compared t...

06/11/2020 · Achieving robustness in classification using optimal transport with hinge regularization
We propose a new framework for robust binary classification, with Deep N...

06/09/2020 · Approximating Lipschitz continuous functions with GroupSort neural networks
Recent advances in adversarial attacks and Wasserstein GANs have advocat...

09/07/2020 · System Identification Through Lipschitz Regularized Deep Neural Networks
In this paper we use neural networks to learn governing equations from d...

10/05/2022 · Dynamical systems' based neural networks
Neural networks have gained much interest because of their effectiveness...

11/13/2018 · Sorting out Lipschitz function approximation
Training neural networks subject to a Lipschitz constraint is useful for...

05/21/2018 · Measuring and regularizing networks in function space
Neural network optimization is often conceptualized as optimizing parame...
