Lipschitz regularity of deep neural networks: analysis and efficient estimation

05/28/2018 · by Kevin Scaman, et al.

Deep neural networks are notorious for being sensitive to small, well-chosen perturbations, and estimating the regularity of such architectures is of utmost importance for safe and robust practical applications. In this paper, we investigate one of the key characteristics for assessing the regularity of such methods: the Lipschitz constant of deep learning architectures. First, we show that, even for two-layer neural networks, the exact computation of this quantity is NP-hard, and that state-of-the-art methods may significantly overestimate it. Second, we extend and improve previous estimation methods by providing AutoLip, the first generic algorithm for upper bounding the Lipschitz constant of any automatically differentiable function; we also provide a power method that works with automatic differentiation, enabling efficient computation even for large convolutional layers. Third, for sequential neural networks, we propose an improved algorithm named SeqLip that takes advantage of the linear computation graph to split the computation over pairs of consecutive layers. Finally, we propose heuristics on top of SeqLip to tackle very large networks. Our experiments show that SeqLip can significantly improve on existing upper bounds.
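Two of the ingredients mentioned in the abstract are easy to prototype: the power method that recovers the adjoint map A^T from a backward pass (so the operator norm of a convolution can be estimated without ever materializing its matrix), and the AutoLip-style product upper bound for a sequential network. The sketch below is ours, not the authors' reference implementation; the function names, layer choices, shapes, and iteration count are illustrative assumptions, and SeqLip itself is not implemented here.

```python
# Minimal sketch (assumed details, not the paper's reference code):
# (i) spectral norm of a linear map via power iteration, with the adjoint
#     obtained by automatic differentiation;
# (ii) the resulting product upper bound on the Lipschitz constant of a
#     sequential network with 1-Lipschitz activations.
import torch
import torch.nn as nn
import torch.nn.functional as F

def operator_norm(linear_op, input_shape, n_iter=50):
    """Estimate sigma_max of a linear map A via power iteration on A^T A.
    Since the map is linear, a backward pass computes A^T for free."""
    x = torch.randn(input_shape, requires_grad=True)
    for _ in range(n_iter):
        y = linear_op(x)                                 # A x
        g, = torch.autograd.grad(y, x, grad_outputs=y)   # A^T (A x)
        x = (g / g.norm()).requires_grad_(True)          # renormalize iterate
    return linear_op(x).norm().item()                    # ~ sigma_max(A)

def autolip_upper_bound(model, input_shape):
    """AutoLip-style product bound for an nn.Sequential: multiply the
    operator norms of the affine layers (biases do not affect Lipschitz
    constants); ReLU and Flatten are 1-Lipschitz, contributing a factor 1.
    This product bound is generally loose; the paper's SeqLip tightens it."""
    bound = 1.0
    x = torch.zeros(input_shape)            # dummy input to track shapes
    for layer in model:
        shape = tuple(x.shape)
        if isinstance(layer, nn.Conv2d):
            bound *= operator_norm(
                lambda z: F.conv2d(z, layer.weight,
                                   stride=layer.stride, padding=layer.padding),
                shape)
        elif isinstance(layer, nn.Linear):
            bound *= operator_norm(lambda z: F.linear(z, layer.weight), shape)
        x = layer(x)
    return bound

# Example usage on an assumed toy network:
net = nn.Sequential(nn.Conv2d(3, 16, 3), nn.ReLU(),
                    nn.Flatten(), nn.Linear(16 * 30 * 30, 10))
print(autolip_upper_bound(net, (1, 3, 32, 32)))
```

Computing the adjoint through autograd is what makes the approach practical for convolutions: a 2D convolution on a 32x32 image already corresponds to a matrix far too large to form explicitly, while each power iteration here costs only one forward and one backward pass.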


Related research

06/15/2020 · Fast Accurate Method for Bounding the Singular Values of Convolutional Layers with Application to Lipschitz Regularization
This paper tackles the problem of Lipschitz regularization of Convolutio...

04/18/2020 · Lipschitz constant estimation of Neural Networks via sparse polynomial optimization
We introduce LiPopt, a polynomial optimization framework for computing i...

07/06/2021 · Provable Lipschitz Certification for Generative Models
We present a scalable technique for upper bounding the Lipschitz constan...

10/13/2022 · Efficiently Computing Local Lipschitz Constants of Neural Networks via Bound Propagation
Lipschitz constants are connected to many properties of neural networks,...

03/23/2021 · CLIP: Cheap Lipschitz Training of Neural Networks
Despite the large success of deep neural networks (DNN) in recent years,...

05/25/2023 · DP-SGD Without Clipping: The Lipschitz Neural Network Way
State-of-the-art approaches for training Differentially Private (DP) Dee...

02/10/2021 · On the Regularity of Attention
Attention is a powerful component of modern neural networks across a wid...
