Polynomial Optimization for Bounding Lipschitz Constants of Deep Networks

02/10/2020
by   Tong Chen, et al.

The Lipschitz constant of a network plays an important role in many applications of deep learning, such as robustness certification and Wasserstein Generative Adversarial Networks. We introduce a semidefinite programming hierarchy to estimate the global and local Lipschitz constants of a multi-layer deep neural network. The novelty is to combine a polynomial lifting for the derivatives of ReLU functions with a weak generalization of Putinar's positivity certificate. This idea could also apply to other nearly sparse polynomial optimization problems in machine learning. We empirically demonstrate that our method not only runs faster than the state-of-the-art linear programming based method, but also provides sharper bounds.
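To illustrate the kind of formulation involved (a minimal sketch for a one-hidden-layer network; the notation f(x) = c^T ReLU(Wx), the lifting variables s_i, the choice of the l2 norm, and the coupling constraint below are illustrative assumptions, not the paper's exact hierarchy): wherever f is differentiable its gradient is W^T diag(s) c with s_i = ReLU'((Wx)_i) in {0,1}, so the local Lipschitz constant L_2(f, X) = sup_{x in X} ||grad f(x)||_2 is bounded by a polynomial optimization problem in (x, s), to which Putinar-type positivity certificates and the associated SDP hierarchy apply:

% Illustrative sketch (one hidden layer, l2 norm); the paper's actual lifting
% and positivity certificate for deep networks are more refined.
\[
  L_2(f, X)^2 \;\le\; \max_{x,\, s}\;
  \big\| W^{\top} \operatorname{diag}(s)\, c \big\|_2^2
  \quad \text{s.t.} \quad
  s_i(s_i - 1) = 0,\quad
  (2 s_i - 1)\,(W x)_i \ge 0,\quad
  x \in X .
\]
% Here s_i lifts the ReLU derivative \sigma'((Wx)_i) \in \{0,1\}; the second
% constraint ties s to the sign pattern of Wx on X. Dropping the constraints
% involving x yields a bound on the global Lipschitz constant. Each level of
% the SDP hierarchy returns a certified upper bound on this polynomial problem.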


