LipBaB: Computing exact Lipschitz constant of ReLU networks

05/12/2021, by Aritra Bhowmick, et al.

The Lipschitz constant of neural networks plays an important role in several contexts of deep learning, ranging from robustness certification and regularization to stability analysis of systems with neural network controllers. Obtaining tight bounds on the Lipschitz constant is therefore important. We introduce LipBaB, a branch and bound framework to compute certified bounds on the local Lipschitz constant of deep neural networks with ReLU activation functions, up to any desired precision. We achieve this by bounding the norms of the Jacobians corresponding to the different activation patterns the network can exhibit within the input domain. Our algorithm can provide provably exact computation of the Lipschitz constant for any p-norm.
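The key observation behind the approach is that a ReLU network is piecewise linear: once the activation pattern (which ReLUs are active) is fixed, the Jacobian is a product of weight matrices interleaved with 0/1 diagonal matrices, and its induced p-norm bounds the local slope on that linear piece. The sketch below is only an illustration of this fact, not the LipBaB algorithm itself; the network, its weights, and the helper name `jacobian_for_pattern` are made up for the example. The exact local Lipschitz constant is the maximum of such norms over all activation patterns realizable in the input region, which is what the branch and bound search explores.

```python
import numpy as np

def jacobian_for_pattern(weights, x):
    """Jacobian of a ReLU net at input x: the product
    W_L @ D_{L-1} @ W_{L-1} @ ... @ D_1 @ W_1, where each D_i is the
    0/1 diagonal matrix of the layer's activation pattern at x
    (biases omitted for brevity)."""
    J = np.eye(len(x))
    h = x
    for W in weights[:-1]:
        h = W @ h
        D = np.diag((h > 0).astype(float))  # fixed activation pattern at x
        J = D @ W @ J
        h = np.maximum(h, 0.0)
    return weights[-1] @ J

# Toy 2-3-1 network with random weights (illustration only).
rng = np.random.default_rng(0)
weights = [rng.standard_normal((3, 2)), rng.standard_normal((1, 3))]
x = np.array([0.5, -0.2])
J = jacobian_for_pattern(weights, x)

# ||J||_p for the pattern at x lower-bounds the local Lipschitz
# constant over a region; the exact constant is the max over all
# patterns the region can realize.
for p in (1, 2, np.inf):
    print(f"||J||_{p} = {np.linalg.norm(J, ord=p):.4f}")
```

Note that `np.linalg.norm` with `ord` equal to 1, 2, or `inf` gives the induced matrix norm (max column sum, spectral norm, max row sum respectively), matching the p-norms the paper's bounds apply to.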


Related research

research, 11/02/2021
Training Certifiably Robust Neural Networks with Efficient Local Lipschitz Bounds
Certified robustness is a desirable property for deep neural networks in...

research, 02/10/2020
Polynomial Optimization for Bounding Lipschitz Constants of Deep Networks
The Lipschitz constant of a network plays an important role in many appl...

research, 06/12/2019
Efficient and Accurate Estimation of Lipschitz Constants for Deep Neural Networks
Tight estimation of the Lipschitz constant for deep neural networks (DNN...

research, 09/29/2020
Lipschitz neural networks are dense in the set of all Lipschitz functions
This note shows that, for a fixed Lipschitz constant L > 0, one layer ne...

research, 03/02/2020
Exactly Computing the Local Lipschitz Constant of ReLU Networks
The Lipschitz constant of a neural network is a useful metric for provab...

research, 01/17/2020
Deep Neural Networks with Trainable Activations and Controlled Lipschitz Constant
We introduce a variational framework to learn the activation functions o...

research, 10/28/2018
RecurJac: An Efficient Recursive Algorithm for Bounding Jacobian Matrix of Neural Networks and Its Applications
The Jacobian matrix (or the gradient for single-output networks) is dire...
