Efficient and Accurate Estimation of Lipschitz Constants for Deep Neural Networks

06/12/2019
by   Mahyar Fazlyab, et al.

Tight estimation of the Lipschitz constant for deep neural networks (DNNs) is useful in many applications, ranging from robustness certification of classifiers to stability analysis of closed-loop systems with reinforcement learning controllers. Existing methods for estimating the Lipschitz constant suffer from either poor accuracy or poor scalability. In this paper, we present a convex optimization framework to compute guaranteed upper bounds on the Lipschitz constant of DNNs both accurately and efficiently. Our main idea is to interpret activation functions as gradients of convex potential functions; as such, they satisfy certain properties that can be described by quadratic constraints. This description allows us to pose the Lipschitz constant estimation problem as a semidefinite program (SDP). The resulting SDP can be adapted to increase either the estimation accuracy (by capturing the interaction between activation functions of different layers) or the scalability (by decomposition and parallel implementation). We illustrate the utility of our approach with a variety of experiments on randomly generated networks and on classifiers trained on the MNIST and Iris datasets. In particular, we experimentally demonstrate that our Lipschitz bounds are tighter than those obtained by existing methods in the literature. We also study the impact of adversarial training methods on the Lipschitz bounds of the resulting classifiers, and show that our bounds can be used to efficiently provide robustness guarantees.
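For context, the simplest guaranteed upper bound on the Lipschitz constant of a feed-forward network with 1-Lipschitz activations (e.g. ReLU) is the product of the layers' spectral norms; the SDP-based bound described in the abstract is designed to be tighter than this naive baseline. The sketch below (an illustration, not the paper's method; the function name and the random 2-layer network are made up for the example) computes that naive bound with NumPy:

```python
import numpy as np

def naive_lipschitz_bound(weights):
    """Trivial Lipschitz upper bound for a feed-forward network with
    1-Lipschitz activations: the product of the layers' spectral norms.
    (The paper's SDP-based bound is generally tighter than this.)"""
    # ord=2 on a matrix gives its largest singular value (spectral norm)
    return float(np.prod([np.linalg.norm(W, 2) for W in weights]))

# Hypothetical example: a random 2-layer network, 8 -> 16 -> 4
rng = np.random.default_rng(0)
W1 = rng.standard_normal((16, 8))
W2 = rng.standard_normal((4, 16))
bound = naive_lipschitz_bound([W1, W2])
```

The bound is valid because composition of Lipschitz maps multiplies their constants, but it ignores how activations of different layers interact, which is exactly the slack the SDP formulation exploits.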


Related research

05/12/2021 | LipBaB: Computing exact Lipschitz constant of ReLU networks
The Lipschitz constant of neural networks plays an important role in sev...

12/10/2020 | Certifying Incremental Quadratic Constraints for Neural Networks
Abstracting neural networks with constraints they impose on their inputs...

01/03/2022 | Neural network training under semidefinite constraints
This paper is concerned with the training of neural networks (NNs) under...

12/24/2019 | An Analysis of the Expressiveness of Deep Neural Network Architectures Based on Their Lipschitz Constants
Deep neural networks (DNNs) have emerged as a popular mathematical tool ...

10/05/2020 | Lipschitz Bounded Equilibrium Networks
This paper introduces new parameterizations of equilibrium neural networ...

04/13/2018 | Representing smooth functions as compositions of near-identity functions with implications for deep network optimization
We show that any smooth bi-Lipschitz h can be represented exactly as a c...

03/23/2021 | CLIP: Cheap Lipschitz Training of Neural Networks
Despite the large success of deep neural networks (DNN) in recent years,...
