Exactly Computing the Local Lipschitz Constant of ReLU Networks

03/02/2020
by   Matt Jordan, et al.

The Lipschitz constant of a neural network is a useful metric for provable robustness and generalization. We present a novel analytic result relating gradient norms to Lipschitz constants for nondifferentiable functions. Next, we prove hardness and inapproximability results for computing the local Lipschitz constant of ReLU neural networks. We then develop a mixed-integer programming formulation to exactly compute the local Lipschitz constant for both scalar- and vector-valued networks. Finally, we apply our technique to networks trained on synthetic datasets and MNIST, drawing observations about the tightness of competing Lipschitz estimators and the effects of regularized training on Lipschitz constants.
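To make the connection between gradient norms and local Lipschitz constants concrete: for a ReLU network, the local Lipschitz constant over a region is the supremum of the gradient norm over that region, so the maximum gradient norm observed at sampled points gives a lower bound. The sketch below (a hypothetical two-layer scalar-valued network, not the paper's code) illustrates this sampling baseline; exact methods such as the paper's mixed-integer program close the gap to the true supremum.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical two-layer scalar-valued ReLU network:
#   f(x) = w2 . relu(W1 @ x + b1)
W1 = rng.normal(size=(8, 2))
b1 = rng.normal(size=8)
w2 = rng.normal(size=8)

def grad_f(x):
    """Gradient of f at x (valid wherever no pre-activation is exactly zero)."""
    pre = W1 @ x + b1
    active = (pre > 0).astype(float)   # ReLU activation pattern at x
    return (w2 * active) @ W1          # chain rule: w2^T diag(active) W1

def lipschitz_lower_bound(center, radius, n_samples=10000):
    """Max sampled gradient L2-norm over a ball: a lower bound on the
    local Lipschitz constant of f over that ball."""
    best = 0.0
    for _ in range(n_samples):
        d = rng.normal(size=center.shape)
        x = center + radius * rng.uniform() * d / np.linalg.norm(d)
        best = max(best, np.linalg.norm(grad_f(x)))
    return best

print(lipschitz_lower_bound(np.zeros(2), 0.5))
```

Note the sampled bound can never exceed the coarse product upper bound ||w2|| * ||W1||_2, which is one way to sanity-check an estimator.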

Related research:

LipBaB: Computing exact Lipschitz constant of ReLU networks (05/12/2021)
Infinity-Laplacians on Scalar- and Vector-Valued Functions and Optimal Lipschitz Extensions on Graphs (10/25/2019)
Lipschitz constant estimation of Neural Networks via sparse polynomial optimization (04/18/2020)
Polynomial Optimization for Bounding Lipschitz Constants of Deep Networks (02/10/2020)
A Quantitative Geometric Approach to Neural Network Smoothness (03/02/2022)
Sparsest Univariate Learning Models Under Lipschitz Constraint (12/27/2021)
