Efficiently Computing Local Lipschitz Constants of Neural Networks via Bound Propagation

10/13/2022
by Zhouxing Shi, et al.

Lipschitz constants are connected to many properties of neural networks, such as robustness, fairness, and generalization. Existing methods for computing Lipschitz constants either produce relatively loose upper bounds or are limited to small networks. In this paper, we develop an efficient framework for computing the ℓ_∞ local Lipschitz constant of a neural network by tightly upper bounding the norm of the Clarke Jacobian via linear bound propagation. We formulate the computation of local Lipschitz constants as a linear bound propagation process on a high-order backward graph induced by the chain rule of the Clarke Jacobian. To enable linear bound propagation, we derive tight linear relaxations for specific nonlinearities in the Clarke Jacobian. This formulation unifies existing ad-hoc approaches such as RecurJac, which can be seen as a special case of ours with weaker relaxations. The bound propagation framework also allows us to easily borrow the popular Branch-and-Bound (BaB) approach from neural network verification to further tighten Lipschitz constants. Experiments show that on tiny models, our method produces bounds comparable to those of exact methods, which cannot scale to even slightly larger models; on larger models, our method efficiently produces tighter results than existing relaxed or naive methods, and it scales to much larger practical models that previous works could not handle. We also demonstrate an application to provable monotonicity analysis. Code is available at https://github.com/shizhouxing/Local-Lipschitz-Constants.
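To make the core idea concrete, below is a minimal NumPy sketch of the kind of bound the paper tightens. For a two-layer ReLU network f(x) = W2 ReLU(W1 x + b1) + b2, the Clarke Jacobian over the region ||x - x0||_∞ ≤ ε is W2 diag(d) W1, where d_i = 1 for surely-active neurons, d_i = 0 for surely-inactive neurons, and d_i ∈ [0, 1] for unstable ones. Propagating these intervals through the product and taking the largest absolute row sum yields a valid (but loose, interval-style) upper bound on the local ℓ_∞ Lipschitz constant, of the "naive" kind the paper compares against; the paper's method replaces such intervals with tighter linear relaxations and branch-and-bound. All function and variable names here are illustrative, not taken from the released code.

    # Interval-arithmetic baseline for the local l_inf Lipschitz constant of a
    # 2-layer ReLU network over ||x - x0||_inf <= eps. A hedged sketch, not the
    # paper's linear-bound-propagation method.
    import numpy as np

    def relu_derivative_interval(W1, b1, x0, eps):
        # Bounds [d_lo, d_hi] on the Clarke derivative of each ReLU at the
        # pre-activations z = W1 @ x + b1, for all x with ||x - x0||_inf <= eps.
        center = W1 @ x0 + b1
        radius = eps * np.abs(W1).sum(axis=1)  # interval arithmetic for W1 @ x
        z_lo, z_hi = center - radius, center + radius
        d_lo = (z_lo > 0).astype(float)  # 1 only for surely-active neurons
        d_hi = (z_hi > 0).astype(float)  # 1 for possibly-active neurons
        return d_lo, d_hi                # unstable neurons get the interval [0, 1]

    def local_lipschitz_upper_bound(W1, b1, W2, x0, eps):
        # Upper-bounds the induced l_inf norm max_j sum_k |J[j, k]| of the
        # Clarke Jacobian J = W2 @ diag(d) @ W1 over the input region.
        d_lo, d_hi = relu_derivative_interval(W1, b1, x0, eps)
        # c[j, i, k] = W2[j, i] * W1[i, k]; each entry J[j, k] = sum_i c[j, i, k] * d_i.
        c = W2[:, :, None] * W1[None, :, :]
        lo = np.minimum(c * d_lo[None, :, None], c * d_hi[None, :, None])
        hi = np.maximum(c * d_lo[None, :, None], c * d_hi[None, :, None])
        J_lo, J_hi = lo.sum(axis=1), hi.sum(axis=1)
        entry_abs = np.maximum(np.abs(J_lo), np.abs(J_hi))
        return entry_abs.sum(axis=1).max()

    rng = np.random.default_rng(0)
    W1, b1 = rng.standard_normal((8, 4)), rng.standard_normal(8)
    W2 = rng.standard_normal((2, 8))
    x0 = rng.standard_normal(4)
    print(local_lipschitz_upper_bound(W1, b1, W2, x0, eps=0.1))

A branch-and-bound step in the spirit of the paper would split one unstable neuron's d_i into the two cases {0} and {1}, bound each subproblem with the same routine, and take the maximum of the two results, which can only tighten the bound.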



Related research

11/22/2019 · Bounding Singular Values of Convolution Layers
In deep neural networks, the spectral norm of the Jacobian of a layer bo...

10/28/2018 · RecurJac: An Efficient Recursive Algorithm for Bounding Jacobian Matrix of Neural Networks and Its Applications
The Jacobian matrix (or the gradient for single-output networks) is dire...

07/01/2021 · Boosting Certified ℓ_∞ Robustness with EMA Method and Ensemble Model
The neural network with 1-Lipschitz property based on ℓ_∞-dist neuron ha...

08/08/2022 · Robust Training and Verification of Implicit Neural Networks: A Non-Euclidean Contractive Approach
This paper proposes a theoretical and computational framework for traini...

06/08/2020 · The Lipschitz Constant of Self-Attention
Lipschitz constants of neural networks have been explored in various con...

05/28/2018 · Lipschitz regularity of deep neural networks: analysis and efficient estimation
Deep neural networks are notorious for being sensitive to small well-cho...

08/16/2022 · On Optimizing Back-Substitution Methods for Neural Network Verification
With the increasing application of deep learning in mission-critical sys...
