
Lagrangian Decomposition for Neural Network Verification

by Rudy Bunel, et al.

A fundamental component of neural network verification is the computation of bounds on the values their outputs can take. Previous methods have either used off-the-shelf solvers, discarding the problem structure, or relaxed the problem even further, making the bounds unnecessarily loose. We propose a novel approach based on Lagrangian Decomposition. Our formulation admits an efficient supergradient ascent algorithm, as well as an improved proximal algorithm. Both algorithms offer three advantages: (i) they yield bounds that are provably at least as tight as previous dual algorithms relying on Lagrangian relaxations; (ii) they are based on operations analogous to the forward/backward passes of neural network layers, and are therefore easily parallelizable, amenable to GPU implementation, and able to exploit the convolutional structure of problems; and (iii) they allow for anytime stopping while still providing valid bounds. Empirically, we show that we obtain bounds comparable with off-the-shelf solvers in a fraction of their running time, and obtain tighter bounds in the same time as previous dual algorithms. This results in an overall speed-up when employing the bounds for formal verification.
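To make the idea concrete, here is a minimal toy sketch (not the paper's actual per-layer formulation) of Lagrangian decomposition with supergradient ascent. We lower-bound p* = min over x in [-2, 2] of (x-1)^2 + |x| (whose true optimum is 0.75 at x = 0.5) by splitting x into two copies linked by an equality constraint, dualising that constraint with a multiplier rho, and solving each subproblem in closed form. All function and variable names here are illustrative choices, not from the paper; the example only demonstrates the two properties claimed in the abstract: every dual iterate yields a valid (anytime) lower bound, and the subproblems decouple.

```python
# Toy Lagrangian decomposition: p* = min_{x in [-2,2]} (x-1)^2 + |x|.
# Split x into copies (x, y) with constraint x = y, dualise it with rho:
#   g(rho) = min_x [(x-1)^2 + rho*x] + min_y [|y| - rho*y],
# so g(rho) <= p* for every rho (weak duality), and each subproblem is
# solvable in closed form -- mirroring how the paper's per-layer
# subproblems decouple once the copy constraints are dualised.

def dual_value(rho):
    """Evaluate g(rho); return its value and the two subproblem minimisers."""
    # Subproblem 1: quadratic; unconstrained minimiser 1 - rho/2, clipped to box.
    x = min(2.0, max(-2.0, 1.0 - rho / 2.0))
    val_x = (x - 1.0) ** 2 + rho * x
    # Subproblem 2: piecewise linear in y; minimiser sits at 0 or a box corner.
    if rho > 1.0:
        y = 2.0
    elif rho < -1.0:
        y = -2.0
    else:
        y = 0.0
    val_y = abs(y) - rho * y
    return val_x + val_y, x, y

def supergradient_ascent(steps=200):
    """Maximise g(rho) by supergradient ascent with diminishing step sizes.

    Every iterate's g(rho) is a valid lower bound on p*, so the loop can be
    stopped at any time while still returning a sound bound."""
    rho, best = 0.0, float("-inf")
    for t in range(steps):
        g, x, y = dual_value(rho)
        best = max(best, g)                       # anytime-valid lower bound
        rho += 0.5 / (t + 1) ** 0.5 * (x - y)     # supergradient of g is x - y
    return best
```

With 200 steps the bound closes most of the gap to the true optimum of 0.75; in the paper's setting the same ascent structure is applied to multipliers on the layer-coupling constraints of a network, with the subproblem solves implemented as GPU-friendly forward/backward-style passes.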


Code Repositories

- Scaling the convex barrier for piecewise-linear neural network verification
- Dual iterative algorithms for Neural Network output bounds computations