Enabling certification of verification-agnostic networks via memory-efficient semidefinite programming

by Sumanth Dathathri et al.

Convex relaxations have emerged as a promising approach for verifying desirable properties of neural networks, such as robustness to adversarial perturbations. Widely used Linear Programming (LP) relaxations only work well when networks are trained to facilitate verification. This precludes applications that involve verification-agnostic networks, i.e., networks not specially trained for verification. On the other hand, semidefinite programming (SDP) relaxations have successfully been applied to verification-agnostic networks, but they do not currently scale beyond small networks due to poor time and space asymptotics. In this work, we propose a first-order dual SDP algorithm that (1) requires memory only linear in the total number of network activations and (2) requires only a fixed number of forward/backward passes through the network per iteration. By exploiting iterative eigenvector methods, we express all solver operations in terms of forward and backward passes through the network, enabling efficient use of hardware like GPUs/TPUs. For two verification-agnostic networks on MNIST and CIFAR-10, we significantly improve L-infinity verified robust accuracy from 1% to 88% and from 6% to 40%, respectively. We also demonstrate tight verification of a quadratic stability specification for the decoder of a variational autoencoder.
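The "iterative eigenvector methods" mentioned above rely only on matrix-vector products, which is what lets the solver work without ever materializing the SDP matrix. As a minimal sketch of that idea, the power-iteration routine below estimates the dominant eigenvalue of a symmetric operator given only a `matvec` callback; in the paper's setting that callback would be implemented via forward/backward passes through the network, but here a small explicit matrix stands in for it. The function name and interface are illustrative, not the authors' actual API.

```python
import numpy as np

def power_iteration(matvec, dim, iters=100, seed=0):
    """Estimate the largest-magnitude eigenvalue and eigenvector of a
    symmetric linear operator, accessed only through matvec(v).

    This is the matrix-free pattern the abstract alludes to: memory is
    O(dim), and each iteration costs one operator application (one
    forward/backward pass in the network setting).
    """
    rng = np.random.default_rng(seed)
    v = rng.standard_normal(dim)
    v /= np.linalg.norm(v)
    for _ in range(iters):
        w = matvec(v)
        v = w / np.linalg.norm(w)
    lam = v @ matvec(v)  # Rayleigh quotient at the converged direction
    return lam, v

# Toy stand-in for the dual SDP operator: a small symmetric matrix.
A = np.array([[2.0, 1.0],
              [1.0, 3.0]])
lam, v = power_iteration(lambda x: A @ x, dim=2)
```

In practice one would use a Lanczos-type method for faster convergence, but the resource profile is the same: only vectors of the problem dimension are stored, never the full matrix.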


Safety Verification and Robustness Analysis of Neural Networks via Quadratic Constraints and Semidefinite Programming

Analyzing the robustness of neural networks against norm-bounded uncerta...

Beta-CROWN: Efficient Bound Propagation with Per-neuron Split Constraints for Complete and Incomplete Neural Network Verification

Recent works in neural network verification show that cheap incomplete v...

A Convex Relaxation Barrier to Tight Robustness Verification of Neural Networks

Verification of neural networks enables us to gauge their robustness aga...

Fast and Complete: Enabling Complete Neural Network Verification with Rapid and Massively Parallel Incomplete Verifiers

Formal verification of neural networks (NNs) is a challenging and import...

Parametric Chordal Sparsity for SDP-based Neural Network Verification

Many future technologies rely on neural networks, but verifying the corr...

Verified Self-Explaining Computation

Common programming tools, like compilers, debuggers, and IDEs, crucially...