Certifying Incremental Quadratic Constraints for Neural Networks

12/10/2020
by Navid Hashemi, et al.

Abstracting neural networks by the constraints they impose on their inputs and outputs is useful both for analyzing neural network classifiers and for deriving optimization-based algorithms that certify the stability and robustness of feedback systems involving neural networks. In this paper, we propose a convex program, in the form of a Linear Matrix Inequality (LMI), that certifies incremental quadratic constraints on the map of a neural network over a region of interest. These certificates capture several useful properties, such as (local) Lipschitz continuity, one-sided Lipschitz continuity, invertibility, and contraction. We illustrate the utility of our approach in two settings. First, we develop a semidefinite program to compute guaranteed, sharp upper bounds on the local Lipschitz constant of neural networks, and we illustrate the results on random networks as well as networks trained on MNIST. Second, we consider a linear time-invariant system in feedback with an approximate model predictive controller parameterized by a neural network. We cast the stability analysis as a semidefinite feasibility program and estimate an ellipsoidal invariant set for the closed-loop system.
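As context for the Lipschitz-bound application, the sketch below shows the trivial upper bound on a ReLU network's Lipschitz constant, the product of the layer weights' spectral norms, which semidefinite-programming approaches such as the one in this paper sharpen. The network shape and weights here are arbitrary illustrative choices, not from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
# Hypothetical 2-layer ReLU network: x -> W2 @ relu(W1 @ x)
W1 = rng.standard_normal((8, 4))
W2 = rng.standard_normal((3, 8))

def net(x):
    return W2 @ np.maximum(W1 @ x, 0.0)

# Naive global Lipschitz bound: product of spectral norms.
# Valid because ReLU is 1-Lipschitz; typically loose, which is
# what motivates the sharper SDP-based certificates.
naive_bound = np.linalg.norm(W1, 2) * np.linalg.norm(W2, 2)

# Empirical lower estimate from random input pairs (sanity check):
pairs = rng.standard_normal((1000, 2, 4))
emp = max(np.linalg.norm(net(a) - net(b)) / np.linalg.norm(a - b)
          for a, b in pairs)
assert emp <= naive_bound  # the bound must dominate any observed ratio
```

Any gap between `emp` and `naive_bound` is the slack that a tighter, SDP-certified local Lipschitz constant can close.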


