Lipschitz constant estimation for 1D convolutional neural networks

11/28/2022
by Patricia Pauli, et al.

In this work, we propose a dissipativity-based method for estimating the Lipschitz constant of 1D convolutional neural networks (CNNs). In particular, we analyze the dissipativity properties of convolutional, pooling, and fully connected layers, using incremental quadratic constraints for the nonlinear activation functions and pooling operations. The Lipschitz constant of the composition of these mappings is then estimated by solving a semidefinite program derived from dissipativity theory. To make the method as efficient as possible, we exploit the structure of convolutional layers, realizing these finite impulse response filters as causal dynamical systems in state space and carrying out the dissipativity analysis on the state-space realizations. The examples we provide show that the resulting Lipschitz bounds are advantageous in terms of both accuracy and scalability.
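As a rough illustration of the general idea (and not the paper's state-space formulation for convolutional and pooling layers), the sketch below bounds the Lipschitz constant of a toy one-hidden-layer fully connected ReLU network by solving a LipSDP-type semidefinite program in cvxpy. The incremental quadratic constraint used is that ReLU is slope-restricted on [0, 1]; all layer sizes and weights are hypothetical.

```python
# Minimal sketch of an SDP-based Lipschitz bound for a one-hidden-layer
# network f(x) = W1 @ relu(W0 @ x) (biases omitted; they do not affect
# the Lipschitz constant). Not the paper's convolutional formulation.
import numpy as np
import cvxpy as cp

rng = np.random.default_rng(0)
n_in, n_hidden, n_out = 4, 16, 2              # hypothetical layer sizes
W0 = rng.standard_normal((n_hidden, n_in))    # first-layer weights
W1 = rng.standard_normal((n_out, n_hidden))   # output-layer weights

rho = cp.Variable(nonneg=True)                      # rho = L^2 (squared bound)
T = cp.diag(cp.Variable(n_hidden, nonneg=True))     # diagonal IQC multiplier

# Linear matrix inequality certifying ||f(x) - f(y)|| <= sqrt(rho) * ||x - y||
# for activations slope-restricted on [0, 1].
M = cp.bmat([
    [-rho * np.eye(n_in), W0.T @ T],
    [T @ W0, -2 * T + W1.T @ W1],
])
prob = cp.Problem(cp.Minimize(rho), [M << 0])
prob.solve()
print("certified Lipschitz bound:", np.sqrt(rho.value))
```

For the 1D CNNs considered in the paper, the convolutional layers are additionally realized as causal state-space systems before the dissipativity analysis is set up, which, per the abstract, is what yields the gains in accuracy and scalability over treating the convolutions as large dense matrices.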


