Lipschitz-bounded 1D convolutional neural networks using the Cayley transform and the controllability Gramian

03/20/2023
by   Patricia Pauli, et al.

We establish a layer-wise parameterization for 1D convolutional neural networks (CNNs) with built-in end-to-end robustness guarantees, using the Lipschitz constant of the input-output mapping of the CNN as the robustness measure. Our parameterization is based on the Cayley transform, which parameterizes orthogonal matrices, and on the controllability Gramian of the state-space representation of the convolutional layers. By design, the parameterization satisfies linear matrix inequalities that are sufficient for Lipschitz continuity of the CNN, which in turn enables unconstrained training of Lipschitz-bounded 1D CNNs. Finally, we train Lipschitz-bounded 1D CNNs for the classification of heart arrhythmia data and demonstrate their improved robustness.
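To illustrate the first building block mentioned above, here is a minimal sketch of the Cayley transform, which maps any skew-symmetric matrix to an orthogonal matrix. This is a generic textbook construction, not the paper's layer-wise parameterization; the function name `cayley` and the example matrices are illustrative only.

```python
import numpy as np

def cayley(A):
    """Cayley transform: map a skew-symmetric A to Q = (I - A)(I + A)^{-1}.

    For skew-symmetric A, (I + A) is always invertible (its eigenvalues
    are 1 + i*lambda with real lambda), and the resulting Q is orthogonal.
    """
    I = np.eye(A.shape[0])
    return (I - A) @ np.linalg.inv(I + A)

rng = np.random.default_rng(0)
M = rng.standard_normal((4, 4))
A = M - M.T                      # any M yields a skew-symmetric A
Q = cayley(A)

# Orthogonality check: Q^T Q = I up to numerical precision
print(np.allclose(Q.T @ Q, np.eye(4)))  # True
```

Because the map from an unconstrained matrix `M` to the orthogonal `Q` is smooth, such parameterizations can be trained with standard gradient-based optimizers without explicit orthogonality constraints, which is the same mechanism that enables the unconstrained training described in the abstract.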


