A Unified Algebraic Perspective on Lipschitz Neural Networks

03/06/2023
by Alexandre Araujo, et al.

Important research efforts have focused on designing and training neural networks with a controlled Lipschitz constant, with the goal of increasing, and sometimes guaranteeing, robustness against adversarial attacks. Recent promising techniques draw inspiration from different backgrounds to design 1-Lipschitz neural networks: to name a few, convex potential layers derive from the discretization of continuous dynamical systems, while Almost-Orthogonal-Layers (AOL) propose a tailored method for matrix rescaling. It is therefore important to consider these recent and promising contributions under a common theoretical lens in order to design new and improved layers. This paper introduces a novel algebraic perspective that unifies various types of 1-Lipschitz neural networks, including the ones mentioned above as well as methods based on orthogonality and spectral techniques. Interestingly, we show that many existing techniques can be derived and generalized by finding analytical solutions of a common semidefinite programming (SDP) condition. We also prove that AOL biases the scaled weight matrices toward the set of orthogonal matrices in a precise mathematical sense. Moreover, our algebraic condition, combined with the Gershgorin circle theorem, readily leads to new and diverse parameterizations of 1-Lipschitz network layers. Our approach, called SDP-based Lipschitz Layers (SLL), allows us to design non-trivial yet efficient generalizations of convex potential layers. Finally, a comprehensive set of experiments on image classification shows that SLLs outperform previous approaches on certified robust accuracy. Code is available at https://github.com/araujoalexandre/Lipschitz-SLL-Networks.
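To make the construction concrete, below is a minimal PyTorch sketch of a dense layer in the spirit the abstract describes: a convex-potential-style residual update whose diagonal rescaling T is chosen via the Gershgorin circle theorem so that the SDP condition is satisfied by construction. This is a hedged illustration, not the authors' code: the class name SLLDenseSketch, the use of ReLU, and the log-scale parameterization of the positive weights q are assumptions introduced here; consult the linked repository for the actual implementation.

import torch
import torch.nn as nn
import torch.nn.functional as F


class SLLDenseSketch(nn.Module):
    """Hypothetical dense SDP-based Lipschitz layer (a sketch, not the authors' code).

    Implements the residual map
        f(x) = x - 2 W T^{-1} sigma(W^T x + b),
    where T is diagonal with Gershgorin-style entries
        t_i = sum_j |W^T W|_{ij} * q_j / q_i,   q > 0,
    a choice that keeps f 1-Lipschitz for a 1-Lipschitz monotone sigma (here ReLU),
    with no spectral-norm computation or projection step during training.
    """

    def __init__(self, features: int):
        super().__init__()
        self.weight = nn.Parameter(torch.empty(features, features))
        self.bias = nn.Parameter(torch.zeros(features))
        # Positive scaling vector q, parameterized on the log scale (assumption).
        self.log_q = nn.Parameter(torch.zeros(features))
        nn.init.orthogonal_(self.weight)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        w = self.weight
        q = torch.exp(self.log_q)
        # Gershgorin-style diagonal: t_i = sum_j |W^T W|_{ij} q_j / q_i.
        wtw_abs = torch.abs(w.t() @ w)
        t = (wtw_abs * q.unsqueeze(0) / q.unsqueeze(1)).sum(dim=1).clamp_min(1e-12)
        # Residual update: x - 2 W T^{-1} relu(W^T x + b), in row-vector convention.
        h = F.relu(F.linear(x, w.t(), self.bias)) / t
        return x - 2.0 * F.linear(h, w)


if __name__ == "__main__":
    torch.manual_seed(0)
    layer = SLLDenseSketch(64)
    # Empirical sanity check: ratios ||f(x) - f(y)|| / ||x - y|| should stay <= 1.
    x, y = torch.randn(256, 64), torch.randn(256, 64)
    with torch.no_grad():
        ratio = (layer(x) - layer(y)).norm(dim=1) / (x - y).norm(dim=1)
    print(f"max empirical Lipschitz ratio: {ratio.max().item():.4f}")

The final check only probes random input pairs; the actual guarantee comes from the SDP condition, which the Gershgorin choice of T satisfies for any weight values, which is what makes this family of layers cheap to train compared to orthogonality-constrained alternatives.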


Related research

08/05/2022 · Almost-Orthogonal Layers for Efficient General-Purpose Lipschitz Networks
11/15/2022 · Improved Techniques for Deterministic l2 Robustness
10/25/2021 · Scalable Lipschitz Residual Networks with Convex Potential Flows
10/05/2022 · Dynamical Systems' Based Neural Networks
05/25/2023 · Efficient Bound of Lipschitz Constant for Convolutional Layers by Gram Iteration
10/28/2018 · RecurJac: An Efficient Recursive Algorithm for Bounding Jacobian Matrix of Neural Networks and Its Applications
10/05/2020 · Lipschitz Bounded Equilibrium Networks
