Preventing Gradient Attenuation in Lipschitz Constrained Convolutional Networks

11/03/2019
by Qiyang Li, et al.

Lipschitz constraints under L2 norm on deep neural networks are useful for provable adversarial robustness bounds, stable training, and Wasserstein distance estimation. While heuristic approaches such as the gradient penalty have seen much practical success, it is challenging to achieve similar practical performance while provably enforcing a Lipschitz constraint. In principle, one can design Lipschitz constrained architectures using the composition property of Lipschitz functions, but Anil et al. recently identified a key obstacle to this approach: gradient norm attenuation. They showed how to circumvent this problem in the case of fully connected networks by designing each layer to be gradient norm preserving. We extend their approach to train scalable, expressive, provably Lipschitz convolutional networks. In particular, we present the Block Convolution Orthogonal Parameterization (BCOP), an expressive parameterization of orthogonal convolution operations. We show that even though the space of orthogonal convolutions is disconnected, the largest connected component of BCOP with 2n channels can represent arbitrary BCOP convolutions over n channels. Our BCOP parameterization allows us to train large convolutional networks with provable Lipschitz bounds. Empirically, we find that it is competitive with existing approaches to provable adversarial robustness and Wasserstein distance estimation.
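The key property behind the abstract's "gradient norm preserving" layers is that an orthogonal linear map preserves L2 norms exactly, both in the forward pass and for gradients flowing backward through its transpose. The sketch below is only an illustration of that property for a dense orthogonal matrix (built via QR decomposition), not the BCOP convolution parameterization itself:

```python
import numpy as np

rng = np.random.default_rng(0)

# Build a random orthogonal matrix Q via QR decomposition of a Gaussian matrix.
A = rng.normal(size=(16, 16))
Q, _ = np.linalg.qr(A)

# Forward pass: an orthogonal map preserves the L2 norm of its input,
# so a layer with orthogonal weights is exactly 1-Lipschitz.
x = rng.normal(size=16)
assert np.isclose(np.linalg.norm(Q @ x), np.linalg.norm(x))

# Backward pass: gradients propagate through Q^T, which is also orthogonal,
# so gradient norms are preserved and cannot attenuate through depth.
g = rng.normal(size=16)
assert np.isclose(np.linalg.norm(Q.T @ g), np.linalg.norm(g))
```

By the composition property, stacking such norm-preserving layers (with norm-preserving activations such as GroupSort) keeps the whole network 1-Lipschitz without the gradient attenuation that looser per-layer bounds incur.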


Related research

- 11/13/2018: Sorting out Lipschitz function approximation
- 03/16/2022: On the sensitivity of pose estimation neural networks: rotation parameterizations, Lipschitz constants, and provable bounds
- 05/24/2021: Skew Orthogonal Convolutions
- 01/27/2023: Direct Parameterization of Lipschitz-Bounded Deep Networks
- 08/05/2021: Householder Activations for Provable Robustness against Adversarial Attacks
- 06/05/2020: Lipschitz Bounds and Provably Robust Training by Laplacian Smoothing
- 07/06/2021: Provable Lipschitz Certification for Generative Models
