A Quantitative Geometric Approach to Neural Network Smoothness

03/02/2022
by Zi Wang, et al.

Fast and precise Lipschitz constant estimation of neural networks is an important task for deep learning. Researchers have recently found an intrinsic trade-off between the accuracy and smoothness of neural networks, so training a network with a loose Lipschitz constant estimate imposes strong regularization and can significantly hurt model accuracy. In this work, we provide a unified theoretical framework, a quantitative geometric approach, to address Lipschitz constant estimation. By adopting this framework, we immediately obtain several theoretical results, including the computational hardness of Lipschitz constant estimation and its approximability. Furthermore, the quantitative geometric perspective provides insight into the recent empirical observation that techniques developed for one norm usually do not transfer to another. We implement the algorithms induced by this quantitative geometric approach in a tool, GeoLIP; these algorithms are based on semidefinite programming (SDP). Our empirical evaluation demonstrates that GeoLIP is more scalable and precise than existing tools for Lipschitz constant estimation under ℓ_∞-perturbations. Furthermore, we show its intricate relations with other recent SDP-based techniques, both theoretically and empirically. We believe that this unified quantitative geometric perspective can bring new insights and theoretical tools to the investigation of neural-network smoothness and robustness.
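
The abstract notes that the paper's algorithms are based on semidefinite programming (SDP). As a rough illustration of what an SDP-based Lipschitz bound looks like, the sketch below uses the well-known LipSDP-style ℓ_2 condition (Fazlyab et al.), not GeoLIP's ℓ_∞ formulation; the network sizes, random weights, and solver choice are hypothetical. It minimizes ρ subject to a block matrix being negative semidefinite, where a nonnegative diagonal multiplier T encodes the [0, 1] slope restriction of ReLU; √ρ then upper-bounds the ℓ_2 Lipschitz constant of a one-hidden-layer network f(x) = W1·ReLU(W0·x).

    # Illustrative sketch only (not GeoLIP): LipSDP-style l2 Lipschitz upper bound
    # for a toy one-hidden-layer ReLU network. All sizes and weights are hypothetical.
    import numpy as np
    import cvxpy as cp

    rng = np.random.default_rng(0)
    n_in, n_hidden, n_out = 4, 8, 3
    W0 = rng.standard_normal((n_hidden, n_in))   # input -> hidden weights
    W1 = rng.standard_normal((n_out, n_hidden))  # hidden -> output weights

    rho = cp.Variable(nonneg=True)               # rho = L^2, the squared Lipschitz bound
    t = cp.Variable(n_hidden, nonneg=True)       # diagonal multipliers for ReLU slopes in [0, 1]
    T = cp.diag(t)

    # Symmetric block matrix M(rho, T); M << 0 certifies sqrt(rho) as an l2 Lipschitz bound.
    M = cp.bmat([
        [-rho * np.eye(n_in), W0.T @ T],
        [T @ W0,              -2 * T + W1.T @ W1],
    ])
    prob = cp.Problem(cp.Minimize(rho), [M << 0])
    prob.solve(solver=cp.SCS)                    # SCS ships with cvxpy and handles SDP cones

    print("certified l2 Lipschitz upper bound:", float(np.sqrt(rho.value)))

For ℓ_∞-perturbations, which are the focus of GeoLIP, the relevant operator norms change and a different SDP formulation is needed; this is precisely the point of the paper's observation that techniques developed for one norm do not usually transfer to another.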

