Some Fundamental Aspects about Lipschitz Continuity of Neural Network Functions

02/21/2023
by Grigory Khromov, et al.

Lipschitz continuity is a simple yet pivotal functional property of any predictive model, lying at the core of its robustness, generalisation, and adversarial vulnerability. Our aim is to thoroughly investigate and characterise the Lipschitz behaviour of the functions learned by neural networks. Despite significant tightening of the bounds in recent years, precisely estimating the Lipschitz constant remains a practical challenge, and tight theoretical analyses, similarly, remain intractable. We therefore shift our perspective and instead attempt to uncover insights about the nature of the Lipschitz constant of neural network functions by relying on the simplest and most general upper and lower bounds. We carry out an empirical investigation across a range of settings (architectures, losses, optimisers, label noise, etc.), which reveals several fundamental and intriguing traits of the Lipschitz continuity of neural network functions. In particular, we identify a remarkable double descent trend in both upper and lower bounds on the Lipschitz constant, which aligns tightly with the typical double descent trend in the test loss.
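The "simplest and most general" bounds the abstract refers to are commonly taken to be the product of layer-wise spectral norms (an upper bound) and the largest observed input-gradient norm (a lower bound). The sketch below illustrates both for a hypothetical two-layer ReLU network; the weights, sizes, and sampling scheme are illustrative assumptions, not the paper's exact setup.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical 2-layer ReLU network: f(x) = W2 @ relu(W1 @ x)
W1 = rng.standard_normal((16, 8))
W2 = rng.standard_normal((1, 16))

# Upper bound: product of the layers' spectral norms
# (largest singular values). Holds for any 1-Lipschitz activation.
upper = np.linalg.norm(W1, 2) * np.linalg.norm(W2, 2)

def grad_norm(x):
    """Spectral norm of the Jacobian of f at x."""
    mask = (W1 @ x > 0).astype(float)     # ReLU derivative, entries in {0, 1}
    J = W2 @ (mask[:, None] * W1)         # Jacobian: W2 @ diag(mask) @ W1
    return np.linalg.norm(J, 2)

# Lower bound: the largest gradient norm seen over random sample points.
lower = max(grad_norm(rng.standard_normal(8)) for _ in range(1000))

# The true Lipschitz constant lies somewhere in [lower, upper].
print(f"lower bound: {lower:.3f}, upper bound: {upper:.3f}")
```

Since the Jacobian at any point factors as `W2 @ diag(mask) @ W1` with `diag(mask)` of spectral norm at most 1, the sampled lower bound can never exceed the product upper bound; the gap between the two is exactly what makes precise estimation hard.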


Related research:

- Regularisation of Neural Networks by Enforcing Lipschitz Continuity (04/12/2018)
  We investigate the effect of explicitly enforcing the Lipschitz continui...
- Lipschitz Continuity Retained Binary Neural Network (07/13/2022)
  Relying on the premise that the performance of a binary neural network c...
- Approximating Lipschitz continuous functions with GroupSort neural networks (06/09/2020)
  Recent advances in adversarial attacks and Wasserstein GANs have advocat...
- Certifying Incremental Quadratic Constraints for Neural Networks (12/10/2020)
  Abstracting neural networks with constraints they impose on their inputs...
- Lipschitz constant estimation of Neural Networks via sparse polynomial optimization (04/18/2020)
  We introduce LiPopt, a polynomial optimization framework for computing i...
- A Quantitative Geometric Approach to Neural Network Smoothness (03/02/2022)
  Fast and precise Lipschitz constant estimation of neural networks is an ...
- Off-Policy Interval Estimation with Lipschitz Value Iteration (10/29/2020)
  Off-policy evaluation provides an essential tool for evaluating the effe...
