Some Fundamental Aspects about Lipschitz Continuity of Neural Network Functions

02/21/2023
by   Grigory Khromov, et al.
Lipschitz continuity is a simple yet pivotal functional property of any predictive model, lying at the core of its robustness, generalisation, and adversarial vulnerability. Our aim is to thoroughly investigate and characterise the Lipschitz behaviour of the functions learned by neural networks. Despite the significant tightening of the bounds in recent years, precisely estimating the Lipschitz constant remains a practical challenge, and tight theoretical analyses, similarly, remain intractable. We therefore shift our perspective and instead attempt to uncover insights about the nature of the Lipschitz constant of neural network functions, relying on the simplest and most general upper and lower bounds. We carry out an empirical investigation across a range of settings (architectures, losses, optimisers, label noise, etc.), which reveals several fundamental and intriguing traits of the Lipschitz continuity of neural network functions. In particular, we identify a remarkable double descent trend in both the upper and lower bounds on the Lipschitz constant, which aligns tightly with the typical double descent trend in the test loss.
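As a rough illustration of what such simple bounds typically look like (a minimal PyTorch sketch under standard assumptions, not the authors' code): for a feed-forward network with 1-Lipschitz activations such as ReLU, the product of the layers' spectral norms upper-bounds the Lipschitz constant, while the norm of the input gradient of any output coordinate at any data point lower-bounds it.

import torch
import torch.nn as nn

def lipschitz_upper_bound(model: nn.Sequential) -> float:
    # Product of spectral norms of the linear layers: an upper bound
    # on the Lipschitz constant, assuming 1-Lipschitz activations.
    bound = 1.0
    for layer in model:
        if isinstance(layer, nn.Linear):
            bound *= torch.linalg.matrix_norm(layer.weight, ord=2).item()
    return bound

def lipschitz_lower_bound(model: nn.Module, inputs: torch.Tensor, out_idx: int = 0) -> float:
    # Largest input-gradient norm of one output coordinate over a batch:
    # a cheap lower bound, since ||J^T e_i|| <= ||J||_2 at every point
    # (assumes per-sample outputs depend only on their own inputs).
    inputs = inputs.clone().requires_grad_(True)
    out = model(inputs)[:, out_idx].sum()
    grad, = torch.autograd.grad(out, inputs)
    return grad.flatten(1).norm(dim=1).max().item()

For example, with model = nn.Sequential(nn.Linear(784, 256), nn.ReLU(), nn.Linear(256, 10)) and a batch x = torch.randn(128, 784), calling lipschitz_upper_bound(model) and lipschitz_lower_bound(model, x) brackets the (unknown) true Lipschitz constant from above and below; the paper tracks how such bounds evolve during training.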
