Approximation of Lipschitz Functions using Deep Spline Neural Networks

04/13/2022
by Sebastian Neumayer, et al.

Lipschitz-constrained neural networks have many applications in machine learning. Since designing and training expressive Lipschitz-constrained networks is very challenging, there is a need for improved methods and a better theoretical understanding. Unfortunately, it turns out that ReLU networks have provable disadvantages in this setting. Hence, we propose to use learnable spline activation functions with at least 3 linear regions instead. We prove that this choice is optimal among all component-wise 1-Lipschitz activation functions in the sense that no other weight-constrained architecture can approximate a larger class of functions. Additionally, this choice is at least as expressive as the recently introduced non-component-wise GroupSort activation function for spectral-norm-constrained weights. Previously published numerical results support our theoretical findings.
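
To make the constrained architecture concrete, here is a minimal sketch, assuming PyTorch, of a 1-Lipschitz network built from spectral-norm-constrained linear layers and a learnable piecewise-linear activation with three linear regions whose slopes are kept in (-1, 1); a GroupSort (MaxMin) activation is included as a drop-in alternative for comparison. The names Spline3, MaxMin, and lipschitz_mlp and the specific parametrization are illustrative assumptions, not the construction used in the paper, and spectral_norm enforces the weight constraint only approximately via power iteration.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class Spline3(nn.Module):
    """Learnable piecewise-linear activation with three linear regions.

    Illustrative parametrization (not the paper's): slopes are squashed into
    (-1, 1) with tanh, so the activation is 1-Lipschitz in every component;
    the two breakpoints t1 < t2 are learnable as well.
    """

    def __init__(self, num_channels: int):
        super().__init__()
        self.t1 = nn.Parameter(torch.full((num_channels,), -1.0))    # first breakpoint
        self.gap = nn.Parameter(torch.zeros(num_channels))           # t2 = t1 + softplus(gap)
        self.raw_slopes = nn.Parameter(torch.ones(num_channels, 3))  # roughly linear at init

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        t1 = self.t1
        t2 = t1 + F.softplus(self.gap)
        a, b, c = torch.tanh(self.raw_slopes).unbind(dim=-1)         # slopes in (-1, 1)
        # Continuous piecewise-linear map: slope a for x < t1, b on [t1, t2], c for x > t2.
        return (a * torch.minimum(x - t1, torch.zeros_like(x))
                + b * (torch.maximum(torch.minimum(x, t2), t1) - t1)
                + c * torch.maximum(x - t2, torch.zeros_like(x)))


class MaxMin(nn.Module):
    """GroupSort with group size 2: sorts consecutive channel pairs (also 1-Lipschitz).

    Requires an even number of channels.
    """

    def __init__(self, num_channels: int = 0):   # argument unused, kept for interchangeability
        super().__init__()

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        a, b = x.reshape(*x.shape[:-1], -1, 2).unbind(dim=-1)
        return torch.stack((torch.minimum(a, b), torch.maximum(a, b)), dim=-1).flatten(-2)


def lipschitz_mlp(widths, activation=Spline3):
    """1-Lipschitz MLP: weights with spectral norm (approximately) fixed to 1 by power
    iteration, interleaved with component-wise 1-Lipschitz activations."""
    layers = []
    for i, (w_in, w_out) in enumerate(zip(widths[:-1], widths[1:])):
        layers.append(nn.utils.parametrizations.spectral_norm(nn.Linear(w_in, w_out)))
        if i < len(widths) - 2:                   # no activation after the last layer
            layers.append(activation(w_out))
    return nn.Sequential(*layers)


if __name__ == "__main__":
    x = torch.randn(8, 2)
    spline_net = lipschitz_mlp([2, 64, 64, 1])                        # learnable spline activations
    groupsort_net = lipschitz_mlp([2, 64, 64, 1], activation=MaxMin)  # GroupSort baseline
    print(spline_net(x).shape, groupsort_net(x).shape)                # torch.Size([8, 1]) twice
```

Since every activation slope has absolute value at most 1 and every weight matrix has spectral norm at most 1, the composed network is 1-Lipschitz by submultiplicativity of Lipschitz constants.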


Related research

Improving Lipschitz-Constrained Neural Networks by Learning Activation Functions (10/28/2022)
Sorting out Lipschitz function approximation (11/13/2018)
On the Sample Complexity of Two-Layer Networks: Lipschitz vs. Element-Wise Lipschitz Activation (11/17/2022)
Approximating Lipschitz continuous functions with GroupSort neural networks (06/09/2020)
Universal Lipschitz Approximation in Bounded Depth Neural Networks (04/09/2019)
Designing Stable Neural Networks using Convex Analysis and ODEs (06/29/2023)
A comparison of deep networks with ReLU activation function and linear spline-type methods (04/06/2018)
