Towards a regularity theory for ReLU networks -- chain rule and global error estimates

05/13/2019
by Julius Berner, et al.

Although the classical derivative of a neural network with locally Lipschitz continuous activation function exists almost everywhere, the standard chain rule is in general not applicable. We will consider a way of introducing a derivative for neural networks that admits a chain rule and is both rigorous and easy to work with. In addition, we will present a method for converting approximation results on bounded domains into global (pointwise) estimates. This can be used to extend known neural network approximation theory to include the study of regularity properties. Of particular interest is the application to neural networks with the ReLU activation function, where it contributes to the understanding of the success of deep learning methods for high-dimensional partial differential equations.
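
To see why the standard chain rule can fail at kinks, consider the classic counterexample g(x) = relu(x) - relu(-x), which equals the identity function, so g'(0) = 1. Formally applying the chain rule with the common convention relu'(0) = 0 instead yields 0. The following minimal Python sketch (illustrative only, not taken from the paper; the helper names are hypothetical) makes this concrete:

```python
def relu(x):
    return max(x, 0.0)

def relu_prime(x):
    # Fix a value at the kink; autodiff frameworks commonly use relu'(0) = 0.
    return 1.0 if x > 0.0 else 0.0

def g(x):
    # g(x) = relu(x) - relu(-x) is the identity function, so g'(x) = 1 everywhere.
    return relu(x) - relu(-x)

def g_prime_formal(x):
    # Formal chain rule: g'(x) = relu'(x)*1 - relu'(-x)*(-1).
    return relu_prime(x) + relu_prime(-x)

print(g(0.0))               # 0.0 (g is the identity map)
print(g_prime_formal(0.0))  # 0.0, although the true derivative g'(0) is 1.0
```

Automatic differentiation propagates exactly this formal rule, so the value reported at a kink depends on the convention chosen for relu'(0). The paper's contribution is a derivative notion for neural networks under which a chain rule holds rigorously.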

Related research

02/21/2019
Error bounds for approximations with deep ReLU neural networks in W^s,p norms
We analyze approximation rates of deep ReLU neural networks for Sobolev-...

12/15/2020
Approximation of BV functions by neural networks: A regularity theory approach
In this paper we are concerned with the approximation of functions by si...

11/15/2021
Reachability analysis of neural networks using mixed monotonicity
This paper presents a new reachability analysis tool to compute an inter...

02/23/2023
Testing Stationarity Concepts for ReLU Networks: Hardness, Regularity, and Robust Algorithms
We study the computational problem of the stationarity test for the empi...

01/17/2020
Deep Neural Networks with Trainable Activations and Controlled Lipschitz Constant
We introduce a variational framework to learn the activation functions o...

10/28/2022
Improving Lipschitz-Constrained Neural Networks by Learning Activation Functions
Lipschitz-constrained neural networks have several advantages compared t...

06/13/2019
Neural Networks on Groups
Recent work on neural networks has shown that allowing them to build int...
