Deep Operator Network Approximation Rates for Lipschitz Operators

07/19/2023 · by Christoph Schwab, et al.

We establish universality and expression rate bounds for a class of neural Deep Operator Networks (DON) emulating Lipschitz (or HΓΆlder) continuous maps 𝒒:𝒳→𝒴 between (subsets of) separable Hilbert spaces 𝒳, 𝒴. The DON architecture considered uses linear encoders β„° and decoders π’Ÿ via (biorthogonal) Riesz bases of 𝒳, 𝒴, and an approximator network of an infinite-dimensional, parametric coordinate map that is Lipschitz continuous on the sequence space β„“^2(β„•). Unlike previous works ([Herrmann, Schwab and Zech: Neural and Spectral operator surrogates: construction and expression rate bounds, SAM Report, 2022], [Marcati and Schwab: Exponential Convergence of Deep Operator Networks for Elliptic Partial Differential Equations, SAM Report, 2022]), which required, for example, 𝒒 to be holomorphic, the present expression rate results require mere Lipschitz (or HΓΆlder) continuity of 𝒒. The key ingredient in the proof of the present expression rate bounds is the use of either super-expressive activations (e.g. [Yarotsky: Elementary superexpressive activations, Int. Conf. on ML, 2021], [Shen, Yang and Zhang: Neural network approximation: Three hidden layers are enough, Neural Networks, 2021], and the references there), which are inspired by the Kolmogorov superposition theorem, or nonstandard NN architectures with standard (ReLU) activations as recently proposed in [Zhang, Shen and Yang: Neural Network Architecture Beyond Width and Depth, Adv. in Neural Inf. Proc. Sys., 2022]. We illustrate the abstract results by approximation rate bounds for emulation of a) solution operators for parametric elliptic variational inequalities, and b) Lipschitz maps of Hilbert-Schmidt operators.
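
To make the encoder–approximator–decoder structure concrete, here is a minimal sketch (not the authors' implementation) of such a DON, assuming 𝒳 = 𝒴 = LΒ²(0,1) with a sine Riesz basis and with a plain ReLU MLP standing in for the super-expressive-activation coordinate-map network of the paper. The truncation dimensions d_in, d_out and all layer sizes are illustrative choices, not quantities from the paper.

```python
# Sketch of a DON: linear encoder E (basis analysis), neural approximator
# of the coordinate map on a truncation of l^2(N), linear decoder D (synthesis).
import torch
import torch.nn as nn


class SketchDON(nn.Module):
    def __init__(self, d_in=16, d_out=16, width=64, depth=3):
        super().__init__()
        layers = [nn.Linear(d_in, width), nn.ReLU()]
        for _ in range(depth - 1):
            layers += [nn.Linear(width, width), nn.ReLU()]
        layers.append(nn.Linear(width, d_out))
        # ReLU MLP approximating the (truncated) coordinate map; the paper
        # instead uses super-expressive activations or nonstandard architectures.
        self.approximator = nn.Sequential(*layers)
        self.d_in, self.d_out = d_in, d_out

    def _basis(self, d, grid):
        # First d functions of the orthonormal sine basis of L^2(0,1).
        k = torch.arange(1, d + 1, dtype=grid.dtype)
        return torch.sqrt(torch.tensor(2.0)) * torch.sin(
            torch.pi * k[:, None] * grid[None, :]
        )

    def encode(self, u, grid):
        # Encoder E: inner products of u with the basis, approximated
        # by trapezoidal quadrature on the sampling grid.
        basis = self._basis(self.d_in, grid)
        return torch.trapezoid(u[:, None, :] * basis[None, :, :], grid, dim=-1)

    def decode(self, c, grid):
        # Decoder D: synthesis of the output function from d_out coefficients.
        return c @ self._basis(self.d_out, grid)

    def forward(self, u, grid):
        return self.decode(self.approximator(self.encode(u, grid)), grid)


grid = torch.linspace(0.0, 1.0, 101)
u = torch.sin(torch.pi * grid).repeat(8, 1)  # batch of 8 input functions
model = SketchDON()
v = model(u, grid)                           # batch of 8 output functions
```

In this factorization π’Ÿ ∘ (approximator) ∘ β„°, the expression rate results of the paper concern the middle network: since β„° and π’Ÿ are fixed linear maps built from Riesz bases, Lipschitz (or HΓΆlder) continuity of 𝒒 transfers to the coordinate map on β„“Β²(β„•), which is what the network emulates.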
