Deep Operator Network Approximation Rates for Lipschitz Operators

07/19/2023 · by Christoph Schwab, et al.

We establish universality and expression rate bounds for a class of neural Deep Operator Networks (DON) emulating Lipschitz (or HΓΆlder) continuous maps 𝒒:𝒳→𝒴 between (subsets of) separable Hilbert spaces 𝒳, 𝒴. The DON architecture considered uses linear encoders β„° and decoders π’Ÿ via (biorthogonal) Riesz bases of 𝒳, 𝒴, and an approximator network for an infinite-dimensional, parametric coordinate map that is Lipschitz continuous on the sequence space β„“^2(β„•). Unlike previous works ([Herrmann, Schwab and Zech: Neural and Spectral operator surrogates: construction and expression rate bounds, SAM Report, 2022], [Marcati and Schwab: Exponential Convergence of Deep Operator Networks for Elliptic Partial Differential Equations, SAM Report, 2022]), which required, for example, 𝒒 to be holomorphic, the present expression rate results require mere Lipschitz (or HΓΆlder) continuity of 𝒒. Key in the proof of the present expression rate bounds is the use either of super-expressive activations (e.g. [Yarotsky: Elementary superexpressive activations, Int. Conf. on ML, 2021], [Shen, Yang and Zhang: Neural network approximation: Three hidden layers are enough, Neural Networks, 2021], and the references therein), which are inspired by the Kolmogorov superposition theorem, or of nonstandard NN architectures with standard (ReLU) activations, as recently proposed in [Zhang, Shen and Yang: Neural Network Architecture Beyond Width and Depth, Adv. in Neural Inf. Proc. Sys., 2022]. We illustrate the abstract results by approximation rate bounds for emulation of a) solution operators for parametric elliptic variational inequalities, and b) Lipschitz maps of Hilbert-Schmidt operators.
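In schematic form, the encoder-approximator-decoder structure described above can be read as the composition (our paraphrase; the symbol π’œ for the coordinate approximator network is not used in the abstract itself):

𝒒 β‰ˆ π’Ÿ ∘ π’œ ∘ β„°,  with β„° : 𝒳 β†’ β„“^2(β„•) (encoding into Riesz-basis coefficients), π’œ : β„“^2(β„•) β†’ β„“^2(β„•) (neural emulation of the Lipschitz coordinate map), and π’Ÿ : β„“^2(β„•) β†’ 𝒴 (decoding from coefficients).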
