Deep Operator Network Approximation Rates for Lipschitz Operators
We establish universality and expression rate bounds for a class of neural Deep Operator Networks (DON) emulating Lipschitz (or Hölder) continuous maps $\mathcal G:\mathcal X\to\mathcal Y$ between (subsets of) separable Hilbert spaces $\mathcal X$, $\mathcal Y$. The DON architecture considered uses linear encoders $\mathcal E$ and decoders $\mathcal D$ via (biorthogonal) Riesz bases of $\mathcal X$, $\mathcal Y$, and an approximator network of an infinite-dimensional, parametric coordinate map that is Lipschitz continuous on the sequence space $\ell^2(\mathbb N)$. Unlike previous works ([Herrmann, Schwab and Zech: Neural and Spectral Operator Surrogates: Construction and Expression Rate Bounds, SAM Report, 2022], [Marcati and Schwab: Exponential Convergence of Deep Operator Networks for Elliptic Partial Differential Equations, SAM Report, 2022]), which required, for example, $\mathcal G$ to be holomorphic, the present expression rate results require mere Lipschitz (or Hölder) continuity of $\mathcal G$. Key in the proof of the present expression rate bounds is the use either of super-expressive activations (e.g. [Yarotsky: Elementary Superexpressive Activations, Int. Conf. on Machine Learning, 2021], [Shen, Yang and Zhang: Neural Network Approximation: Three Hidden Layers Are Enough, Neural Networks, 2021], and the references there), which are inspired by the Kolmogorov superposition theorem, or of nonstandard NN architectures with standard (ReLU) activations, as recently proposed in [Zhang, Shen and Yang: Neural Network Architecture Beyond Width and Depth, Adv. in Neural Inf. Proc. Sys., 2022]. We illustrate the abstract results by approximation rate bounds for the emulation of a) solution operators of parametric elliptic variational inequalities and b) Lipschitz maps of Hilbert-Schmidt operators.
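The following display is a minimal schematic of this encoder-approximator-decoder structure, assuming a Riesz basis $(\psi_j)_{j\ge 1}$ of $\mathcal X$ with dual (biorthogonal) basis $(\tilde\psi_j)_{j\ge 1}$, a Riesz basis $(\phi_k)_{k\ge 1}$ of $\mathcal Y$, and illustrative truncation dimensions $n$, $m$ and network $\mathcal A$ that are not fixed by the abstract itself:
\[
\mathcal E(x) \;=\; \big(\langle x,\tilde\psi_j\rangle\big)_{j=1}^{n}, \qquad
\mathcal D(c) \;=\; \sum_{k=1}^{m} c_k\,\phi_k, \qquad
\mathcal G(x) \;\approx\; \big(\mathcal D\circ\mathcal A\circ\mathcal E\big)(x),
\]
where $\mathcal A:\mathbb R^n\to\mathbb R^m$ denotes the neural approximator of the (Lipschitz continuous) coordinate map on $\ell^2(\mathbb N)$.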