The curse of dimensionality in operator learning

06/28/2023
by Samuel Lanthaler et al.

Neural operator architectures employ neural networks to approximate operators mapping between Banach spaces of functions; they may be used to accelerate model evaluations via emulation, or to discover models from data. Consequently, the methodology has received increasing attention over recent years, giving rise to the rapidly growing field of operator learning. The first contribution of this paper is to prove that for general classes of operators which are characterized only by their C^r- or Lipschitz-regularity, operator learning suffers from a curse of dimensionality, defined precisely here in terms of representations of the infinite-dimensional input and output function spaces. The result is applicable to a wide variety of existing neural operators, including PCA-Net, DeepONet and the FNO. The second contribution of the paper is to prove that the general curse of dimensionality can be overcome for solution operators defined by the Hamilton-Jacobi equation; this is achieved by leveraging additional structure in the underlying solution operator, going beyond regularity. To this end, a novel neural operator architecture is introduced, termed HJ-Net, which explicitly takes into account characteristic information of the underlying Hamiltonian system. Error and complexity estimates are derived for HJ-Net which show that this architecture can provably beat the curse of dimensionality related to the infinite-dimensional input and output function spaces.
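The abstract states that HJ-Net exploits characteristic information of the underlying Hamiltonian system; the architectural details are not given here. As a hedged illustration only, the sketch below shows the classical method of characteristics for a Hamilton-Jacobi equation u_t + H(x, u_x) = 0 with the model Hamiltonian H(x, p) = p²/2 (an assumption chosen for simplicity; the paper treats general Hamiltonians, and HJ-Net presumably learns the characteristic flow map plus a reconstruction step rather than using this closed form).

```python
import numpy as np

# Characteristic ODEs for u_t + H(x, u_x) = 0:
#   x' = H_p,  p' = -H_x,  z' = p * H_p - H,   with z(t) = u(x(t), t).
# For the model Hamiltonian H(x, p) = p^2 / 2 they reduce to
#   x' = p,  p' = 0,  z' = p^2 / 2,
# so characteristics are straight lines and integrate in closed form.

def solve_characteristics(u0, du0, x0, t):
    """Propagate initial data u(.,0) = u0 along characteristics to time t.

    Returns scattered samples (x(t), u(x(t), t)); a full solver (or a
    learned architecture) would add a reconstruction/interpolation step
    to recover u(., t) on a fixed grid.
    """
    p0 = du0(x0)                      # initial momenta p = u_x(x0, 0)
    x_t = x0 + p0 * t                 # straight-line characteristics
    z_t = u0(x0) + 0.5 * p0**2 * t    # solution value carried along
    return x_t, z_t

# Sanity check: linear initial data u0(x) = a*x has the exact solution
# u(x, t) = a*x - a^2 * t / 2, which the characteristics reproduce.
a, t = 0.7, 1.3
x0 = np.linspace(-1.0, 1.0, 5)
x_t, z_t = solve_characteristics(lambda x: a * x, lambda x: a + 0 * x, x0, t)
assert np.allclose(z_t, a * x_t - 0.5 * a**2 * t)
```

The sketch also hints at why such structure helps: each characteristic is a finite-dimensional ODE, so the infinite-dimensional solution operator decomposes into a pointwise flow map plus a reconstruction step, rather than requiring a generic regularity-based approximation.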


