Are Neural Operators Really Neural Operators? Frame Theory Meets Operator Learning

05/31/2023
by Francesca Bartolucci et al.

Recently, there has been significant interest in operator learning, i.e., learning mappings between infinite-dimensional function spaces. This has been particularly relevant in the context of learning the solution operators of partial differential equations from data. However, it has been observed that proposed models may not behave as operators when implemented on a computer, calling into question the very essence of what operator learning should be. We contend that, in addition to defining the operator at the continuous level, some form of continuous-discrete equivalence is necessary for an architecture to genuinely learn the underlying operator, rather than just a discretization of it. To this end, we propose to employ frames, a concept from applied harmonic analysis and signal processing that gives rise to exact and stable discrete representations of continuous signals. Extending these concepts to operators, we introduce a unifying mathematical framework of Representation equivalent Neural Operators (ReNOs) to ensure that operations at the continuous and discrete levels are equivalent. The lack of such equivalence is quantified in terms of aliasing errors. We analyze various existing operator learning architectures to determine whether they fall within this framework, and highlight the implications when they fail to do so.
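
As a minimal, hypothetical illustration of the aliasing errors mentioned above (not code from the paper), the Python sketch below compares two routes through a pointwise nonlinearity: applying it to exact coarse samples of a bandlimited signal and then reconstructing a function from those samples, versus applying it to a finely resolved stand-in for the continuous signal. The helper names (`bandlimited_signal`, `trig_interpolate`, `aliasing_error`), the choice of tanh, and the grid sizes are illustrative assumptions of ours.

```python
import numpy as np


def bandlimited_signal(t, freqs=(1.0, 3.0, 5.0)):
    """A trigonometric polynomial: bandlimited, with highest frequency max(freqs) Hz."""
    return sum(np.sin(2 * np.pi * f * t) for f in freqs)


def trig_interpolate(samples, n_fine):
    """Trigonometric interpolation of uniform samples on [0, 1) onto a finer grid.

    This treats the samples as an exact discrete representation of a function
    bandlimited below their Nyquist frequency.
    """
    n = samples.size
    spectrum = np.fft.rfft(samples)
    padded = np.zeros(n_fine // 2 + 1, dtype=complex)
    padded[: spectrum.size] = spectrum
    if n % 2 == 0:
        padded[n // 2] *= 0.5  # split the Nyquist bin across its two conjugate positions
    return np.fft.irfft(padded, n=n_fine) * (n_fine / n)


def aliasing_error(nonlinearity, n_coarse, n_fine=4096):
    """Relative gap between 'apply on coarse samples, then reconstruct' and
    'apply to the (finely resolved) continuous function'."""
    t_coarse = np.linspace(0.0, 1.0, n_coarse, endpoint=False)
    t_fine = np.linspace(0.0, 1.0, n_fine, endpoint=False)

    # The input IS recoverable from its coarse samples (n_coarse > 2 * 5 here),
    # so its discrete representation is exact.
    u_coarse = bandlimited_signal(t_coarse)
    u_fine = bandlimited_signal(t_fine)

    # Discrete route: apply the pointwise operator to the samples and interpret
    # the result as a function at the same sampling rate.
    out_discrete = trig_interpolate(nonlinearity(u_coarse), n_fine)

    # Continuous route: apply the operator to the function itself.
    out_continuous = nonlinearity(u_fine)

    # tanh(u) is no longer bandlimited, so its coarse samples no longer
    # determine it; the mismatch below is the aliasing error.
    return np.linalg.norm(out_discrete - out_continuous) / np.linalg.norm(out_continuous)


if __name__ == "__main__":
    for n in (16, 32, 64, 256):
        print(f"coarse grid {n:4d} points: relative aliasing error {aliasing_error(np.tanh, n):.2e}")
```

Refining the coarse grid shrinks the gap, but for a non-polynomial activation such as tanh it does not vanish at any fixed resolution; this is, roughly, the kind of failure of continuous-discrete equivalence that the ReNO framework is meant to make explicit.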

Related research

02/12/2022 · MIONet: Learning multiple-input operators via tensor product
As an emerging paradigm in scientific machine learning, neural operators...

06/21/2023 · Corrector Operator to Enhance Accuracy and Reliability of Neural Operator Surrogates of Nonlinear Variational Boundary-Value Problems
This work focuses on developing methods for approximating the solution o...

06/13/2021 · Markov Neural Operators for Learning Chaotic Systems
Chaotic systems are notoriously challenging to predict because of their ...

06/28/2023 · The curse of dimensionality in operator learning
Neural operator architectures employ neural networks to approximate oper...

12/15/2015 · Increasing the Action Gap: New Operators for Reinforcement Learning
This paper introduces new optimality-preserving operators on Q-functions...

01/25/2019 · Flexible Operator Embeddings via Deep Learning
Integrating machine learning into the internals of database management s...

10/17/2022 · Signal Processing for Implicit Neural Representations
Implicit Neural Representations (INRs) encoding continuous multi-media d...
