
Neural Operator: Learning Maps Between Function Spaces

08/19/2021
by Nikola Kovachki, et al.
Purdue University
California Institute of Technology

The classical development of neural networks has primarily focused on learning mappings between finite-dimensional Euclidean spaces or finite sets. We propose a generalization of neural networks tailored to learn operators mapping between infinite-dimensional function spaces. We formulate the approximation of operators by composition of a class of linear integral operators and nonlinear activation functions, so that the composed operator can approximate complex nonlinear operators. We prove a universal approximation theorem for our construction. Furthermore, we introduce four classes of operator parameterizations: graph-based operators, low-rank operators, multipole graph-based operators, and Fourier operators, and describe efficient algorithms for computing with each one. The proposed neural operators are resolution-invariant: they share the same network parameters between different discretizations of the underlying function spaces and can be used for zero-shot super-resolution. Numerically, the proposed models show superior performance compared to existing machine-learning-based methodologies on Burgers' equation, Darcy flow, and the Navier-Stokes equation, while being several orders of magnitude faster than conventional PDE solvers.
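To make the layer structure concrete, the sketch below implements one neural-operator layer of the form v ↦ σ(Wv + Kv), where W is a pointwise linear map and K is a linear integral operator realized as a spectral (Fourier) convolution: transform, truncate to the lowest modes, multiply by learned complex weights, and transform back. This is a minimal illustration with NumPy and randomly initialized weights, not the authors' implementation; all function names, shapes, and the ReLU choice are assumptions for the sketch.

```python
import numpy as np

def fourier_layer(v, spectral_weights, k_max):
    """Spectral convolution: FFT -> keep the lowest k_max modes ->
    multiply each kept mode by a learned complex matrix -> inverse FFT.
    v: (n_points, channels) real; spectral_weights: (k_max, channels, channels) complex."""
    n = v.shape[0]
    v_hat = np.fft.rfft(v, axis=0)           # (n//2 + 1, channels), complex
    out_hat = np.zeros_like(v_hat)
    # Channel mixing happens mode-by-mode, only on the retained low frequencies.
    for k in range(min(k_max, v_hat.shape[0])):
        out_hat[k] = v_hat[k] @ spectral_weights[k]
    return np.fft.irfft(out_hat, n=n, axis=0)

def neural_operator_layer(v, W, spectral_weights, k_max):
    """One layer v -> sigma(W v + K v): a local pointwise linear part
    plus the nonlocal integral (spectral) part, then a nonlinearity."""
    local = v @ W                                         # pointwise linear map
    nonlocal_part = fourier_layer(v, spectral_weights, k_max)
    return np.maximum(local + nonlocal_part, 0.0)         # ReLU for simplicity

# Usage on a random discretization of an input function with 4 channels.
rng = np.random.default_rng(0)
n, c, k_max = 64, 4, 8
v = rng.standard_normal((n, c))
W = rng.standard_normal((c, c)) / np.sqrt(c)
sw = (rng.standard_normal((k_max, c, c))
      + 1j * rng.standard_normal((k_max, c, c))) / c
out = neural_operator_layer(v, W, sw, k_max)
print(out.shape)  # (64, 4)
```

Because the parameters W and `sw` act on channels and Fourier modes rather than on grid points, the same layer can be evaluated on a finer discretization of the input function without retraining, which is the resolution invariance the abstract describes.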

