
Neural Operator: Learning Maps Between Function Spaces

by Nikola Kovachki, et al.
Purdue University
California Institute of Technology

The classical development of neural networks has primarily focused on learning mappings between finite-dimensional Euclidean spaces or finite sets. We propose a generalization of neural networks tailored to learning operators that map between infinite-dimensional function spaces. We formulate the approximation of operators as the composition of a class of linear integral operators with nonlinear activation functions, so that the composed operator can approximate complex nonlinear operators. We prove a universal approximation theorem for our construction. Furthermore, we introduce four classes of operator parameterizations: graph-based operators, low-rank operators, multipole graph-based operators, and Fourier operators, and describe efficient algorithms for computing with each one. The proposed neural operators are resolution-invariant: they share the same network parameters across different discretizations of the underlying function spaces and can be used for zero-shot super-resolution. Numerically, the proposed models show superior performance compared to existing machine-learning-based methodologies on Burgers' equation, Darcy flow, and the Navier-Stokes equation, while being several orders of magnitude faster than conventional PDE solvers.
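The core construction can be sketched in a few lines. The following is a minimal, hypothetical illustration (not the authors' implementation) of a single Fourier-parameterized operator layer: a linear integral operator applied as a multiplication on the lowest Fourier modes, followed by a pointwise nonlinearity. The function and variable names are assumptions for illustration. Because the learned weights live on a fixed set of Fourier modes rather than on a grid, the same parameters apply at any discretization, which is the resolution-invariance property mentioned above.

```python
import numpy as np

def fourier_layer(v, weights, activation=np.tanh):
    """One illustrative neural-operator layer (hypothetical sketch):
    a linear integral operator realized as a multiplier on the k
    lowest Fourier modes, followed by a pointwise nonlinearity.

    v:       (n,) real samples of the input function on a uniform grid
    weights: (k,) complex multipliers for the k lowest Fourier modes
    """
    v_hat = np.fft.rfft(v)                  # transform to Fourier space
    k = len(weights)
    out_hat = np.zeros_like(v_hat)
    out_hat[:k] = weights * v_hat[:k]       # scale low modes, truncate the rest
    out = np.fft.irfft(out_hat, n=len(v))   # back to physical space
    return activation(out)

# Resolution invariance: the same k weights act on any grid size.
rng = np.random.default_rng(0)
w = rng.standard_normal(8) + 1j * rng.standard_normal(8)
x_coarse = np.sin(2 * np.pi * np.linspace(0, 1, 64, endpoint=False))
x_fine = np.sin(2 * np.pi * np.linspace(0, 1, 256, endpoint=False))
y_coarse = fourier_layer(x_coarse, w)
y_fine = fourier_layer(x_fine, w)
```

With NumPy's default FFT normalization, the forward transform scales with the grid size and the inverse transform divides it back out, so evaluating the layer on a fine grid and subsampling agrees with evaluating it directly on the coarse grid (`y_fine[::4]` matches `y_coarse`): the parameters are shared across discretizations, which is what enables zero-shot super-resolution.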



