
DeepONet: Learning nonlinear operators for identifying differential equations based on the universal approximation theorem of operators

10/08/2019 · by Lu Lu, et al.

While it is widely known that neural networks are universal approximators of continuous functions, a lesser-known and perhaps more powerful result is that a neural network with a single hidden layer can accurately approximate any nonlinear continuous operator <cit.>. This universal approximation theorem is suggestive of the potential application of neural networks in learning nonlinear operators from data. However, the theorem guarantees only a small approximation error for a sufficiently large network, and does not consider the important optimization and generalization errors. To realize this theorem in practice, we propose deep operator networks (DeepONets) to learn operators accurately and efficiently from a relatively small dataset. A DeepONet consists of two sub-networks, one for encoding the input function at a fixed number of sensors x_i, i=1,...,m (branch net), and another for encoding the locations for the output functions (trunk net). We perform systematic simulations for identifying two types of operators, i.e., dynamic systems and partial differential equations, and demonstrate that DeepONet significantly reduces the generalization error compared to fully-connected networks. We also derive theoretically the dependence of the approximation error on the number of sensors (where the input function is defined) as well as on the input function type, and we verify the theorem with computational results. More importantly, we observe high-order error convergence in our computational tests, namely polynomial rates (from half order to fourth order) and even exponential convergence with respect to the training dataset size.
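The branch/trunk structure described above can be sketched in a few lines: the branch net maps the input function's values at the m sensors to p latent coefficients, the trunk net maps a query location y to p basis values, and the operator output is their inner product, G(u)(y) ≈ Σ_k b_k(u(x_1),...,u(x_m))·t_k(y). The following is a minimal, untrained NumPy illustration of that forward pass, not the authors' implementation; the layer sizes and the example input function are arbitrary choices for demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

def mlp_params(sizes):
    # Random (untrained) weights for a small fully-connected net.
    return [(rng.standard_normal((m, n)) / np.sqrt(m), np.zeros(n))
            for m, n in zip(sizes[:-1], sizes[1:])]

def mlp(params, x):
    # Plain feed-forward pass with tanh on all but the last layer.
    for i, (W, b) in enumerate(params):
        x = x @ W + b
        if i < len(params) - 1:
            x = np.tanh(x)
    return x

m, p = 100, 20                     # number of sensors, latent width p (assumed values)
branch = mlp_params([m, 40, p])    # branch net: encodes u(x_1), ..., u(x_m)
trunk = mlp_params([1, 40, p])     # trunk net: encodes the query location y

def deeponet(u_sensors, y):
    # G(u)(y) ~= sum_k b_k(u(x_1),...,u(x_m)) * t_k(y)
    b = mlp(branch, u_sensors)          # shape (p,)
    t = mlp(trunk, np.atleast_2d(y))    # shape (1, p)
    return float(b @ t.ravel())

xs = np.linspace(0.0, 1.0, m)           # fixed sensor locations
u = np.sin(2 * np.pi * xs)              # an example input function sampled at the sensors
value = deeponet(u, 0.5)                # scalar prediction G(u)(0.5) from random weights
```

In practice both sub-networks would be trained jointly on (u, y, G(u)(y)) triples; the dot-product output layer is what ties the two encodings together.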
