Learning Operators with Mesh-Informed Neural Networks

03/22/2022
by Nicola Rares Franco, et al.

Thanks to their universal approximation properties and new efficient training strategies, Deep Neural Networks are becoming a valuable tool for the approximation of mathematical operators. In the present work, we introduce Mesh-Informed Neural Networks (MINNs), a class of architectures specifically tailored to handle mesh-based functional data, and thus of particular interest for reduced order modeling of parametrized Partial Differential Equations (PDEs). The driving idea behind MINNs is to embed hidden layers into discrete functional spaces of increasing complexity, obtained through a sequence of meshes defined over the underlying spatial domain. The approach leads to a natural pruning strategy that enables the design of sparse architectures able to learn general nonlinear operators. We assess this strategy through an extensive set of numerical experiments, ranging from nonlocal operators to nonlinear diffusion PDEs, in which MINNs are compared to classical fully connected Deep Neural Networks. Our results show that MINNs can handle functional data defined on general domains of any shape, while ensuring reduced training times, lower computational costs, and better generalization capabilities, which makes them well suited for demanding applications such as Reduced Order Modeling and Uncertainty Quantification for PDEs.
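The abstract does not spell out the implementation, but the core idea of pruning a dense layer according to the spatial proximity of mesh nodes can be illustrated with a minimal sketch. The PyTorch snippet below is an assumption-based illustration, not the authors' code: the class name MeshInformedLayer, the node coordinates, the support radius, the initialization, and the ReLU activation are all hypothetical choices made only for the example.

```python
# Minimal sketch of a mesh-informed sparse layer (illustrative assumptions only).
# It maps nodal values on one mesh to nodal values on another mesh, keeping
# only the weights connecting node pairs closer than a given support radius.
import torch
import torch.nn as nn

class MeshInformedLayer(nn.Module):
    def __init__(self, in_nodes, out_nodes, radius):
        """
        in_nodes:  (N_in, d) coordinates of the input-mesh nodes
        out_nodes: (N_out, d) coordinates of the output-mesh nodes
        radius:    support radius; weights between farther nodes are pruned
        """
        super().__init__()
        # Pairwise node distances define the sparsity mask of the layer
        dist = torch.cdist(out_nodes, in_nodes)              # (N_out, N_in)
        self.register_buffer("mask", (dist <= radius).float())
        self.weight = nn.Parameter(0.01 * torch.randn(out_nodes.shape[0], in_nodes.shape[0]))
        self.bias = nn.Parameter(torch.zeros(out_nodes.shape[0]))
        self.activation = nn.ReLU()

    def forward(self, x):
        # x: (batch, N_in) nodal values; masked weights stay exactly zero
        w = self.weight * self.mask
        return self.activation(x @ w.t() + self.bias)

if __name__ == "__main__":
    # Stacking such layers over a sequence of meshes of varying resolution
    # gives a sparse architecture in the spirit described by the abstract.
    fine   = torch.rand(400, 2)   # assumed fine mesh over the unit square
    coarse = torch.rand(100, 2)   # assumed coarser mesh
    layer = MeshInformedLayer(fine, coarse, radius=0.2)
    u = torch.rand(8, 400)        # batch of discretized input functions
    print(layer(u).shape)         # torch.Size([8, 100])
```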

Related research

08/02/2022
Approximate Bayesian Neural Operators: Uncertainty Quantification for Parametric PDEs
Neural operators are a type of deep architecture that learns to solve (i...

06/07/2020
Bayesian Hidden Physics Models: Uncertainty Quantification for Discovery of Nonlinear Partial Differential Operators from Data
What do data tell us about physics, and what don't they tell us? There ha...

09/14/2023
Nonlinear model order reduction for problems with microstructure using mesh informed neural networks
Many applications in computational physics involve approximating problem...

05/01/2023
Predictions Based on Pixel Data: Insights from PDEs and Finite Differences
Neural networks are the state-of-the-art for many approximation tasks in...

03/10/2021
A Deep Learning approach to Reduced Order Modelling of Parameter Dependent Partial Differential Equations
Within the framework of parameter dependent PDEs, we develop a construct...

12/13/2022
Reliable extrapolation of deep neural operators informed by physics or sparse observations
Deep neural operators can learn nonlinear mappings between infinite-dime...

07/28/2020
Depth separation for reduced deep networks in nonlinear model reduction: Distilling shock waves in nonlinear hyperbolic problems
Classical reduced models are low-rank approximations using a fixed basis...
