MFPC-Net: Multi-fidelity Physics-Constrained Neural Process

10/03/2020
by   Yating Wang, et al.

In this work, we propose a network that uses computationally cheap low-fidelity data together with limited high-fidelity data to train surrogate models, where the multi-fidelity data are generated from multiple underlying models. The network takes as input a context set of input-output pairs, where the inputs are the physical observation points together with the low-fidelity solution at those points, and the outputs are the high-fidelity solution at the same points. It uses a neural process to learn a distribution over functions conditioned on the context set and provides the mean and standard deviation at a set of target points. Moreover, the proposed framework takes into account the available physical laws that govern the data and imposes them as constraints in the loss function. The multi-fidelity physics-constrained network (MFPC-Net) (1) trains on datasets obtained from multiple models at the same time, (2) takes advantage of available physical information, (3) learns a stochastic process that can encode prior beliefs about the correlation between the two fidelities from a few observations, and (4) produces predictions with uncertainty. The ability to represent a class of functions is ensured by the properties of the neural process and is achieved through global latent variables in the network. Physical constraints are added to the loss using Lagrange multipliers, and an algorithm is proposed to optimize the resulting loss function and effectively train the network parameters on an ad hoc basis. Once trained, the network yields fast evaluations over the entire domain of interest given a few observation points from a new low- and high-fidelity model pair. In particular, with a simple modification of the network one can further identify unknown parameters, such as the permeability field in elliptic PDEs. Several numerical examples for both forward and inverse problems demonstrate the performance of the proposed method.
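
To make the setup concrete, below is a minimal sketch in PyTorch of the ingredients described in the abstract: a conditional neural-process-style encoder/decoder that maps a context set of (observation point, low-fidelity solution, high-fidelity solution) triples to a predictive mean and standard deviation at target points, with the physics constraint entering the loss as a weighted residual term. The class `MFPCNetSketch`, the layer sizes, the deterministic mean-pooling aggregation (used here in place of the paper's global latent variable), and the fixed weight `lam` standing in for a Lagrange multiplier are illustrative assumptions, not the published architecture.

```python
# Minimal sketch (not the authors' code) of a multi-fidelity, physics-constrained
# neural-process-style surrogate. All names and sizes are illustrative.
import torch
import torch.nn as nn
import torch.nn.functional as F


class MFPCNetSketch(nn.Module):
    def __init__(self, x_dim=1, hidden=64, latent=32):
        super().__init__()
        # Encoder: each context triple (x, u_low(x), u_high(x)) -> representation.
        self.encoder = nn.Sequential(
            nn.Linear(x_dim + 2, hidden), nn.ReLU(),
            nn.Linear(hidden, latent),
        )
        # Decoder: (global representation, target x, target u_low) -> (mean, raw std).
        self.decoder = nn.Sequential(
            nn.Linear(latent + x_dim + 1, hidden), nn.ReLU(),
            nn.Linear(hidden, 2),
        )

    def forward(self, x_ctx, u_low_ctx, u_high_ctx, x_tgt, u_low_tgt):
        # Aggregate the context set into one global representation; mean pooling
        # keeps the model permutation-invariant in the context points.
        ctx = torch.cat([x_ctx, u_low_ctx, u_high_ctx], dim=-1)
        r = self.encoder(ctx).mean(dim=0, keepdim=True)
        r = r.expand(x_tgt.shape[0], -1)

        out = self.decoder(torch.cat([r, x_tgt, u_low_tgt], dim=-1))
        mean, raw_std = out[:, :1], out[:, 1:]
        std = 0.01 + F.softplus(raw_std)  # keep the predicted std positive
        return mean, std


def loss_fn(mean, std, u_high_tgt, physics_residual, lam=1.0):
    # Negative log-likelihood of the high-fidelity targets under the predicted
    # Gaussian, plus a weighted physics-constraint penalty (a fixed weight here,
    # standing in for the Lagrange-multiplier term described in the abstract).
    nll = -torch.distributions.Normal(mean, std).log_prob(u_high_tgt).mean()
    return nll + lam * physics_residual


if __name__ == "__main__":
    # Toy usage with random data (shapes only, no real PDE residual).
    net = MFPCNetSketch()
    xc, ulc, uhc = torch.rand(10, 1), torch.rand(10, 1), torch.rand(10, 1)
    xt, ult, uht = torch.rand(50, 1), torch.rand(50, 1), torch.rand(50, 1)
    mean, std = net(xc, ulc, uhc, xt, ult)
    loss = loss_fn(mean, std, uht, physics_residual=torch.tensor(0.0))
    loss.backward()
    print(float(loss))
```

In the paper's setting, `physics_residual` would be computed from the governing equations evaluated at the target points, and the inverse-problem variant would add an output head for the unknown parameter (e.g., the permeability field); both are omitted from this sketch.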

