Multi-fidelity Bayesian Neural Networks: Algorithms and Applications

12/31/2020
by Xuhui Meng, et al.

We propose a new class of Bayesian neural networks (BNNs) that can be trained with noisy data of variable fidelity, and we apply them to learn function approximations and to solve inverse problems based on partial differential equations (PDEs). These multi-fidelity BNNs consist of three neural networks: the first is a fully connected neural network, trained with the maximum a posteriori probability (MAP) method to fit the low-fidelity data; the second is a Bayesian neural network that captures, with uncertainty quantification, the cross-correlation between the low- and high-fidelity data; and the third is a physics-informed neural network that encodes the physical laws described by the PDEs. To train the last two networks, we first employ mean-field variational inference (VI), maximizing the evidence lower bound (ELBO) to obtain informative prior distributions for the hyperparameters of the BNNs, and we subsequently use the Hamiltonian Monte Carlo method to accurately estimate the posterior distributions of those hyperparameters. We demonstrate the accuracy of the method on synthetic data as well as real measurements: we first approximate one- and four-dimensional functions, then infer the reaction rates in one- and two-dimensional diffusion-reaction systems, and finally infer the sea surface temperature (SST) in the Massachusetts and Cape Cod Bays from satellite images and in-situ measurements. Taken together, our results demonstrate that the method adaptively captures both linear and nonlinear correlations between the low- and high-fidelity data, identifies unknown parameters in PDEs, and quantifies uncertainties in its predictions, given only a few scattered noisy high-fidelity data points.
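The cross-correlation idea at the heart of the method can be illustrated on a standard two-fidelity benchmark pair, where the low-fidelity function is a shifted and scaled version of the high-fidelity one. The abstract's approach learns the map (x, y_L) → y_H with a Bayesian neural network; the minimal sketch below, an illustrative assumption rather than the paper's method, replaces that BNN with ordinary least squares on the features (1, x, y_L) to show how the composition structure recovers a linear correlation exactly from only a few high-fidelity samples.

```python
import numpy as np

def y_high(x):
    # High-fidelity target (a standard multi-fidelity benchmark function).
    return (6 * x - 2) ** 2 * np.sin(12 * x - 4)

def y_low(x):
    # Cheap low-fidelity approximation: shifted and scaled y_high.
    return 0.5 * y_high(x) + 10 * (x - 0.5) - 5

# A few scattered high-fidelity samples; y_low is assumed cheap everywhere.
x_hi = np.array([0.0, 0.4, 0.6, 1.0])

# Learn the cross-correlation y_H ~ F(x, y_L); here plain linear least
# squares on (1, x, y_L) stands in for the paper's Bayesian neural network.
A = np.column_stack([np.ones_like(x_hi), x_hi, y_low(x_hi)])
coef, *_ = np.linalg.lstsq(A, y_high(x_hi), rcond=None)

# Predict high-fidelity values everywhere from low-fidelity evaluations.
x_test = np.linspace(0.0, 1.0, 101)
y_pred = coef[0] + coef[1] * x_test + coef[2] * y_low(x_test)
print(np.max(np.abs(y_pred - y_high(x_test))))  # ~0: the correlation is exactly linear
```

Because y_L = 0.5 y_H + 10(x - 0.5) - 5 here, the exact inverse map is y_H = 2 y_L - 20x + 20, and the least-squares fit recovers those coefficients; for a nonlinear cross-correlation the linear model would fail, which is precisely where the paper's BNN stage is needed.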
Finally, we demonstrate that an active learning approach can effectively and efficiently reduce the uncertainties, and hence enhance the prediction accuracy, using a one-dimensional function approximation and an inverse PDE problem as examples.
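The active-learning loop can be sketched with an ensemble standing in for the BNN posterior: fit several models on perturbed copies of the high-fidelity data, use the spread of their predictions as the uncertainty estimate, and acquire the next high-fidelity sample where that spread is largest. The noise-perturbed cubic-polynomial ensemble below is a crude stand-in chosen for brevity, not the paper's VI+HMC machinery.

```python
import numpy as np

rng = np.random.default_rng(0)

def f(x):
    # "Expensive" high-fidelity function, queried one point at a time.
    return (6 * x - 2) ** 2 * np.sin(12 * x - 4)

x_train = np.array([0.0, 0.25, 0.5, 0.75, 1.0])
y_train = f(x_train)
x_grid = np.linspace(0.0, 1.0, 201)

def predictive_std(x_tr, y_tr, n_models=30, degree=3, noise=1.0):
    # Crude surrogate for posterior samples: refit a cubic on
    # noise-perturbed copies of the data and measure the spread.
    preds = [np.polyval(np.polyfit(x_tr, y_tr + rng.normal(0.0, noise, len(y_tr)), degree), x_grid)
             for _ in range(n_models)]
    return np.std(preds, axis=0)

for step in range(3):  # acquire three new high-fidelity points
    std = predictive_std(x_train, y_train)
    x_next = x_grid[np.argmax(std)]       # most-uncertain location
    x_train = np.append(x_train, x_next)  # query the expensive model there
    y_train = np.append(y_train, f(x_next))
    print(f"step {step}: acquired x = {x_next:.3f}, max std = {std.max():.3f}")
```

With a true BNN posterior the acquisition rule is the same: evaluate the predictive standard deviation over the domain and add a high-fidelity measurement at its maximizer, which is what drives the uncertainty reduction reported in the abstract.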


Related research:
- Multi-Output Physics-Informed Neural Networks for Forward and Inverse PDE Problems with Uncertainties (02/03/2022)
- B-PINNs: Bayesian Physics-Informed Neural Networks for Forward and Inverse PDE Problems with Noisy Data (03/13/2020)
- Transfer Learning on Multi-Fidelity Data (04/29/2021)
- MFPC-Net: Multi-fidelity Physics-Constrained Neural Process (10/03/2020)
- Physics-enhanced deep surrogates for PDEs (11/10/2021)
- Learning and Meta-Learning of Stochastic Advection-Diffusion-Reaction Systems from Sparse Measurements (10/21/2019)
- Goal-Oriented A-Posteriori Estimation of Model Error as an Aid to Parameter Estimation (05/30/2022)
