Reliable amortized variational inference with physics-based latent distribution correction

by Ali Siahkoohi et al.

Bayesian inference for high-dimensional inverse problems is challenged by the computational cost of the forward operator and the selection of an appropriate prior distribution. Amortized variational inference addresses these challenges by training a neural network to approximate the posterior distribution over existing pairs of models and data. When fed previously unseen data and normally distributed latent samples as input, the pretrained deep neural network (in our case a conditional normalizing flow) provides posterior samples at virtually no cost. However, the accuracy of this approach relies on the availability of high-fidelity training data, which seldom exists in geophysical inverse problems due to the heterogeneous structure of the Earth. In addition, accurate amortized variational inference requires the observed data to be drawn from the training data distribution. As such, we propose to increase the resilience of amortized variational inference in the presence of data distribution shift via a physics-based correction to the latent distribution of the conditional normalizing flow. To accomplish this, instead of a standard Gaussian latent distribution, we parameterize the latent distribution as a Gaussian with an unknown mean and diagonal covariance. These unknown quantities are then estimated by minimizing the Kullback-Leibler divergence between the corrected and true posterior distributions. While generic and applicable to other inverse problems, we show by means of a seismic imaging example that our correction step improves the robustness of amortized variational inference with respect to changes in the number of source experiments, noise variance, and shifts in the prior distribution. This approach provides a seismic image with limited artifacts and an assessment of its uncertainty at approximately the same cost as five reverse-time migrations.
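The latent correction described above can be illustrated in a heavily simplified setting. The sketch below assumes the normalizing flow is the identity map, the forward operator A is linear, and the prior on the latent variable is standard normal, so the KL objective (expected negative log posterior minus the entropy of the corrected Gaussian) has analytic gradients in the unknown mean and diagonal standard deviations. The function name `latent_correction` and all parameter choices are illustrative, not the authors' implementation, which operates on a pretrained conditional normalizing flow.

```python
import numpy as np

def latent_correction(A, d, noise_var, steps=2000, lr=0.05):
    """Estimate the mean mu and diagonal std s of a corrected latent
    distribution N(mu, diag(s^2)) by gradient descent on the KL divergence
    (up to a constant) to the true posterior of d = A x + noise, with a
    standard-normal prior on x and an identity flow.  Illustrative sketch."""
    m, n = A.shape
    mu = np.zeros(n)
    s = np.ones(n)
    col_norm2 = (A ** 2).sum(axis=0)  # ||a_i||^2 for each column of A
    for _ in range(steps):
        r = A @ mu - d
        # Gradients of E[||A z - d||^2 / (2 noise_var) + ||z||^2 / 2]
        # minus the entropy term sum(log s), for z ~ N(mu, diag(s^2)).
        g_mu = A.T @ r / noise_var + mu
        g_s = s * col_norm2 / noise_var + s - 1.0 / s
        mu -= lr * g_mu
        s -= lr * g_s
    return mu, s
```

For a diagonal forward operator the true posterior is Gaussian with precision `A.T @ A / noise_var + I`, so the recovered `mu` and `s` can be checked against the closed form; in the paper's setting the same KL minimization instead corrects the input distribution of a pretrained flow, shifting and rescaling latent samples so the pushed-forward distribution better matches the posterior for out-of-distribution data.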




