Approximation and sampling of multivariate probability distributions in the tensor train decomposition

10/02/2018
by Sergey Dolgov, et al.

General multivariate distributions are notoriously expensive to sample from, particularly the high-dimensional posterior distributions in PDE-constrained inverse problems. This paper develops a sampler for arbitrary continuous multivariate distributions that is based on low-rank surrogates in the tensor-train (TT) format. We construct a tensor-train approximation to the target probability density function using cross interpolation, which requires a small number of function evaluations. For sufficiently smooth distributions the storage required for the TT approximation is moderate, scaling linearly with dimension. The structure of the tensor-train surrogate allows efficient sampling by the conditional distribution method. Unbiased estimates may be calculated by correcting the transformed random seeds using a Metropolis–Hastings accept/reject step. Moreover, one can use a more efficient quasi-Monte Carlo quadrature that may be corrected either by a control-variate strategy, or by importance weighting. We prove that the error in the tensor-train approximation propagates linearly into the Metropolis–Hastings rejection rate and the integrated autocorrelation time of the resulting Markov chain. These methods are demonstrated in three computed examples: fitting the failure time of shock absorbers; a PDE-constrained inverse diffusion problem; and sampling from the Rosenbrock distribution. The delayed rejection adaptive Metropolis (DRAM) algorithm is used as a benchmark. We find that the importance-weight corrected quasi-Monte Carlo quadrature performs best in all computed examples, and is orders of magnitude more efficient than DRAM across a wide range of approximation accuracies and sample sizes. Indeed, all the methods developed here significantly outperform DRAM in all computed examples.
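The conditional distribution method described in the abstract can be illustrated with a minimal sketch (not the authors' code): a 2D density stored factor-by-factor on a grid, i.e. a rank-1 "tensor train" pi(x1, x2) ≈ g1(x1) · g2(x2), sampled by inverting one discrete conditional CDF per coordinate. All names here are illustrative, not from the TT-IRT package.

```python
import numpy as np

# Hedged sketch: conditional-distribution (inverse Rosenblatt) sampling
# from a rank-1 gridded density pi(x1, x2) ~= g1(x1) * g2(x2).
rng = np.random.default_rng(0)
n = 200
grid = np.linspace(-4.0, 4.0, n)

g1 = np.exp(-0.5 * grid**2)          # first TT core: N(0, 1) factor
g2 = np.exp(-0.5 * (grid / 0.5)**2)  # second TT core: N(0, 0.25) factor

def inverse_cdf(pdf_vals, u):
    """Map a uniform seed u to a sample by inverting the discrete CDF."""
    cdf = np.cumsum(pdf_vals)
    return np.interp(u, cdf / cdf[-1], grid)

m = 2000
samples = np.empty((m, 2))
for k in range(m):
    u1, u2 = rng.random(2)
    # Marginal of x1: integrate x2 out (for rank 1, just a constant factor).
    x1 = inverse_cdf(g1 * g2.sum(), u1)
    # Conditional of x2 given x1: for rank 1 it does not depend on x1.
    x2 = inverse_cdf(g2, u2)
    samples[k] = (x1, x2)

print(samples.mean(axis=0))  # both components near 0
print(samples.std(axis=0))   # near (1.0, 0.5)
```

In the paper's general setting the cores are matrices rather than vectors and each conditional density depends on the previously drawn coordinates, but the per-coordinate inverse-CDF structure is the same, which is what makes the method cheap once the TT surrogate is built.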



Code Repositories

TT-IRT

Inverse Rosenblatt Transform (Conditional Distribution) + MCMC sampling using Tensor Train approximation
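The "+ MCMC sampling" part of the repository corresponds to the accept/reject correction mentioned in the abstract. A hedged one-dimensional sketch (illustrative names, not the TT-IRT API): samples from an approximate surrogate density q are debiased toward the exact target pi with an independence-proposal Metropolis–Hastings step, whose rejection rate shrinks as q approaches pi.

```python
import numpy as np

# Hedged sketch: Metropolis-Hastings correction of draws from a
# slightly-wrong surrogate density (independence sampler).
rng = np.random.default_rng(1)

def target_pdf(x):
    return np.exp(-0.5 * x**2)          # exact (unnormalised) target: N(0, 1)

def surrogate_pdf(x):
    return np.exp(-0.5 * (x / 1.2)**2)  # surrogate q with the wrong scale

def surrogate_sample():
    return 1.2 * rng.standard_normal()  # exact draw from q

steps = 5000
chain = np.empty(steps)
current = surrogate_sample()
for k in range(steps):
    prop = surrogate_sample()
    # Independence-sampler ratio: pi(y) q(x) / (pi(x) q(y)).
    alpha = (target_pdf(prop) * surrogate_pdf(current)) / (
        target_pdf(current) * surrogate_pdf(prop))
    if rng.random() < alpha:
        current = prop
    chain[k] = current

print(chain.mean(), chain.std())  # approximately standard normal
```

This mirrors the paper's error analysis: the closer the surrogate is to the target, the closer alpha stays to 1, so both the rejection rate and the chain's autocorrelation remain small.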
