
Low-rank tensor reconstruction of concentrated densities with application to Bayesian inversion
Transport maps have become a popular mechanism for expressing complicated probability densities via sample propagation through an optimized pushforward. Despite their broad applicability and well-known success, transport maps suffer from several drawbacks, such as numerical inaccuracies induced by the optimization process and the need for sampling schemes when quantities of interest, e.g. moments, are to be computed. This paper presents a novel method for the accurate functional approximation of probability density functions (PDFs) that copes with these issues. By interpreting the pullback of a target PDF through an inexact transport map as a perturbed reference density, a subsequent functional representation in a more accessible format allows for efficient and more accurate computation of the desired quantities. We introduce a layer-based approximation of the perturbed reference density in an appropriate coordinate system to split the high-dimensional representation problem into a set of independent approximations, for each of which separately chosen orthonormal basis functions are available. This effectively motivates the notion of h- and p-refinement (i.e. "mesh size" and polynomial degree) for the approximation of high-dimensional PDFs. To circumvent the curse of dimensionality and enable sampling-free access to certain quantities of interest, a low-rank reconstruction in the tensor train format is employed via the Variational Monte Carlo method. An a priori convergence analysis of the developed approach is derived in terms of the Hellinger distance and the Kullback-Leibler divergence. Applications comprising Bayesian inverse problems and several degrees of concentrated densities illustrate the (superior) convergence in comparison to Monte Carlo and Markov chain Monte Carlo methods.
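The tensor train (TT) format referenced in the abstract factorizes a d-dimensional array into a chain of three-dimensional cores, so that storage and pointwise evaluation scale linearly in d rather than exponentially. As a minimal illustrative sketch (not the authors' implementation; the core shapes and the separable example density below are assumptions made purely for illustration), pointwise evaluation of a TT-represented density reduces to a chain of small matrix products:

```python
import numpy as np

def tt_eval(cores, idx):
    """Evaluate a tensor-train decomposition at a single multi-index.

    cores: list of d arrays, where core k has shape (r_{k-1}, n_k, r_k)
           with boundary ranks r_0 = r_d = 1.
    idx:   tuple of d integer grid indices, one per dimension.
    """
    v = np.ones(1)
    for core, i in zip(cores, idx):
        v = v @ core[:, i, :]  # contract away the current rank index
    return v.item()

# Rank-1 example: a separable discrete density p(i, j) = p1[i] * p2[j],
# which admits an exact TT representation with all ranks equal to 1.
p1 = np.array([0.1, 0.2, 0.3, 0.4])
p2 = np.array([0.4, 0.3, 0.2, 0.1])
cores = [p1.reshape(1, 4, 1), p2.reshape(1, 4, 1)]
value = tt_eval(cores, (1, 2))  # p1[1] * p2[2] = 0.2 * 0.2 = 0.04
```

For non-separable densities, higher TT ranks are needed, and the paper's point is that the perturbed reference density obtained via the pullback is close enough to a product measure that low ranks suffice.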