Low-rank tensor reconstruction of concentrated densities with application to Bayesian inversion
Transport maps have become a popular mechanism for expressing complicated probability densities via sample propagation through an optimized push-forward. Despite their broad applicability and well-documented success, transport maps suffer from several drawbacks, such as numerical inaccuracies induced by the optimization process and the need for sampling schemes whenever quantities of interest, e.g. moments, are to be computed. This paper presents a novel method for the accurate functional approximation of probability density functions (PDFs) that addresses these issues. By interpreting the pull-back of a target PDF through an inexact transport map as a perturbed reference density, a subsequent functional representation in a more accessible format allows for efficient and more accurate computation of the desired quantities. We introduce a layer-based approximation of the perturbed reference density in an appropriate coordinate system, which splits the high-dimensional representation problem into a set of independent approximations for which separately chosen orthonormal basis functions are available. This effectively motivates the notion of h- and p-refinement (i.e., "mesh size" and polynomial degree) for the approximation of high-dimensional PDFs. To circumvent the curse of dimensionality and enable sampling-free access to certain quantities of interest, a low-rank reconstruction in the tensor train format is employed via the Variational Monte Carlo method. An a priori convergence analysis of the proposed approach is derived in terms of the Hellinger distance and the Kullback-Leibler divergence. Applications involving Bayesian inverse problems and densities with several degrees of concentration illustrate the (superior) convergence in comparison with Monte Carlo and Markov chain Monte Carlo methods.
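To illustrate the core idea in miniature, the following sketch (not the authors' implementation) pulls a hypothetical 2D target density back through a deliberately inexact linear transport map, yielding a perturbed reference density, and then fits a rank-1 separable surrogate in a Hermite polynomial basis by alternating Monte Carlo least squares. The target density, the diagonal map, the polynomial degree, and the rank-1 ansatz are all illustrative assumptions; the paper itself works with tensor-train representations and a Variational Monte Carlo reconstruction in high dimensions.

```python
# Minimal sketch (illustrative assumptions, not the paper's method):
# pull-back of a target density through an inexact transport map, followed by
# a low-rank (here rank-1) functional fit from Monte Carlo samples.
import numpy as np
from numpy.polynomial.hermite_e import hermevander

rng = np.random.default_rng(0)
d, deg, n_samples = 2, 6, 5000

# Hypothetical target: a correlated 2D Gaussian density.
Sigma = np.array([[1.0, 0.6], [0.6, 0.8]])
Sigma_inv = np.linalg.inv(Sigma)
det_Sigma = np.linalg.det(Sigma)
def target_pdf(y):
    quad = np.einsum('ni,ij,nj->n', y, Sigma_inv, y)
    return np.exp(-0.5 * quad) / (2 * np.pi * np.sqrt(det_Sigma))

# Inexact transport map T: a diagonal scaling that only roughly matches the
# target's spread, so the push-forward of the standard normal reference is
# only an approximation of the target.
scale = np.array([1.1, 0.9])
def pullback_pdf(x):
    # (T^# pi)(x) = pi(T(x)) * |det dT(x)|; for a linear diagonal map the
    # Jacobian determinant is the product of the scales.
    return target_pdf(x * scale) * np.prod(scale)

# Monte Carlo samples from the standard normal reference.
X = rng.standard_normal((n_samples, d))
y = pullback_pdf(X)          # values of the perturbed reference density

# Rank-1 separable ansatz f(x1, x2) ~ g1(x1) * g2(x2), each factor expanded in
# probabilists' Hermite polynomials; fit by alternating least squares.
V1 = hermevander(X[:, 0], deg)   # (n_samples, deg+1) basis evaluations
V2 = hermevander(X[:, 1], deg)
c1 = rng.standard_normal(deg + 1)
c2 = rng.standard_normal(deg + 1)
for _ in range(20):
    g2 = V2 @ c2
    c1, *_ = np.linalg.lstsq(V1 * g2[:, None], y, rcond=None)
    g1 = V1 @ c1
    c2, *_ = np.linalg.lstsq(V2 * g1[:, None], y, rcond=None)

approx = (V1 @ c1) * (V2 @ c2)
rel_err = np.linalg.norm(approx - y) / np.linalg.norm(y)
print(f"relative Monte Carlo fit error of rank-1 surrogate: {rel_err:.3e}")
```

A separable surrogate like this admits sampling-free evaluation of moments as products of one-dimensional integrals, which is the kind of direct access to quantities of interest that the functional (tensor-train) representation is meant to provide.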