Rank Bounds for Approximating Gaussian Densities in the Tensor-Train Format

by Paul B. Rohrbach et al.

Low-rank tensor approximations have been employed successfully, for example, to build surrogate models that speed up large-scale inference problems in high dimensions. The success of this approach depends critically on the rank needed to represent or approximate the underlying distribution. In this paper, we develop a priori rank bounds for approximations in the functional Tensor-Train representation for the case of a Gaussian (normally distributed) model. We show that, under suitable conditions on the precision matrix, the Gaussian density can be represented to high accuracy without an exponential growth of complexity as the dimension increases. Our results provide evidence of both the suitability and the limitations of low-rank tensor methods in a simple but important model case. Numerical experiments confirm that the rank bounds capture the qualitative behavior of the rank structure as the parameters of the precision matrix and the accuracy of the approximation are varied.
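The connection between the precision matrix and the Tensor-Train ranks can be illustrated numerically. The sketch below (not the paper's method, just a small NumPy experiment under assumed grid sizes and tolerances) discretizes a Gaussian density exp(-x^T A x / 2) on a tensor-product grid and computes its TT ranks with a standard TT-SVD sweep: for a diagonal precision matrix the density factorizes and all ranks are 1, while off-diagonal coupling raises the ranks.

```python
import numpy as np

def gaussian_tensor(A, grid):
    """Evaluate exp(-0.5 * x^T A x) on a tensor-product grid."""
    d = A.shape[0]
    X = np.meshgrid(*([grid] * d), indexing="ij")
    Q = np.zeros(X[0].shape)
    for i in range(d):
        for j in range(d):
            Q = Q + A[i, j] * X[i] * X[j]
    return np.exp(-0.5 * Q)

def tt_ranks(T, tol=1e-10):
    """TT-SVD sweep: successive truncated SVDs of the unfoldings,
    discarding singular values below tol relative to the largest.
    Returns the TT ranks [r_0, ..., r_d] with r_0 = r_d = 1."""
    d, n = T.ndim, T.shape
    ranks = [1]
    C = T.reshape(n[0], -1)
    for k in range(d - 1):
        U, s, Vt = np.linalg.svd(C, full_matrices=False)
        r = max(1, int(np.sum(s > tol * s[0])))
        ranks.append(r)
        C = s[:r, None] * Vt[:r]           # carry the remainder forward
        if k < d - 2:
            C = C.reshape(r * n[k + 1], -1)
    ranks.append(1)
    return ranks

d = 4
grid = np.linspace(-3.0, 3.0, 8)

# Diagonal precision: the density is a product of 1D Gaussians,
# so every TT rank is 1.
ranks_diag = tt_ranks(gaussian_tensor(np.eye(d), grid))

# Tridiagonal coupling in the precision matrix: the density no
# longer factorizes, and the TT ranks increase.
A_coupled = np.eye(d) + 0.4 * (np.eye(d, k=1) + np.eye(d, k=-1))
ranks_coupled = tt_ranks(gaussian_tensor(A_coupled, grid))

print(ranks_diag, ranks_coupled)
```

Tightening `tol` (i.e., demanding a more accurate approximation) or strengthening the off-diagonal coupling increases the reported ranks, mirroring the qualitative dependence on the precision matrix and the target accuracy discussed in the abstract.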



Efficient randomized tensor-based algorithms for function approximation and low-rank kernel interactions

In this paper, we introduce a method for multivariate function approxima...

Continuous dictionaries meet low-rank tensor approximations

In this short paper we bridge two seemingly unrelated sparse approximati...

Computing f-Divergences and Distances of High-Dimensional Probability Density Functions – Low-Rank Tensor Approximations

Very often, in the course of uncertainty quantification tasks or data an...

On the numerical rank of radial basis function kernels in high dimension

Low-rank approximations are popular methods to reduce the high computati...

Approximate Cross-Validation with Low-Rank Data in High Dimensions

Many recent advances in machine learning are driven by a challenging tri...

Low-rank tensor reconstruction of concentrated densities with application to Bayesian inversion

Transport maps have become a popular mechanic to express complicated pro...

Randomized algorithms for low-rank tensor decompositions in the Tucker format

Many applications in data science and scientific computing involve large...