Sparse approximation of triangular transports on bounded domains

06/12/2020
by Jakob Zech, et al.

Let ρ and π be two probability measures on [-1,1]^d with positive and analytic Lebesgue densities. We investigate the approximation of the unique triangular monotone (Knothe-Rosenblatt) transport T:[-1,1]^d→[-1,1]^d such that the pushforward T_♯ρ equals π. It is shown that for d∈ℕ there exist approximations T̃ of T based on either sparse polynomial expansions or ReLU networks, such that the distance between T̃_♯ρ and π decreases exponentially. More precisely, we show error bounds of the type exp(-β N^{1/d}) (or exp(-β N^{1/(d+1)}) for neural networks), where N refers to the dimension of the ansatz space (or the size of the network) containing T̃; the notion of distance comprises, among others, the Hellinger distance and the Kullback–Leibler divergence. The construction guarantees T̃ to be a monotone triangular bijective transport on the hypercube [-1,1]^d. Analogous results hold for the inverse transport S=T^{-1}. The proofs are constructive, and we give an explicit a priori description of the ansatz space, which can be used for numerical implementations. Additionally, we discuss the high-dimensional case: for d=∞ a dimension-independent algebraic convergence rate is proved for a class of probability measures occurring widely in Bayesian inference for uncertainty quantification, thus verifying that the curse of dimensionality can be overcome in this setting.
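To make the pushforward condition T_♯ρ = π concrete, here is a minimal sketch of the one-dimensional case of the Knothe-Rosenblatt construction, where the monotone transport has the closed form T = F_π^{-1} ∘ F_ρ (compose the reference CDF with the target inverse CDF). The specific densities below (ρ uniform on [-1,1], π(x) ∝ exp(x) on [-1,1]) are illustrative choices, not taken from the paper; they merely satisfy its assumptions of positive, analytic densities on the cube.

```python
import numpy as np

# Illustrative 1-d Knothe-Rosenblatt transport (assumed example, not from
# the paper): reference rho = Uniform[-1,1], target density pi(x) ∝ exp(x)
# on [-1,1]. Both densities are positive and analytic on [-1,1].

def F_rho(x):
    """CDF of the uniform reference measure on [-1, 1]."""
    return (x + 1.0) / 2.0

def F_pi_inv(u):
    """Inverse CDF of pi(x) ∝ exp(x) on [-1, 1] (closed form)."""
    a, b = np.exp(-1.0), np.exp(1.0)
    return np.log(a + u * (b - a))

def T(x):
    """Monotone transport pushing rho forward to pi: T = F_pi^{-1} o F_rho."""
    return F_pi_inv(F_rho(x))

# T is increasing and maps [-1,1] bijectively onto [-1,1].
assert np.isclose(T(-1.0), -1.0) and np.isclose(T(1.0), 1.0)

# Monte-Carlo check of the pushforward T_#rho = pi: the sample mean of
# T(X) for X ~ rho should approach the mean of pi, which is 2/(e^2 - 1).
rng = np.random.default_rng(0)
x = rng.uniform(-1.0, 1.0, 200_000)
print(np.mean(T(x)))
```

In d dimensions the KR map applies this construction coordinate by coordinate to the conditional densities, which yields the triangular, monotone structure the paper's sparse polynomial and ReLU approximations exploit.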
