Sparse approximation of triangular transports on bounded domains

06/12/2020
by Jakob Zech, et al.

Let ρ and π be two probability measures on [-1,1]^d with positive and analytic Lebesgue densities. We investigate the approximation of the unique triangular monotone (Knothe-Rosenblatt) transport T:[-1,1]^d→[-1,1]^d such that the pushforward T_♯ρ equals π. It is shown that for d∈ℕ there exist approximations T̃ of T, based on either sparse polynomial expansions or ReLU networks, such that the distance between T̃_♯ρ and π decreases exponentially. More precisely, we show error bounds of the type exp(-β N^{1/d}) (or exp(-β N^{1/(d+1)}) for neural networks), where N refers to the dimension of the ansatz space (or the size of the network) containing T̃; the notion of distance comprises, among others, the Hellinger distance and the Kullback–Leibler divergence. The construction guarantees T̃ to be a monotone triangular bijective transport on the hypercube [-1,1]^d. Analogous results hold for the inverse transport S=T^{-1}. The proofs are constructive, and we give an explicit a priori description of the ansatz space, which can be used for numerical implementations. Additionally, we discuss the high-dimensional case: for d=∞ a dimension-independent algebraic convergence rate is proved for a class of probability measures occurring widely in Bayesian inference for uncertainty quantification, thus verifying that the curse of dimensionality can be overcome in this setting.
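For readers unfamiliar with the notation, the following LaTeX display is a minimal sketch restating the objects named in the abstract; the generic constant C > 0 and the symbol "dist" are added here for readability and are not part of the abstract's statement.

  % Triangular (Knothe-Rosenblatt) structure of T on [-1,1]^d:
  \[
    T(x_1,\dots,x_d) \;=\; \bigl(T_1(x_1),\, T_2(x_1,x_2),\, \dots,\, T_d(x_1,\dots,x_d)\bigr),
  \]
  % where each component T_k(x_1,\dots,x_{k-1},\cdot) is strictly increasing on [-1,1].
  %
  % Error bounds of the type stated in the abstract (C, beta > 0 generic constants):
  \[
    \mathrm{dist}\bigl(\tilde T_\sharp \rho,\, \pi\bigr) \;\le\; C \exp\bigl(-\beta N^{1/d}\bigr)
    \quad \text{(sparse polynomial expansions)},
  \]
  \[
    \mathrm{dist}\bigl(\tilde T_\sharp \rho,\, \pi\bigr) \;\le\; C \exp\bigl(-\beta N^{1/(d+1)}\bigr)
    \quad \text{(ReLU networks)},
  \]
  % with N the dimension of the ansatz space (respectively, the network size) and
  % dist e.g. the Hellinger distance or the Kullback--Leibler divergence.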


