
Low-rank Wasserstein polynomial chaos expansions in the framework of optimal transport

by Robert Gruhlke, et al.

An unsupervised learning approach for the computation of an explicit functional representation of a random vector Y is presented, which relies only on a finite set of samples with unknown distribution. Motivated by recent advances in computational optimal transport for estimating Wasserstein distances, we develop a new Wasserstein multi-element polynomial chaos expansion (WPCE). It relies on the minimization of a regularized empirical Wasserstein metric known as the debiased Sinkhorn divergence. As a requirement for an efficient polynomial basis expansion, a suitable (minimal) stochastic coordinate system X has to be determined with the aim of identifying ideally independent random variables. This approach generalizes representations through diffeomorphic transport maps to the case of non-continuous and non-injective model classes ℳ with different input and output dimensions, yielding the relation Y=ℳ(X) in distribution. Moreover, since the resulting PCE grows exponentially in the number of random coordinates of X, we introduce an appropriate low-rank format given as stacks of tensor trains, which alleviates the curse of dimensionality and leads to only linear dependence on the input dimension. The choice of the model class ℳ and a smooth loss function makes higher-order optimization schemes possible. It is shown that the relaxation to a discontinuous model class is necessary to represent multimodal distributions. Finally, the proposed framework is applied to a numerical upscaling task for a computationally challenging microscopic random non-periodic composite material, leading to a tractable effective macroscopic random field in adapted stochastic coordinates.
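To make the loss concrete: the debiased Sinkhorn divergence between two empirical samples is the entropically regularized transport cost with the two self-transport terms subtracted, so that the divergence of a sample against itself vanishes. The following is a minimal NumPy sketch of this quantity (not the authors' implementation); the cost is squared Euclidean distance, and `eps` and the iteration count are illustrative choices. The iterations are run in the log domain to avoid underflow for small `eps`.

```python
import numpy as np

def _logsumexp(z, axis):
    """Numerically stable log-sum-exp along the given axis."""
    m = np.max(z, axis=axis, keepdims=True)
    return np.squeeze(m, axis) + np.log(np.sum(np.exp(z - m), axis=axis))

def sinkhorn_cost(x, y, eps=1.0, n_iter=200):
    """Entropic OT transport cost between uniform empirical measures on x and y."""
    C = np.sum((x[:, None, :] - y[None, :, :]) ** 2, axis=-1)  # squared Euclidean cost
    log_a = np.full(len(x), -np.log(len(x)))  # uniform sample weights, log scale
    log_b = np.full(len(y), -np.log(len(y)))
    f, g = np.zeros(len(x)), np.zeros(len(y))  # dual potentials
    for _ in range(n_iter):
        # log-domain Sinkhorn updates
        f = -eps * _logsumexp((g[None, :] - C) / eps + log_b[None, :], axis=1)
        g = -eps * _logsumexp((f[:, None] - C) / eps + log_a[:, None], axis=0)
    # optimal entropic coupling and its transport cost
    P = np.exp((f[:, None] + g[None, :] - C) / eps + log_a[:, None] + log_b[None, :])
    return np.sum(P * C)

def sinkhorn_divergence(x, y, eps=1.0):
    """Debiased Sinkhorn divergence: subtracting the self-terms gives S(x, x) = 0."""
    return (sinkhorn_cost(x, y, eps)
            - 0.5 * sinkhorn_cost(x, x, eps)
            - 0.5 * sinkhorn_cost(y, y, eps))
```

Because the loss is a smooth function of the sample positions, it can be differentiated through, which is what makes gradient-based (and higher-order) fitting of the model class ℳ feasible.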
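To illustrate why a tensor-train (TT) representation of the PCE coefficient tensor scales only linearly in the number of stochastic coordinates, here is a generic TT sketch (not the paper's stacked-TT construction): a d-dimensional tensor is stored as d three-way cores, and any entry is recovered by a chain of small matrix products.

```python
import numpy as np

def tt_random(dims, ranks, seed=0):
    """Random TT cores; core k has shape (r_{k-1}, n_k, r_k) with r_0 = r_d = 1."""
    rng = np.random.default_rng(seed)
    rs = [1] + list(ranks) + [1]
    return [rng.normal(size=(rs[k], n, rs[k + 1])) for k, n in enumerate(dims)]

def tt_eval(cores, idx):
    """Evaluate one tensor entry by contracting the matrix slices core[:, i, :]."""
    v = np.ones((1,))
    for core, i in zip(cores, idx):
        v = v @ core[:, i, :]
    return float(v[0])

def tt_num_params(cores):
    """Storage of the TT format: sum of core sizes, linear in the dimension d."""
    return sum(c.size for c in cores)
```

For example, with d = 10 coordinates, 5 basis polynomials per coordinate, and all TT ranks equal to 3, the TT format stores 390 parameters, whereas the full coefficient tensor would have 5^10 ≈ 9.8 million entries.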



