Estimating 2-Sinkhorn Divergence between Gaussian Processes from Finite-Dimensional Marginals

02/05/2021 βˆ™ by Anton Mallasto, et al. βˆ™ 0 βˆ™

Optimal Transport (OT) has emerged as an important computational tool in machine learning and computer vision, providing a geometrical framework for studying probability measures. Unfortunately, OT suffers from the curse of dimensionality and requires regularization for practical computation. Entropic regularization is a popular choice, and it can be 'unbiased', resulting in a Sinkhorn divergence. In this work, we study the convergence of estimating the 2-Sinkhorn divergence between Gaussian processes (GPs) using their finite-dimensional marginal distributions. We show almost sure convergence of the divergence when the marginals are sampled according to some base measure. Furthermore, we show that using n marginals, the estimation error of the divergence scales in a dimension-free way as 𝒪(ε^(-1) n^(-1/2)), where ε is the magnitude of the entropic regularization.
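To make the debiased construction concrete, the sketch below (an illustrative assumption, not the paper's implementation) computes the entropic OT cost between empirical measures via plain Sinkhorn iterations and debiases it into a Sinkhorn divergence, S_ε(μ, ν) = OT_ε(μ, ν) − ½ OT_ε(μ, μ) − ½ OT_ε(ν, ν), here applied to samples from two Gaussians; for small ε the result approximates the squared 2-Wasserstein distance.

```python
import numpy as np

def sinkhorn_cost(X, Y, eps, n_iter=500):
    """Entropic OT transport cost between uniform empirical measures on X and Y,
    with squared Euclidean ground cost (the '2' in 2-Sinkhorn)."""
    C = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)  # pairwise squared distances
    K = np.exp(-C / eps)                                # Gibbs kernel
    a = np.full(len(X), 1.0 / len(X))
    b = np.full(len(Y), 1.0 / len(Y))
    u, v = np.ones_like(a), np.ones_like(b)
    for _ in range(n_iter):                             # Sinkhorn fixed-point updates
        u = a / (K @ v)
        v = b / (K.T @ u)
    P = u[:, None] * K * v[None, :]                     # optimal entropic plan
    return (P * C).sum()

def sinkhorn_divergence(X, Y, eps):
    """Debiased 2-Sinkhorn divergence between the empirical measures on X and Y."""
    return sinkhorn_cost(X, Y, eps) - 0.5 * (
        sinkhorn_cost(X, X, eps) + sinkhorn_cost(Y, Y, eps)
    )

# Illustrative usage: two Gaussians in R^2 whose means differ by (2, 2),
# so the squared 2-Wasserstein distance is ||(2,2)||^2 = 8.
rng = np.random.default_rng(0)
X = rng.normal(0.0, 1.0, size=(200, 2))
Y = rng.normal(2.0, 1.0, size=(200, 2))
print(sinkhorn_divergence(X, Y, eps=0.5))  # close to 8, up to sampling error
```

Without the two self-cost terms, the raw entropic cost OT_ε(μ, ν) is biased (it does not vanish at μ = ν); the debiasing is what makes S_ε a divergence, and the ε^(-1) factor in the paper's rate reflects that smaller regularization makes the estimate harder.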





