
Estimating 2-Sinkhorn Divergence between Gaussian Processes from Finite-Dimensional Marginals

by   Anton Mallasto, et al.

Optimal transport (OT) has emerged as an important computational tool in machine learning and computer vision, providing a geometric framework for studying probability measures. Unfortunately, OT suffers from the curse of dimensionality and requires regularization for practical computation; entropic regularization is a popular choice, and it can be 'unbiased' (debiased), yielding the Sinkhorn divergence. In this work, we study the convergence of estimates of the 2-Sinkhorn divergence between Gaussian processes (GPs) computed from their finite-dimensional marginal distributions. We show almost sure convergence of the divergence when the marginals are sampled according to some base measure. Furthermore, we show that using n marginals the estimation error of the divergence scales in a dimension-free way as 𝒪(Ξ΅^{-1} n^{-1/2}), where Ξ΅ is the magnitude of the entropic regularization.
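The debiased construction mentioned in the abstract can be illustrated concretely. The sketch below (an illustrative NumPy implementation, not the paper's code; function names and parameter values are my own) estimates the entropic OT cost between empirical samples of two finite-dimensional Gaussian marginals via log-domain Sinkhorn iterations, then forms the Sinkhorn divergence S_Ξ΅(ΞΌ, Ξ½) = OT_Ξ΅(ΞΌ, Ξ½) βˆ’ Β½ OT_Ξ΅(ΞΌ, ΞΌ) βˆ’ Β½ OT_Ξ΅(Ξ½, Ξ½):

```python
import numpy as np
from scipy.special import logsumexp

def entropic_ot(X, Y, eps=1.0, n_iter=300):
    """Entropic OT cost OT_eps between uniform empirical measures on the
    point clouds X (n x d) and Y (m x d), squared Euclidean ground cost,
    computed with log-domain Sinkhorn iterations on the dual potentials."""
    C = np.sum((X[:, None, :] - Y[None, :, :]) ** 2, axis=-1)  # cost matrix
    n, m = C.shape
    loga = np.full(n, -np.log(n))  # log of uniform source weights
    logb = np.full(m, -np.log(m))  # log of uniform target weights
    f = np.zeros(n)
    g = np.zeros(m)
    for _ in range(n_iter):
        f = -eps * logsumexp((g[None, :] - C) / eps + logb[None, :], axis=1)
        g = -eps * logsumexp((f[:, None] - C) / eps + loga[:, None], axis=0)
    # at convergence, OT_eps equals the dual objective <f, a> + <g, b>
    return f.mean() + g.mean()

def sinkhorn_divergence(X, Y, eps=1.0):
    """Debiased ('unbiased') Sinkhorn divergence between the empirical measures."""
    return (entropic_ot(X, Y, eps)
            - 0.5 * entropic_ot(X, X, eps)
            - 0.5 * entropic_ot(Y, Y, eps))

# samples from two 5-dimensional Gaussians, stand-ins for GP marginals
rng = np.random.default_rng(0)
X = rng.normal(0.0, 1.0, size=(200, 5))
Y = rng.normal(2.0, 1.0, size=(200, 5))
print(sinkhorn_divergence(X, Y, eps=1.0))  # positive for distinct measures
print(sinkhorn_divergence(X, X, eps=1.0))  # debiasing makes self-divergence 0
```

Note the role of the debiasing terms: without them, OT_Ξ΅(ΞΌ, ΞΌ) > 0 in general, so the raw entropic cost would not vanish when the two measures coincide. For Gaussian marginals the paper's setting admits sharper analysis, but the empirical estimator above works for arbitrary samples.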

∙ 06/01/2023

Domain Selection for Gaussian Process Data: An application to electrocardiogram signals

Gaussian Processes and the Kullback-Leibler divergence have been deeply ...
∙ 09/26/2014

Generalized Twin Gaussian Processes using Sharma-Mittal Divergence

There has been a growing interest in mutual information measures due to ...
∙ 05/29/2021

Optimal Transport with f-divergence Regularization and Generalized Sinkhorn Algorithm

Entropic regularization provides a generalization of the original optima...
∙ 03/12/2020

Statistical and Topological Properties of Sliced Probability Divergences

The idea of slicing divergences has been proven to be successful when co...
∙ 10/19/2020

On the Difficulty of Unbiased Alpha Divergence Minimization

Several approximate inference algorithms have been proposed to minimize ...
∙ 11/04/2021

Rate of Convergence of Polynomial Networks to Gaussian Processes

We examine one-hidden-layer neural networks with random weights. It is w...