Low-Rank Sinkhorn Factorization

03/08/2021
by Meyer Scetbon et al.

Several recent applications of optimal transport (OT) theory to machine learning have relied on regularization, notably entropy and the Sinkhorn algorithm. Because matrix-vector products are pervasive in the Sinkhorn algorithm, several works have proposed to approximate the kernel matrices appearing in its iterations using low-rank factors. Another route lies instead in imposing low-rank constraints on the feasible set of couplings considered in OT problems, with no approximation of cost or kernel matrices. This route was first explored by Forrow et al., 2018, who proposed an algorithm tailored for the squared Euclidean ground cost, using a proxy objective that can be solved through the machinery of regularized 2-Wasserstein barycenters. Building on this, we introduce in this work a generic approach that aims at solving, in full generality, the OT problem under low-rank constraints with arbitrary costs. Our algorithm relies on an explicit factorization of low-rank couplings as a product of sub-coupling factors linked by a common marginal; similar to an NMF approach, we alternately update these factors. We prove the non-asymptotic stationary convergence of this algorithm and illustrate its efficiency on benchmark experiments.
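
To make the factorization idea concrete, here is a minimal numerical sketch, not the paper's actual algorithm: a rank-r coupling is written as P = Q diag(1/g) R^T, where Q and R are sub-couplings between each input marginal and a shared inner marginal g, and the two factors are refit alternately, each via a standard entropic Sinkhorn solve on a small effective cost. The helper names (sinkhorn, low_rank_ot), the fixed uniform g, the random initialization, and the epsilon choice are illustrative assumptions; the paper's method jointly optimizes all factors (including g) and comes with convergence guarantees.

```python
# Rough sketch of an NMF-like alternating scheme over a low-rank coupling
# P = Q @ diag(1/g) @ R.T, with Q in Pi(a, g) and R in Pi(b, g).
# This is a hypothetical block-coordinate heuristic, not the authors' algorithm.
import numpy as np


def sinkhorn(cost, p, q, eps=0.05, n_iter=300):
    """Entropic OT plan between marginals p and q; cost rescaled for stability."""
    K = np.exp(-cost / (eps * cost.max()))
    u = np.ones_like(p)
    for _ in range(n_iter):
        v = q / (K.T @ u)
        u = p / (K @ v)
    return u[:, None] * K * v[None, :]


def low_rank_ot(C, a, b, r, n_outer=20, eps=0.05, seed=0):
    """Alternately refit the sub-coupling factors Q in Pi(a, g), R in Pi(b, g)."""
    g = np.full(r, 1.0 / r)                            # shared inner marginal, kept fixed here
    rng = np.random.default_rng(seed)
    R = sinkhorn(rng.random((b.size, r)), b, g, eps)   # random feasible start in Pi(b, g)
    for _ in range(n_outer):
        # For fixed (R, g), <C, Q diag(1/g) R^T> is linear in Q, so Q can be
        # refit as an entropic OT plan for the effective cost C @ R @ diag(1/g).
        Q = sinkhorn(C @ (R / g[None, :]), a, g, eps)
        # Symmetric update for R with effective cost C^T @ Q @ diag(1/g).
        R = sinkhorn(C.T @ (Q / g[None, :]), b, g, eps)
    return (Q / g[None, :]) @ R.T, Q, R, g             # assembled rank-r coupling P


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    x, y = rng.normal(size=(30, 2)), rng.normal(size=(40, 2))
    C = ((x[:, None, :] - y[None, :, :]) ** 2).sum(-1)    # squared Euclidean cost
    a, b = np.full(30, 1 / 30), np.full(40, 1 / 40)
    P, *_ = low_rank_ot(C, a, b, r=5)
    print("transport cost:", (C * P).sum())
    print("marginal errors:", abs(P.sum(1) - a).max(), abs(P.sum(0) - b).max())
```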

Related research

02/15/2022 · Low-rank tensor approximations for solving multi-marginal optimal transport problems
By adding entropic regularization, multi-marginal optimal transport prob...

06/02/2021 · Linear-Time Gromov Wasserstein Distances using Low Rank Couplings and Costs
The ability to compare and align related datasets living in heterogeneou...

06/12/2020 · Linear Time Sinkhorn Divergences using Positive Features
Although Sinkhorn divergences are now routinely used in data sciences to...

07/03/2023 · Butterfly factorization by algorithmic identification of rank-one blocks
Many matrices associated with fast transforms possess a certain low-rank ...

04/27/2020 · Hierarchical Low-Rank Approximation of Regularized Wasserstein distance
Sinkhorn divergence is a measure of dissimilarity between two probabilit...

09/17/2022 · Improved Generalization Bound and Learning of Sparsity Patterns for Data-Driven Low-Rank Approximation
Learning sketching matrices for fast and accurate low-rank approximation...

05/31/2023 · Unbalanced Low-rank Optimal Transport Solvers
The relevance of optimal transport methods to machine learning has long ...
