Dimensionality Reduction for Wasserstein Barycenter

10/18/2021
by Zachary Izzo et al.

The Wasserstein barycenter is a geometric construct that captures the notion of centrality among probability distributions and has found many applications in machine learning. However, most algorithms for finding even an approximate barycenter suffer from an exponential dependence on the dimension d of the underlying space of the distributions. To cope with this "curse of dimensionality," we study dimensionality reduction techniques for the Wasserstein barycenter problem. When the barycenter is restricted to a support of size n, we show that randomized dimensionality reduction can be used to map the problem to a space of dimension O(log n), independent of both d and the number k of input distributions, and that any solution found in the reduced dimension will have its cost preserved up to arbitrarily small error in the original space. We provide matching upper and lower bounds on the size of the reduced dimension, showing that our methods are optimal up to constant factors. We also provide a coreset construction for the Wasserstein barycenter problem that significantly decreases the number of input distributions. The coresets can be used in conjunction with random projections, further improving computation time. Lastly, our experimental results validate the speedup provided by dimensionality reduction while maintaining solution quality.
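The mapping to O(log n) dimensions described in the abstract rests on Johnson-Lindenstrauss-style random projections, which preserve pairwise squared distances between support points (and hence transport costs) up to small multiplicative error. Below is a minimal NumPy sketch of that mechanism only — it is illustrative, not the paper's algorithm, and the dimensions, point count, and seed are arbitrary choices:

```python
import numpy as np

rng = np.random.default_rng(0)
d, m = 1000, 100   # original dimension, reduced dimension (order log n, with constants)
n = 20             # number of support points
X = rng.normal(size=(n, d))  # support points in the original space R^d

# JL-style projection: scaled Gaussian matrix, so E[||Gx||^2] = ||x||^2
G = rng.normal(size=(d, m)) / np.sqrt(m)
Y = X @ G  # projected support points in R^m

def pdist2(Z):
    """All pairwise squared Euclidean distances between rows of Z."""
    sq = (Z ** 2).sum(axis=1)
    return sq[:, None] + sq[None, :] - 2 * Z @ Z.T

D_orig = pdist2(X)
D_proj = pdist2(Y)

# Off-diagonal distance ratios concentrate around 1
mask = ~np.eye(n, dtype=bool)
ratio = D_proj[mask] / D_orig[mask]
print(ratio.min(), ratio.max())  # both close to 1
```

Because the squared distances between all support points are (approximately) preserved, any transport plan's cost, and hence the barycenter objective, changes by at most a small multiplicative factor after projection, which is why a barycenter computed in the reduced space remains near-optimal in the original one.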


