Estimating Barycenters of Measures in High Dimensions

07/14/2020
by Samuel Cohen, et al.

Barycentric averaging is a principled way of summarizing populations of measures. Existing algorithms for estimating barycenters typically parametrize them as weighted sums of Dirac masses and optimize the weights and/or locations of those masses. However, these approaches do not scale to high-dimensional settings because of the curse of dimensionality. In this paper, we propose a scalable and general algorithm for estimating barycenters of measures in high dimensions. The key idea is to turn the optimization over measures into an optimization over generative models, introducing inductive biases that allow the method to scale while still estimating barycenters accurately. We prove local convergence under mild assumptions on the discrepancy, showing that the approach is well-posed. We demonstrate that our method is fast, achieves good performance on low-dimensional problems, and scales to high-dimensional settings. In particular, our approach is the first to estimate barycenters in thousands of dimensions.

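To make the key idea concrete, here is a minimal sketch, assuming a PyTorch-style setup; it is an illustration, not the authors' released code. The barycenter is parametrized as the pushforward of Gaussian noise through a generator network, and the generator is trained to minimize the average discrepancy between its samples and samples from each input measure. The energy distance used below, the network architecture, and the helper names (`energy_distance`, `estimate_barycenter`) are illustrative assumptions standing in for the discrepancies discussed in the paper.

```python
import torch
import torch.nn as nn

def energy_distance(x, y):
    # Energy distance between two empirical samples (rows are points);
    # used here as a simple stand-in for an OT-based discrepancy.
    dxy = torch.cdist(x, y).mean()
    dxx = torch.cdist(x, x).mean()
    dyy = torch.cdist(y, y).mean()
    return 2.0 * dxy - dxx - dyy

def estimate_barycenter(samplers, dim, latent_dim=64, steps=2000, batch=256):
    # samplers: list of callables, each returning a (batch, dim) sample
    # from one input measure. The generator's pushforward of Gaussian
    # noise is the barycenter estimate.
    g = nn.Sequential(
        nn.Linear(latent_dim, 256), nn.ReLU(),
        nn.Linear(256, 256), nn.ReLU(),
        nn.Linear(256, dim),
    )
    opt = torch.optim.Adam(g.parameters(), lr=1e-3)
    for _ in range(steps):
        z = torch.randn(batch, latent_dim)
        x = g(z)  # minibatch from the model barycenter
        # Average discrepancy to the input measures (uniform barycentric weights).
        loss = sum(energy_distance(x, s(batch)) for s in samplers) / len(samplers)
        opt.zero_grad()
        loss.backward()
        opt.step()
    return g

# Hypothetical usage: two Gaussian input measures in 1000 dimensions.
samplers = [lambda n: torch.randn(n, 1000) + 2.0,
            lambda n: torch.randn(n, 1000) - 2.0]
generator = estimate_barycenter(samplers, dim=1000, steps=500)
```

In a faithful implementation, an entropy-regularized optimal-transport divergence (e.g., a Sinkhorn-type loss) would likely replace the energy distance, and non-uniform barycentric weights would replace the plain average.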
