
Distributed Computation of Wasserstein Barycenters over Networks

03/08/2018
by César A. Uribe, et al.

We propose a new class-optimal algorithm for the distributed computation of Wasserstein barycenters over networks. Assuming that each node in a graph holds a probability distribution, we prove that every node can reach the barycenter of all distributions held in the network using only local interactions compliant with the graph topology. For undirected, fixed networks, we characterize the minimum number of communication rounds required for the proposed method to achieve arbitrary relative precision, both in the optimality of the solution and in the consensus among all agents.
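The abstract does not spell out the update rule, but the core idea, local optimal-transport steps interleaved with neighbor-only averaging over the graph, can be sketched. Below is a minimal NumPy illustration that combines Sinkhorn-style entropic barycenter iterations (the iterative Bregman projections of Benamou et al.) with gossip consensus on the dual scalings; this is a simplified stand-in, not the paper's accelerated primal-dual method, and the function name, parameters, and mixing matrix `W` are all illustrative assumptions.

```python
import numpy as np

def decentralized_barycenter(P, C, W, eps=0.05, outer_iters=300, gossip_rounds=25):
    """Illustrative sketch, NOT the paper's algorithm.

    P : (n, m) array, one discrete distribution per node (rows sum to 1)
    C : (m, m) ground-cost matrix on the shared support
    W : (n, n) doubly stochastic mixing matrix respecting the graph:
        W[i, j] > 0 only if i == j or (i, j) is an edge
    Returns every node's local estimate of the entropic barycenter.
    """
    n, m = P.shape
    K = np.exp(-C / eps)          # Gibbs kernel of entropic optimal transport
    v = np.ones((n, m))           # local Sinkhorn scalings, one row per node
    b = np.full((n, m), 1.0 / m)  # each node's barycenter estimate
    for _ in range(outer_iters):
        u = P / (v @ K.T)         # local half-step: u_k = p_k / (K v_k)
        z = np.log(u @ K)         # local quantity log(K^T u_k)
        for _ in range(gossip_rounds):
            z = W @ z             # gossip: each node averages with neighbors only
        b = np.exp(z)             # approximate geometric mean across all nodes
        v = b / (u @ K)           # local half-step: v_k = b / (K^T u_k)
    return b
```

A toy usage example on a three-node path graph, with Metropolis weights as the mixing matrix (again an illustrative choice):

```python
m = 60
x = np.linspace(0.0, 1.0, m)
C = (x[:, None] - x[None, :]) ** 2            # squared-distance ground cost
P = np.stack([np.exp(-(x - mu) ** 2 / 0.005) for mu in (0.2, 0.5, 0.8)])
P /= P.sum(axis=1, keepdims=True)             # normalize each node's histogram
W = np.array([[2/3, 1/3, 0.0],                # Metropolis weights for the
              [1/3, 1/3, 1/3],                # path graph 0 - 1 - 2
              [0.0, 1/3, 2/3]])
b = decentralized_barycenter(P, C, W)         # rows of b agree approximately
```

With exact averaging (many gossip rounds) the outer loop reduces to the centralized entropic barycenter fixed point; with only a few gossip rounds each node holds an approximate network average, which mirrors the trade-off between communication rounds and precision that the abstract quantifies.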

Related research

11/06/2020

Communication-efficient Decentralized Local SGD over Undirected Networks

We consider the distributed learning problem where a network of n agents...
06/11/2018

Decentralize and Randomize: Faster Algorithm for Wasserstein Barycenters

We study the problem of decentralized distributed computation of a discr...
09/14/2021

Scalable Average Consensus with Compressed Communications

We propose a new decentralized average consensus algorithm with compress...
01/05/2018

Local Mixing Time: Distributed Computation and Applications

The mixing time of a graph is an important metric, which is not only use...
11/19/2018

Distributed Learning of Average Belief Over Networks Using Sequential Observations

This paper addresses the problem of distributed learning of average beli...
04/13/2020

A Hierarchical Model for Fast Distributed Consensus in Dynamic Networks

We present two new consensus algorithms for dynamic networks. The first,...
10/01/2019

Wasserstein Neural Processes

Neural Processes (NPs) are a class of models that learn a mapping from a...