
Distributed Computation of Wasserstein Barycenters over Networks

by César A. Uribe, et al.

We propose a new class-optimal algorithm for the distributed computation of Wasserstein barycenters over networks. Assuming that each node in a graph holds a probability distribution, we prove that every node can reach the barycenter of all distributions held in the network using only local interactions compliant with the topology of the graph. We also characterize the minimum number of communication rounds the proposed method requires to achieve arbitrary relative precision, both in the optimality of the solution and in the consensus among all agents, for fixed undirected networks.
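The consensus mechanism underlying this setting can be illustrated with a simple special case. For one-dimensional distributions with equal weights, the Wasserstein-2 barycenter has a quantile function equal to the average of the nodes' quantile functions, so gossip-style averaging of local quantile vectors over the graph drives every node to the barycenter. The sketch below (assumed ring topology and mixing matrix; it is not the paper's accelerated primal-dual method) demonstrates this:

```python
import numpy as np

rng = np.random.default_rng(0)
n_nodes, n_quantiles = 4, 50

# Each node holds a 1D empirical distribution, represented here by its
# sorted quantile vector (hypothetical local data).
quantiles = np.sort(
    rng.normal(loc=rng.uniform(-2, 2, (n_nodes, 1)),
               scale=1.0, size=(n_nodes, n_quantiles)),
    axis=1,
)

# Ring topology: a doubly stochastic mixing matrix W compliant with the graph,
# so each node only communicates with its two neighbors.
W = np.zeros((n_nodes, n_nodes))
for i in range(n_nodes):
    W[i, i] = 0.5
    W[i, (i - 1) % n_nodes] = 0.25
    W[i, (i + 1) % n_nodes] = 0.25

# Consensus rounds: every node repeatedly averages its quantile vector
# with those of its neighbors (local interactions only).
x = quantiles.copy()
for _ in range(200):
    x = W @ x

# Centralized reference: the barycenter's quantile function is the mean
# of the local quantile functions in this 1D equal-weight case.
barycenter_quantiles = quantiles.mean(axis=0)
max_error = np.max(np.abs(x - barycenter_quantiles))
```

Because `W` is symmetric and doubly stochastic, the iterates of every node converge geometrically to the network-wide average, i.e., to the barycenter's quantiles; the number of rounds needed to reach a given precision depends on the spectral gap of `W`, mirroring the communication-round bounds the abstract refers to.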


Communication-efficient Decentralized Local SGD over Undirected Networks

We consider the distributed learning problem where a network of n agents...

Decentralize and Randomize: Faster Algorithm for Wasserstein Barycenters

We study the problem of decentralized distributed computation of a discr...

Scalable Average Consensus with Compressed Communications

We propose a new decentralized average consensus algorithm with compress...

Local Mixing Time: Distributed Computation and Applications

The mixing time of a graph is an important metric, which is not only use...

Distributed Learning of Average Belief Over Networks Using Sequential Observations

This paper addresses the problem of distributed learning of average beli...

A Hierarchical Model for Fast Distributed Consensus in Dynamic Networks

We present two new consensus algorithms for dynamic networks. The first,...

Wasserstein Neural Processes

Neural Processes (NPs) are a class of models that learn a mapping from a...