Distributed Fixed Point Methods with Compressed Iterates

by Sélim Chraibi et al.

We propose basic and natural assumptions under which iterative optimization methods with compressed iterates can be analyzed. This problem is motivated by the practice of federated learning, where a large model stored in the cloud is compressed before it is sent to a mobile device, which then proceeds with training based on local data. We develop standard and variance-reduced methods, and establish communication complexity bounds. Our algorithms are the first distributed methods with compressed iterates, and the first fixed point methods with compressed iterates.
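As a rough illustration of the idea of iterating with compressed iterates (a minimal sketch, not the paper's exact algorithm), the snippet below applies a fixed point operator T to a compressed copy of the current iterate and mixes the result back in, x_{t+1} = (1 - eta) x_t + eta T(C(x_t)). The choice of a random-k sparsifier for the compression operator C, a gradient descent step for T, and the relaxation parameter eta are all assumptions made for illustration only.

```python
import numpy as np

def rand_k_compress(x, k, rng):
    """Random-k sparsification: keep k coordinates chosen at random and
    rescale by d/k so the compressor is unbiased, E[C(x)] = x.
    (Illustrative choice of compressor, not prescribed by the paper.)"""
    d = x.size
    mask = np.zeros(d)
    mask[rng.choice(d, size=k, replace=False)] = 1.0
    return (d / k) * mask * x

def fixed_point_with_compressed_iterates(T, x0, num_iters, k, eta=0.5, seed=0):
    """Illustrative relaxed fixed point iteration on compressed iterates:
    the iterate is compressed (as it would be before being communicated to
    a device), the operator T is applied to the compressed point, and the
    result is averaged with the previous iterate."""
    rng = np.random.default_rng(seed)
    x = x0.copy()
    for _ in range(num_iters):
        compressed = rand_k_compress(x, k, rng)   # compress the iterate
        x = (1 - eta) * x + eta * T(compressed)   # relaxed fixed point step
    return x

# Hypothetical usage: T is a gradient descent step on f(x) = 0.5 * ||Ax - b||^2,
# so fixed points of T are minimizers of f.
rng = np.random.default_rng(1)
A = rng.standard_normal((50, 10))
b = rng.standard_normal(50)
gamma = 1.0 / np.linalg.norm(A, 2) ** 2          # step size from the spectral norm
T = lambda x: x - gamma * A.T @ (A @ x - b)

x_approx = fixed_point_with_compressed_iterates(T, np.zeros(10), num_iters=2000, k=5)
```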
