Distributed Fixed Point Methods with Compressed Iterates

12/20/2019
by   Sélim Chraibi, et al.

We propose basic and natural assumptions under which iterative optimization methods with compressed iterates can be analyzed. This problem is motivated by the practice of federated learning, where a large model stored in the cloud is compressed before being sent to a mobile device, which then proceeds with training based on local data. We develop standard and variance-reduced methods, and establish communication complexity bounds. Our algorithms are the first distributed methods with compressed iterates, and the first fixed-point methods with compressed iterates.
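The central object here is a fixed-point iteration whose iterates are compressed before being communicated. The following is a minimal Python sketch of that idea, assuming an unbiased rand-k sparsification compressor and a gradient-descent step as the fixed-point operator T; both are illustrative choices, not necessarily the exact algorithm or compressor analyzed in the paper.

```python
import numpy as np

def compress(x, k, rng):
    """Rand-k sparsification: keep k randomly chosen coordinates and
    rescale by d/k so the compressor is unbiased (E[C(x)] = x)."""
    d = len(x)
    out = np.zeros_like(x)
    idx = rng.choice(d, size=k, replace=False)
    out[idx] = x[idx] * (d / k)
    return out

def fixed_point_compressed(T, x0, eta, k, iters, seed=0):
    """Iterate x <- (1 - eta) * x + eta * C(T(x)).
    Averaging with the previous iterate damps the variance
    injected by the compression operator C."""
    rng = np.random.default_rng(seed)
    x = x0.copy()
    for _ in range(iters):
        x = (1 - eta) * x + eta * compress(T(x), k, rng)
    return x

# Illustration: T is one gradient step on f(x) = 0.5 * ||A x - b||^2,
# so the fixed point of T is the least-squares solution.
A = np.array([[2.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
b = np.array([1.0, 2.0, 3.0])
step = 0.1
T = lambda x: x - step * A.T @ (A @ x - b)

x_star = np.linalg.lstsq(A, b, rcond=None)[0]
# Compressed run: only k=1 of d=2 coordinates is kept per iteration.
x_hat = fixed_point_compressed(T, np.zeros(2), eta=0.2, k=1, iters=5000)
```

With k equal to the dimension the compressor is the identity and the scheme reduces to a plain averaged fixed-point iteration; with k < d the iterates converge only to a neighborhood of the fixed point, whose radius shrinks as the stepsize eta decreases, which is the kind of trade-off the communication complexity bounds quantify.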


Related research:

- 04/03/2020 · From Local SGD to Local Fixed Point Methods for Federated Learning. Most algorithms for solving optimization problems or finding saddle poin...
- 09/10/2019 · Gradient Descent with Compressed Iterates. We propose and analyze a new type of stochastic first order method: grad...
- 11/14/2022 · Adaptive Federated Minimax Optimization with Lower Complexities. Federated learning is a popular distributed and privacy-preserving machi...
- 12/03/2017 · ALLSAT compressed with wildcards. Part 4: An invitation for C-programmers. The model set of a general Boolean function in CNF is calculated in a co...
- 02/14/2021 · Smoothness Matrices Beat Smoothness Constants: Better Communication Compression Techniques for Distributed Optimization. Large scale distributed optimization has become the default tool for the...
- 05/31/2022 · A Computation and Communication Efficient Method for Distributed Nonconvex Problems in the Partial Participation Setting. We present a new method that includes three key components of distribute...
- 10/15/2022 · Sketching for First Order Method: Efficient Algorithm for Low-Bandwidth Channel and Vulnerability. Sketching is one of the most fundamental tools in large-scale machine le...
