Gradient Descent with Compressed Iterates

09/10/2019
by Ahmed Khaled, et al.

We propose and analyze a new type of stochastic first-order method: gradient descent with compressed iterates (GDCI). In each iteration, GDCI first compresses the current iterate using a lossy randomized compression technique, and then takes a gradient step. The method distills a key ingredient of current federated learning practice, where a model must be compressed by a mobile device before it is sent back to a server for aggregation. Our analysis is a step towards closing the gap between the theory and practice of federated learning, and it opens the possibility of many extensions.
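The abstract does not spell out the update rule. A natural reading is x_{k+1} = x_k - γ ∇f(C(x_k)), where C is an unbiased lossy randomized compressor. Below is a minimal Python sketch under that assumption, using random sparsification as the compressor; the names (gdci, random_sparsification) and the quadratic test problem are illustrative and not taken from the paper.

```python
import numpy as np

def random_sparsification(x, p, rng):
    """Unbiased random-sparsification compressor: keep each coordinate
    with probability p and rescale by 1/p, so that E[C(x)] = x."""
    mask = rng.random(x.shape) < p
    return np.where(mask, x / p, 0.0)

def gdci(grad, x0, step_size, p, num_iters, seed=0):
    """Gradient descent with compressed iterates (sketch): in each
    iteration, compress the current iterate, then take a gradient
    step evaluated at the compressed point (assumed update rule)."""
    rng = np.random.default_rng(seed)
    x = x0.copy()
    for _ in range(num_iters):
        cx = random_sparsification(x, p, rng)  # lossy randomized compression
        x = x - step_size * grad(cx)           # gradient step at compressed iterate
    return x

# Toy usage: minimize f(x) = 0.5 * ||A x - b||^2 on a random instance.
rng = np.random.default_rng(42)
A = rng.standard_normal((50, 10))
b = rng.standard_normal(50)
grad = lambda x: A.T @ (A @ x - b)
x_final = gdci(grad, x0=np.zeros(10), step_size=5e-3, p=0.8, num_iters=500)
print("gradient norm after GDCI:", np.linalg.norm(grad(x_final)))
```

Because the compressor is unbiased, the compression error acts like stochastic gradient noise; as in the paper's federated learning motivation, more aggressive compression (smaller p) trades accuracy for cheaper communication of the iterate.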


Related research

10/18/2022 · FLECS-CGD: A Federated Learning Second-Order Framework via Compression and Sketching with Compressed Gradient Differences
In the recent paper FLECS (Agafonov et al., FLECS: A Federated Learning S...

09/10/2019 · First Analysis of Local GD on Heterogeneous Data
We provide the first convergence analysis of local gradient descent for ...

12/20/2019 · Distributed Fixed Point Methods with Compressed Iterates
We propose basic and natural assumptions under which iterative optimizat...

09/11/2020 · Federated Generalized Bayesian Learning via Distributed Stein Variational Gradient Descent
This paper introduces Distributed Stein Variational Gradient Descent (DS...

12/15/2021 · Communication-Efficient Distributed SGD with Compressed Sensing
We consider large scale distributed optimization over a set of edge devi...

04/04/2022 · FedSynth: Gradient Compression via Synthetic Data in Federated Learning
Model compression is important in federated learning (FL) with large mod...

05/08/2022 · Federated Random Reshuffling with Compression and Variance Reduction
Random Reshuffling (RR), which is a variant of Stochastic Gradient Desce...
