First Analysis of Local GD on Heterogeneous Data

09/10/2019
by Ahmed Khaled et al.

We provide the first convergence analysis of local gradient descent for minimizing the average of smooth, convex, but otherwise arbitrary functions. Problems of this form, and local gradient descent as a solution method, are of importance in federated learning, where each function is based on private data stored by a user on a mobile device, and the data of different users can be arbitrarily heterogeneous. We show that in the low-accuracy regime, the method has the same communication complexity as gradient descent.
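To make the setup concrete, the following is a minimal sketch of local gradient descent in this setting: the objective is the average of the clients' functions, each client runs a fixed number of local gradient steps on its own function, and the server then averages the resulting iterates, which costs one communication round. All names (local_gd, grad_fns, local_steps) and the toy quadratics below are illustrative assumptions, not the paper's notation or code.

```python
import numpy as np

def local_gd(grad_fns, x0, stepsize, local_steps, rounds):
    """Local GD: each client runs `local_steps` gradient steps on its own
    function, then the server averages the iterates (one communication)."""
    x = np.array(x0, dtype=float)
    for _ in range(rounds):                    # communication rounds
        local_iterates = []
        for grad in grad_fns:                  # each client (in parallel in practice)
            y = x.copy()
            for _ in range(local_steps):       # H local gradient steps
                y = y - stepsize * grad(y)
            local_iterates.append(y)
        x = np.mean(local_iterates, axis=0)    # server aggregates by averaging
    return x

# Toy heterogeneous data: f_1(x) = 0.5*(x - 1)^2 and f_2(x) = 0.5*(x + 3)^2,
# so the clients' minimizers (1 and -3) disagree, but the average objective
# is minimized at x = -1.
grads = [lambda x: x - 1.0, lambda x: x + 3.0]
x_final = local_gd(grads, x0=[0.0], stepsize=0.1, local_steps=5, rounds=50)
print(x_final)   # ~[-1.], the minimizer of the average objective
```

In this toy example the clients' minimizers differ, i.e. the data is heterogeneous, yet the averaged iterate converges to the minimizer of the average objective; the paper's analysis quantifies how the number of local steps and the degree of heterogeneity affect how many such communication rounds are needed.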


