Is Local SGD Better than Minibatch SGD?

02/18/2020
by Blake Woodworth, et al.

We study local SGD (also known as parallel SGD and federated averaging), a natural and frequently used stochastic distributed optimization method. Its theoretical foundations are currently lacking, and we highlight that all existing error guarantees in the convex setting are dominated by a simple baseline, minibatch SGD. (1) For quadratic objectives, we prove that local SGD strictly dominates minibatch SGD and that accelerated local SGD is minimax optimal; (2) for general convex objectives, we provide the first guarantee that at least sometimes improves over minibatch SGD; (3) we show that local SGD does not, in fact, dominate minibatch SGD, by presenting a lower bound on its performance that is worse than the minibatch SGD guarantee.
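To make the comparison concrete, below is a minimal sketch (not from the paper) contrasting the two methods on a toy quadratic. With M machines, K local steps per round, and R communication rounds, local SGD averages the machines' iterates after K independent steps each, while the minibatch SGD baseline takes one step per round on a minibatch of M·K stochastic gradients evaluated at the shared iterate; both use the same communication and gradient budget, which is the regime in which the paper compares them. The objective, noise model, and step size here are illustrative assumptions, not the paper's setup.

```python
import numpy as np

rng = np.random.default_rng(0)
d, M, K, R = 10, 8, 16, 50   # dimension, machines, local steps per round, rounds
lr = 0.05                    # untuned; the paper analyzes optimally tuned step sizes
x_star = rng.normal(size=d)  # minimizer of the toy quadratic (an assumption)

def stochastic_grad(x):
    # unbiased gradient of f(x) = 0.5 * ||x - x_star||^2 with Gaussian noise
    return (x - x_star) + rng.normal(scale=1.0, size=d)

def local_sgd():
    x = np.zeros(d)
    for _ in range(R):
        local_iterates = []
        for _ in range(M):
            y = x.copy()
            for _ in range(K):          # K independent local steps on each machine
                y -= lr * stochastic_grad(y)
            local_iterates.append(y)
        x = np.mean(local_iterates, axis=0)  # communicate: average local iterates
    return x

def minibatch_sgd():
    x = np.zeros(d)
    for _ in range(R):
        # one step per round on a minibatch of M*K gradients at the shared point
        g = np.mean([stochastic_grad(x) for _ in range(M * K)], axis=0)
        x -= lr * g
    return x

for name, method in [("local SGD", local_sgd), ("minibatch SGD", minibatch_sgd)]:
    x = method()
    print(f"{name:>13}: final suboptimality {0.5 * np.sum((x - x_star) ** 2):.4f}")
```

Note the design contrast this sketch exposes: local SGD queries gradients at M different, drifting points per round, whereas minibatch SGD spends the same M·K queries at a single point, trading progress per round for lower variance per step.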


