AdaDelay: Delay Adaptive Distributed Stochastic Convex Optimization

08/20/2015
by   Suvrit Sra, et al.

We study distributed stochastic convex optimization under the delayed gradient model, in which server nodes perform parameter updates while worker nodes compute stochastic gradients. We discuss, analyze, and experiment with a setup motivated by the behavior of real-world distributed computation networks, where machines are slow to varying degrees at different times. Accordingly, we allow the parameter updates to be sensitive to the actual delays experienced, rather than to worst-case bounds on the maximum delay. This sensitivity permits larger step sizes, which can yield rapid initial convergence without long waits for slower machines, while maintaining the same asymptotic complexity. We obtain encouraging improvements in overall convergence in distributed experiments on real datasets with up to billions of examples and features.
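To make the delay-sensitive step-size idea concrete, below is a minimal single-process simulation sketch in Python/NumPy. It is not the paper's exact AdaDelay algorithm: the specific step-size form alpha_t = c / sqrt(t + tau_t), the toy least-squares objective, and all names (adadelay_sgd_sketch, grad_fn, max_delay) are assumptions chosen only to illustrate scaling the step by the observed delay rather than by a worst-case bound.

import numpy as np

def adadelay_sgd_sketch(grad_fn, x0, num_steps=1000, c=1.0, max_delay=10, seed=0):
    # Single-process simulation of delay-adaptive SGD: a stochastic gradient
    # computed at the stale iterate x_{t - tau} arrives with observed delay tau,
    # and the server scales its step by that actual delay.
    # NOTE: the step-size form c / sqrt(t + tau) is an assumption for
    # illustration, not necessarily the paper's exact choice.
    rng = np.random.default_rng(seed)
    history = [x0.copy()]  # past iterates, so workers can read stale copies
    x = x0.copy()
    for t in range(1, num_steps + 1):
        tau = int(rng.integers(0, min(max_delay, len(history))))  # observed delay
        stale_x = history[-(tau + 1)]          # iterate the worker actually saw
        g = grad_fn(stale_x, rng)              # delayed stochastic gradient
        alpha = c / np.sqrt(t + tau)           # delay-sensitive step size
        x = x - alpha * g
        history.append(x.copy())
    return x

# Toy usage: noisy gradients of f(x) = ||A x - b||^2 / (2 m), solution = all ones.
A = np.random.default_rng(1).normal(size=(50, 5))
b = A @ np.ones(5)
grad = lambda x, rng: A.T @ (A @ x - b) / len(b) + 0.1 * rng.normal(size=x.shape)
print(np.round(adadelay_sgd_sketch(grad, np.zeros(5)), 2))  # near [1 1 1 1 1]

The delay sensitivity is visible in the alpha line: a gradient that arrives quickly (small tau) gets a larger step than one delayed by many updates, whereas a worst-case scheme would shrink every step as if tau were always at its maximum.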


Related research

06/22/2021  Asynchronous Stochastic Optimization Robust to Arbitrary Delays
We consider stochastic optimization with delayed gradients where, at eac...

03/27/2018  DRACO: Robust Distributed Training via Redundant Gradients
Distributed model training is vulnerable to worst-case system failures a...

04/28/2011  Distributed Delayed Stochastic Optimization
We analyze the convergence of gradient-based optimization algorithms tha...

04/09/2023  SLowcal-SGD: Slow Query Points Improve Local-SGD for Stochastic Convex Optimization
We consider distributed learning scenarios where M machines interact wit...

05/20/2023  Non-stationary Online Convex Optimization with Arbitrary Delays
Online convex optimization (OCO) with arbitrary delays, in which gradien...

09/10/2022  Optimization of the fluid model of scheduling: local predictions
In this research a continuous model for resource allocations in a queuin...
