Distributed and Stochastic Optimization Methods with Gradient Compression and Local Steps

12/20/2021
by Eduard Gorbunov, et al.

In this thesis, we propose new theoretical frameworks for the analysis of stochastic and distributed optimization methods with error compensation and local updates. Using these frameworks, we develop more than 20 new optimization methods, including the first linearly converging Error-Compensated SGD and the first linearly converging Local-SGD for arbitrarily heterogeneous local functions. The thesis also contains several new distributed methods with unbiased compression for distributed non-convex optimization problems. The derived complexity results for these methods improve upon the previous best-known results for the considered problems. Finally, we propose a new scalable decentralized fault-tolerant distributed method and, under reasonable assumptions, derive iteration complexity bounds for this method that match those of centralized Local-SGD.
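To make the two core ingredients named above concrete, here is a minimal, self-contained Python sketch of error-compensated SGD with a biased Top-K compressor. This is an illustrative reconstruction of the generic error-feedback template, not the thesis's specific methods; the function names and parameters are assumptions for the example.

```python
import numpy as np

def top_k(v, k):
    """Biased Top-K compressor: keep only the k largest-magnitude entries.
    (An unbiased alternative would be Rand-K rescaled by d/k.)"""
    out = np.zeros_like(v)
    idx = np.argsort(np.abs(v))[-k:]
    out[idx] = v[idx]
    return out

def ec_sgd_step(x, grads, errors, lr, k):
    """One round of error-compensated SGD (illustrative sketch).

    Each worker compresses its scaled gradient plus the error left over
    from previous rounds, and stores what the compressor discarded.
    """
    msgs = []
    for i, g in enumerate(grads):
        corrected = lr * g + errors[i]   # add back accumulated compression error
        msg = top_k(corrected, k)        # compress before communicating
        errors[i] = corrected - msg      # remember what was lost
        msgs.append(msg)
    return x - np.mean(msgs, axis=0)     # server averages messages and steps
```

And a similarly hedged sketch of one communication round of Local-SGD, where each worker takes several local gradient steps before the iterates are averaged:

```python
def local_sgd_round(x, local_grad_fns, lr, local_steps):
    """One communication round of Local-SGD (illustrative sketch).

    local_grad_fns: one callable per worker returning a stochastic
    gradient of that worker's local function at the given point.
    """
    iterates = []
    for grad_fn in local_grad_fns:
        xi = x.copy()
        for _ in range(local_steps):     # local steps, no communication
            xi = xi - lr * grad_fn(xi)
        iterates.append(xi)
    return np.mean(iterates, axis=0)     # average once per round
```

The error buffers in the first sketch are what make aggressive sparsification safe: no gradient coordinate is permanently dropped, only delayed, which is the mechanism behind error-compensated methods in general.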

research · 03/23/2020
A Unified Theory of Decentralized SGD with Changing Topology and Local Updates
Decentralized stochastic optimization methods have gained a lot of atten...

research · 09/01/2021
The Minimax Complexity of Distributed Optimization
In this thesis, I study the minimax oracle complexity of distributed sto...

research · 12/09/2015
Efficient Distributed SGD with Variance Reduction
Stochastic Gradient Descent (SGD) has become one of the most popular opt...

research · 09/04/2020
On Communication Compression for Distributed Optimization on Heterogeneous Data
Lossy gradient compression, with either unbiased or biased compressors, ...

research · 10/09/2017
SGD for robot motion? The effectiveness of stochastic optimization on a new benchmark for biped locomotion tasks
Trajectory optimization and posture generation are hard problems in robo...

research · 01/11/2023
Compression for Distributed Optimization and Timely Updates
The goal of this thesis is to study the compression problems arising in ...

research · 01/27/2019
99% of Worker-Master Communication in Distributed Optimization Is Not Needed
It is well known that many optimization methods, including SGD, SAGA, an...
