Acceleration of stochastic methods on the example of decentralized SGD

11/15/2020 ∙ by Trimbach Ekaterina, et al.

In this paper, we present an algorithm for accelerating decentralized stochastic gradient descent. Decentralized stochastic optimization methods have recently attracted a lot of attention, mainly due to their low iteration cost, data locality, and communication efficiency; they generalize algorithms such as SGD and Local SGD. A further important contribution of this work is an extension of the analysis of accelerated stochastic methods that makes acceleration achievable in the decentralized setting.
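For intuition, the sketch below shows plain (unaccelerated) decentralized SGD on a toy least-squares problem: each node takes a stochastic gradient step on its local data and then averages its parameters with its neighbors through a gossip (mixing) matrix. This is only a minimal illustration of the setting the abstract describes, not the paper's accelerated algorithm; the function name, the ring-topology mixing matrix `W`, and all parameters are illustrative assumptions.

```python
import numpy as np


def decentralized_sgd(local_data, W, step_size, n_steps, dim, seed=0):
    """Illustrative decentralized SGD on a least-squares objective.

    local_data : list of (A_i, b_i) pairs, one per node
    W          : doubly stochastic mixing (gossip) matrix over the network
    """
    rng = np.random.default_rng(seed)
    n_nodes = len(local_data)
    x = np.zeros((n_nodes, dim))  # one parameter vector per node

    for _ in range(n_steps):
        grads = np.zeros_like(x)
        for i, (A_i, b_i) in enumerate(local_data):
            # sample one local data point -> stochastic gradient of 0.5*(a.x - b)^2
            j = rng.integers(len(b_i))
            a, b = A_i[j], b_i[j]
            grads[i] = (a @ x[i] - b) * a
        # local gradient step followed by gossip averaging with neighbors
        x = W @ (x - step_size * grads)

    return x.mean(axis=0)  # consensus estimate


if __name__ == "__main__":
    rng = np.random.default_rng(1)
    dim, n_nodes, m = 10, 4, 50
    x_true = rng.normal(size=dim)
    data = []
    for _ in range(n_nodes):
        A = rng.normal(size=(m, dim))
        b = A @ x_true + 0.01 * rng.normal(size=m)
        data.append((A, b))
    # ring topology: each node averages with its two neighbors (assumed example)
    W = np.zeros((n_nodes, n_nodes))
    for i in range(n_nodes):
        W[i, i] = 0.5
        W[i, (i - 1) % n_nodes] = 0.25
        W[i, (i + 1) % n_nodes] = 0.25
    x_hat = decentralized_sgd(data, W, step_size=0.05, n_steps=2000, dim=dim)
    print("estimation error:", np.linalg.norm(x_hat - x_true))
```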
