Accelerating Perturbed Stochastic Iterates in Asynchronous Lock-Free Optimization

09/30/2021
by Kaiwen Zhou, et al.

We show that stochastic acceleration can be achieved under the perturbed iterate framework (Mania et al., 2017) in asynchronous lock-free optimization, which leads to the optimal incremental gradient complexity for finite-sum objectives. We prove that our new accelerated method requires the same linear speed-up condition as existing non-accelerated methods. Our core algorithmic discovery is a new accelerated SVRG variant with sparse updates. Empirical results are presented to verify our theoretical findings.
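To make the accelerated-SVRG idea concrete, the sketch below shows a Katyusha-style accelerated SVRG loop on a finite-sum least-squares objective f(x) = (1/n) * sum_i (a_i^T x - b_i)^2. This is a minimal illustration of the general technique, not the paper's algorithm: the sparse-update trick and the asynchronous lock-free execution that are the paper's actual contributions are not reproduced here, and all function names and parameter choices (tau1, tau2, alpha, sigma) are assumptions for illustration.

import numpy as np

def accelerated_svrg(A, b, epochs=20, sigma=1e-3, seed=0):
    """Katyusha-style accelerated SVRG sketch for f(x) = (1/n) * sum_i (a_i^T x - b_i)^2."""
    rng = np.random.default_rng(seed)
    n, d = A.shape
    m = n                                      # inner-loop length per snapshot
    L = 2.0 * np.max(np.sum(A ** 2, axis=1))   # smoothness of each component f_i
    tau2 = 0.5
    tau1 = min(np.sqrt(m * sigma / (3.0 * L)), 0.5)  # Katyusha momentum weights
    alpha = 1.0 / (3.0 * tau1 * L)                   # step size for the z-sequence
    x_tilde = np.zeros(d)                      # snapshot point
    y = np.zeros(d)
    z = np.zeros(d)
    for _ in range(epochs):
        mu = 2.0 / n * A.T @ (A @ x_tilde - b)      # full gradient at the snapshot
        for _ in range(m):
            i = rng.integers(n)
            # coupled iterate: convex combination of momentum, snapshot, and gradient sequences
            x = tau1 * z + tau2 * x_tilde + (1.0 - tau1 - tau2) * y
            # variance-reduced stochastic gradient estimator
            g = (2.0 * A[i] * (A[i] @ x - b[i])
                 - 2.0 * A[i] * (A[i] @ x_tilde - b[i]) + mu)
            z_new = z - alpha * g              # simplified (non-proximal) z-update
            y = x + tau1 * (z_new - z)         # negative momentum; equals x - g / (3L)
            z = z_new
        x_tilde = y.copy()                     # simplification: Katyusha averages inner iterates
    return x_tilde

rng = np.random.default_rng(1)
A = rng.standard_normal((200, 50))
x_star = rng.standard_normal(50)
b = A @ x_star
print("recovery error:", np.linalg.norm(accelerated_svrg(A, b) - x_star))

Note that the updates above are dense: the full-gradient term mu touches every coordinate at each inner step. A key difficulty the paper addresses is making such accelerated updates sparse, so that concurrent threads in a lock-free setting mostly write to disjoint coordinates; that mechanism is beyond this sketch.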
