A New Analysis of Variance Reduced Stochastic Proximal Methods for Composite Optimization with Serial and Asynchronous Realizations

05/25/2018
by   Yue Yu, et al.

We provide a comprehensive analysis of stochastic variance reduced gradient (SVRG) based proximal algorithms, both with and without momentum, in serial and asynchronous realizations. Specifically, we propose the Prox-SVRG^++ algorithm and prove that it achieves a linear convergence rate with a small epoch length, yielding an O(1/ϵ) complexity in the non-strongly convex case. We then propose a momentum-accelerated variant, Prox-MSVRG^++, and show that it achieves a complexity of O(1/√ϵ). Building on these serial algorithms, we develop two asynchronous versions and provide a general analysis for the nonconvex and non-strongly convex cases, respectively. Our theoretical results indicate that both algorithms can achieve a potentially significant speedup when implemented across multiple servers. We conduct extensive experiments on four real-world datasets using a testbed of 11 physical machines; the experiments validate our theoretical findings and demonstrate the effectiveness of our algorithms.
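To make the setting concrete, the sketch below shows the generic proximal-SVRG epoch that this family of methods builds on: a full gradient at a snapshot point, followed by variance-reduced stochastic steps with a proximal update. This is only an illustrative rendering of the standard technique on a lasso-type problem, not the authors' Prox-SVRG^++; the step size, epoch length, and problem instance are assumptions.

```python
# Minimal sketch of a generic proximal-SVRG epoch for the composite problem
#   min_x  (1/2n) * ||Ax - b||^2 + lam * ||x||_1
# (least-squares losses f_i plus an L1 regularizer h). Illustrative only;
# this is NOT the paper's Prox-SVRG^++ -- eta, lam, and m are assumptions.
import numpy as np

def soft_threshold(z, t):
    """Proximal operator of h(x) = t * ||x||_1 (soft-thresholding)."""
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def prox_svrg_epoch(A, b, x_tilde, rng, eta=0.01, lam=0.1, m=100):
    """One epoch: full gradient at the snapshot, then m variance-reduced steps."""
    n = A.shape[0]
    mu = A.T @ (A @ x_tilde - b) / n               # full gradient at snapshot
    x = x_tilde.copy()
    for _ in range(m):
        i = rng.integers(n)                        # sample one component f_i
        g_x = A[i] * (A[i] @ x - b[i])             # grad f_i at current iterate
        g_s = A[i] * (A[i] @ x_tilde - b[i])       # grad f_i at snapshot
        v = g_x - g_s + mu                         # variance-reduced estimate
        x = soft_threshold(x - eta * v, eta * lam) # proximal step on h
    return x

# Tiny usage example on synthetic sparse-recovery data.
rng = np.random.default_rng(0)
A = rng.standard_normal((200, 20))
x_true = np.zeros(20); x_true[:3] = [1.0, -2.0, 0.5]
b = A @ x_true + 0.01 * rng.standard_normal(200)
x = np.zeros(20)
for epoch in range(30):
    x = prox_svrg_epoch(A, b, x, rng)   # last iterate becomes the new snapshot
print("recovered support:", np.nonzero(np.abs(x) > 0.1)[0])
```

The momentum-accelerated and asynchronous variants analyzed in the paper modify this inner loop (adding a momentum term, or letting workers apply the update with stale iterates), but the variance-reduced gradient estimate and the proximal step are the common core.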
