Faster Stochastic Alternating Direction Method of Multipliers for Nonconvex Optimization

08/04/2020 ∙ by Feihu Huang, et al.

In this paper, we propose a faster stochastic alternating direction method of multipliers (ADMM) for nonconvex optimization, called SPIDER-ADMM, which uses a new stochastic path-integrated differential estimator (SPIDER). We prove that SPIDER-ADMM achieves a record-breaking incremental first-order oracle (IFO) complexity of 𝒪(n+n^1/2ϵ^-1) for finding an ϵ-approximate stationary point, improving on deterministic ADMM by a factor of 𝒪(n^1/2), where n denotes the sample size. As one of the major contributions of this paper, we provide a new theoretical analysis framework for nonconvex stochastic ADMM methods that yields optimal IFO complexities. Based on this new framework, we resolve the previously open question of the optimal IFO complexity of the existing nonconvex SVRG-ADMM and SAGA-ADMM methods, proving that both achieve an IFO complexity of 𝒪(n+n^2/3ϵ^-1). Thus, SPIDER-ADMM improves on the existing stochastic ADMM methods by a factor of 𝒪(n^1/6). Moreover, we extend SPIDER-ADMM to the online setting and propose a faster online SPIDER-ADMM. Our theoretical analysis shows that the online SPIDER-ADMM has an IFO complexity of 𝒪(ϵ^-3/2), which improves the best existing result by a factor of 𝒪(ϵ^-1/2). Finally, experimental results on benchmark datasets validate that the proposed algorithms converge faster than the existing ADMM algorithms for nonconvex optimization.
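To make the core ingredient concrete, the following is a minimal sketch of the SPIDER variance-reduced gradient estimator driving a plain stochastic gradient loop on a toy least-squares problem. This is not the paper's SPIDER-ADMM (no splitting variables or multipliers); it only illustrates the estimator's recursion, v_t = ∇f_S(x_t) − ∇f_S(x_{t−1}) + v_{t−1}, with a full-gradient refresh every q steps. All names, step sizes, and problem data here are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy finite-sum problem: f(x) = (1/n) * sum_i 0.5 * (a_i^T x - b_i)^2,
# with data generated around a known planted solution x_true.
n, d = 200, 5
A = rng.standard_normal((n, d))
x_true = np.ones(d)
b = A @ x_true + 0.1 * rng.standard_normal(n)

def grad_batch(x, idx):
    """Mini-batch gradient of the least-squares loss over samples idx."""
    Ai, bi = A[idx], b[idx]
    return Ai.T @ (Ai @ x - bi) / len(idx)

def spider_sgd(x0, eta=0.05, q=20, batch=10, iters=200):
    """SGD driven by the SPIDER estimator (an illustrative sketch only).

    Every q iterations the estimator is refreshed with a full gradient;
    in between it is updated recursively using the same mini-batch
    evaluated at the current and previous iterates.
    """
    x, x_prev, v = x0.copy(), x0.copy(), None
    for t in range(iters):
        if t % q == 0:
            v = grad_batch(x, np.arange(n))          # full-gradient refresh
        else:
            S = rng.integers(0, n, size=batch)       # sampled mini-batch
            v = grad_batch(x, S) - grad_batch(x_prev, S) + v
        x_prev, x = x, x - eta * v                   # plain gradient step
    return x

x_star = np.linalg.lstsq(A, b, rcond=None)[0]        # exact minimizer
x_hat = spider_sgd(np.zeros(d))
print(np.linalg.norm(x_hat - x_star))
```

The recursion reuses the same mini-batch S at both x_t and x_{t−1}, which is what lets the estimator's variance track the distance between consecutive iterates rather than the raw gradient noise.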


