Stochastic Strictly Contractive Peaceman-Rachford Splitting Method
In this paper, we propose two new variants of the Strictly Contractive Peaceman-Rachford Splitting Method (SCPRSM), called Stochastic SCPRSM (SS-PRSM) and Stochastic Conjugate Gradient SCPRSM (SCG-PRSM), for large-scale optimization problems. The two stochastic PRSM algorithms incorporate the stochastic variance reduced gradient (SVRG) and the conjugate gradient method, respectively. Existing stochastic PRSM methods and most stochastic ADMM algorithms achieve only an O(1/√t) convergence rate on general convex problems, whereas SS-PRSM attains an O(1/t) rate in the general convex case, matching the convergence rate of the batch ADMM and SCPRSM algorithms. In addition, our methods converge faster and have lower memory cost. SCG-PRSM is the first method to improve performance by incorporating the conjugate gradient method together with an Armijo line search. Experiments show that the proposed algorithms are faster than stochastic and batch ADMM algorithms, and SCG-PRSM achieves state-of-the-art performance on our benchmark datasets.
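The variance-reduction ingredient referenced above is the standard SVRG gradient estimator. The sketch below illustrates that estimator on a toy least-squares problem only; it is not the authors' SS-PRSM algorithm, and the names (grad_i, full_grad, lr, n_epochs) are illustrative assumptions.

```python
# Minimal SVRG sketch on 0.5/n * ||Ax - b||^2 (illustrative, not SS-PRSM).
import numpy as np

rng = np.random.default_rng(0)
n, d = 200, 10
A = rng.standard_normal((n, d))
b = rng.standard_normal(n)

def grad_i(x, i):
    # Gradient of the i-th component f_i(x) = 0.5 * (a_i^T x - b_i)^2
    return A[i] * (A[i] @ x - b[i])

def full_grad(x):
    return A.T @ (A @ x - b) / n

x = np.zeros(d)
lr, n_epochs, m = 0.01, 20, n  # inner-loop length m set to n for simplicity
for _ in range(n_epochs):
    x_snap = x.copy()          # snapshot point
    mu = full_grad(x_snap)     # full gradient at the snapshot
    for _ in range(m):
        i = rng.integers(n)
        # Variance-reduced estimator: unbiased, with variance shrinking as
        # x approaches x_snap, which permits a constant step size.
        g = grad_i(x, i) - grad_i(x_snap, i) + mu
        x -= lr * g

print("final objective:", 0.5 * np.mean((A @ x - b) ** 2))
```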