Accelerated Symmetric ADMM and Its Applications in Signal Processing
The alternating direction method of multipliers (ADMM) has been extensively investigated over the past decades for solving separable convex optimization problems. Fewer researchers have explored its convergence properties in the nonconvex case, although it often performs surprisingly well there. In this paper, we propose a symmetric ADMM based on different acceleration techniques for a family of potentially nonsmooth, nonconvex programming problems with equality constraints, in which the dual variables are updated twice with different stepsizes. Under suitable assumptions, and without invoking the so-called Kurdyka-Łojasiewicz inequality, we analyze the convergence of the proposed algorithm and its pointwise iteration complexity in terms of the corresponding augmented Lagrangian function and the primal-dual residuals, respectively. The performance of our algorithm is verified by preliminary numerical examples on sparse nonconvex/convex regularized minimization problems arising in signal processing.
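To illustrate the distinguishing feature named in the abstract, namely updating the dual variable twice per iteration with different stepsizes, the following is a minimal sketch of a symmetric ADMM iteration applied to a convex ℓ1-regularized least-squares (LASSO) instance of the signal-processing problems mentioned above. It omits the paper's acceleration steps, and all names, the stepsize values r and s, and the penalty beta are illustrative choices, not the paper's specific scheme or admissible parameter ranges.

```python
import numpy as np

def soft_threshold(v, tau):
    # Proximal operator of tau * ||.||_1 (closed form for the z-subproblem)
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

def symmetric_admm_lasso(D, b, mu=0.1, beta=1.0, r=0.9, s=0.9,
                         n_iter=500, tol=1e-8):
    """Symmetric ADMM sketch for  min 0.5*||Dx - b||^2 + mu*||z||_1
    subject to  x - z = 0.

    The multiplier lam is updated twice per iteration, with stepsizes
    r and s (illustrative values; convergence-guaranteeing ranges are
    analyzed in the paper, not reproduced here).
    """
    n = D.shape[1]
    x = np.zeros(n)
    z = np.zeros(n)
    lam = np.zeros(n)
    G = D.T @ D + beta * np.eye(n)   # x-subproblem is a fixed linear system
    Dtb = D.T @ b
    for _ in range(n_iter):
        # x-update: minimize the augmented Lagrangian over x
        x = np.linalg.solve(G, Dtb - lam + beta * z)
        # first (intermediate) dual update with stepsize r
        lam = lam + r * beta * (x - z)
        # z-update: proximal step on the l1 term
        z_old = z
        z = soft_threshold(x + lam / beta, mu / beta)
        # second dual update with stepsize s
        lam = lam + s * beta * (x - z)
        # stop on small primal residual and small successive change
        if np.linalg.norm(x - z) < tol and np.linalg.norm(z - z_old) < tol:
            break
    return x, z, lam

# Usage on a synthetic sparse-recovery instance (all data hypothetical):
rng = np.random.default_rng(0)
D = rng.standard_normal((80, 200))
x_true = np.zeros(200)
support = rng.choice(200, 10, replace=False)
x_true[support] = rng.standard_normal(10)
b = D @ x_true + 0.01 * rng.standard_normal(80)
x, z, lam = symmetric_admm_lasso(D, b, mu=0.1)
```

Setting r = 0 recovers the classical single-update ADMM; the second, differently weighted dual step is what makes the scheme "symmetric" in the sense used above.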