Accelerated first-order primal-dual proximal methods for linearly constrained composite convex programming

06/29/2016
by Yangyang Xu, et al.

Motivated by big data applications, first-order methods have become extremely popular in recent years. However, naive gradient methods generally converge slowly, so much effort has been made to accelerate various first-order methods. This paper proposes two accelerated methods for solving structured linearly constrained convex programs with a composite convex objective. The first is an accelerated linearized augmented Lagrangian method (LALM). At each update of the primal variable, it allows linearization of both the differentiable function and the augmented term, which yields easy subproblems. Assuming merely weak convexity, we show that LALM achieves O(1/t) convergence if its parameters are kept fixed throughout the iterations, and that it can be accelerated to O(1/t^2) if the parameters are adapted, where t is the total number of iterations. The second method is an accelerated linearized alternating direction method of multipliers (LADMM), which, in addition to composite convexity, assumes a two-block structure on the objective. Unlike classic ADMM, our method allows linearization of both the objective and the augmented term to keep the updates simple. Assuming strong convexity on one block variable, we show that LADMM also enjoys O(1/t^2) convergence with adaptive parameters. This result significantly improves on that of [Goldstein et al., SIIMS'14], which requires strong convexity on both block variables and permits no linearization of the objective or augmented term. Numerical experiments on quadratic programming, image denoising, and support vector machines compare the proposed accelerated methods against nonaccelerated ones and against existing accelerated methods. The results demonstrate the effectiveness of the acceleration and the superior performance of the proposed methods over existing ones.
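To make the LALM update concrete, the following is a minimal NumPy sketch, not the paper's implementation: because both the differentiable term f and the augmented term (beta/2)||Ax - b||^2 are linearized at the current iterate, the primal step reduces to a single proximal step on the nonsmooth part g, followed by a multiplier update. The fixed step sizes eta, beta, rho and the toy l1-regularized problem are illustrative assumptions; the paper's adaptive parameter schedule, which yields the O(1/t^2) rate, is not reproduced here.

```python
import numpy as np

def lalm(grad_f, prox_g, A, b, x0, eta, beta, rho, iters=2000):
    """Linearized augmented Lagrangian method (sketch) for
        min_x  f(x) + g(x)   s.t.  A x = b,
    with f differentiable and g admitting an easy proximal map.
    Fixed parameters correspond to the O(1/t) regime; the adaptive
    schedule from the paper (not shown) accelerates this to O(1/t^2)."""
    x, lam = x0.copy(), np.zeros(A.shape[0])
    for _ in range(iters):
        r = A @ x - b                        # constraint residual
        # gradient of the linearized augmented Lagrangian at x
        v = grad_f(x) + A.T @ (lam + beta * r)
        x = prox_g(x - eta * v, eta)         # one prox step on g
        lam = lam + rho * (A @ x - b)        # multiplier update
    return x, lam

# Hypothetical demo: min 0.5||x - c||^2 + mu||x||_1  s.t.  Ax = b
rng = np.random.default_rng(0)
m, n, mu = 5, 20, 0.1
A = rng.standard_normal((m, n))
b = A @ rng.standard_normal(n)
c = rng.standard_normal(n)
soft = lambda z, t: np.sign(z) * np.maximum(np.abs(z) - mu * t, 0.0)
x, lam = lalm(lambda x: x - c, soft, A, b, np.zeros(n),
              eta=0.01, beta=1.0, rho=1.0)
print("feasibility:", np.linalg.norm(A @ x - b))
```

The step size eta must be small enough relative to the Lipschitz constant of the linearized term, roughly 1 + beta * ||A||^2 in this toy problem; the value 0.01 is chosen conservatively for the random data above.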

Related research

02/17/2017
Accelerated Primal-Dual Proximal Block Coordinate Updating Methods for Constrained Convex Optimization
Block Coordinate Update (BCU) methods enjoy low per-update computational...

11/21/2017
First-order methods for constrained convex programming based on linearized augmented Lagrangian function
First-order methods have been popularly used for solving large-scale pro...

03/30/2021
Convergence on a symmetric accelerated stochastic ADMM with larger stepsizes
In this paper, we develop a symmetric accelerated stochastic Alternating...

05/18/2017
Asynchronous parallel primal-dual block update methods
Recent several years have witnessed the surge of asynchronous (async-) p...

08/13/2016
Hybrid Jacobian and Gauss-Seidel proximal block coordinate update methods for linearly constrained convex programming
Recent years have witnessed the rapid development of block coordinate up...

12/21/2022
Efficient First-order Methods for Convex Optimization with Strongly Convex Function Constraints
Convex function constrained optimization has received growing research i...

12/23/2009
Fast Alternating Linearization Methods for Minimizing the Sum of Two Convex Functions
We present in this paper first-order alternating linearization algorithm...
