Hybrid Jacobian and Gauss-Seidel proximal block coordinate update methods for linearly constrained convex programming

08/13/2016
by Yangyang Xu, et al.

Recent years have witnessed the rapid development of block coordinate update (BCU) methods, which are particularly suitable for problems involving large-scale data and/or many variables. In optimization, BCU first appeared as the coordinate descent method, which works well for smooth problems or for problems with separable nonsmooth terms and/or separable constraints. When nonseparable constraints are present, BCU can be applied in a primal-dual setting. It has been shown in the literature that, for weakly convex problems with a nonseparable linear constraint, BCU with a fully Gauss-Seidel updating rule may fail to converge, while BCU with a fully Jacobian rule converges sublinearly. Empirically, however, the Jacobian update is usually slower than the Gauss-Seidel rule. To retain the advantages of both, we propose a hybrid Jacobian and Gauss-Seidel BCU method for solving linearly constrained multi-block structured convex programs, where the objective may contain a nonseparable quadratic term and separable nonsmooth terms. At each primal block-variable update, the method approximates the augmented Lagrangian function at an affine combination of the previous two iterates, and the affine mixing matrix with the desired properties can be chosen by solving a semidefinite program. We show that the hybrid method enjoys the same theoretical convergence guarantee as Jacobian BCU. In addition, we numerically demonstrate that it can perform as well as the Gauss-Seidel method and better than a recently proposed randomized primal-dual BCU method.
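To make the update scheme concrete, the following is a minimal Python/NumPy sketch (not the authors' code) of a hybrid Jacobian and Gauss-Seidel proximal BCU loop on a toy problem min_x 0.5*||x - c||^2 subject to Ax = b, with x split into blocks. The scalar mixing weight theta, the prox-linear block step, and all parameter values are illustrative assumptions; the paper instead chooses a mixing matrix by solving a semidefinite program, which is not reproduced here.

import numpy as np

# Illustrative sketch only: hybrid Jacobian/Gauss-Seidel proximal BCU for
#     min_x 0.5*||x - c||^2   subject to   A x = b,
# with x split into equal-sized blocks.  theta mixes the newest
# (Gauss-Seidel) and previous-iteration (Jacobian) block values.
rng = np.random.default_rng(0)
n_blocks, blk, m = 4, 5, 3
n = n_blocks * blk
A = rng.standard_normal((m, n))
c = rng.standard_normal(n)
b = A @ rng.standard_normal(n)               # feasible right-hand side
blocks = [slice(i * blk, (i + 1) * blk) for i in range(n_blocks)]

rho = 1.0                                    # augmented Lagrangian penalty
theta = 0.5                                  # assumed scalar mixing weight
# Prox-linear step size covering the per-block Lipschitz constant 1 + rho*||A_i||^2
eta = 1.0 + rho * max(np.linalg.norm(A[:, s], 2) ** 2 for s in blocks)

x = np.zeros(n)
lam = np.zeros(m)
for it in range(1000):
    x_prev = x.copy()
    for s in blocks:
        # Other blocks are evaluated at an affine combination of the two most
        # recent iterates (within the sweep, x already holds updated blocks).
        x_mix = theta * x + (1.0 - theta) * x_prev
        r_other = A @ x_mix - A[:, s] @ x_mix[s] - b     # residual without block s
        # Gradient of the augmented Lagrangian w.r.t. block s at x_prev[s]:
        grad = (x_prev[s] - c[s]) + A[:, s].T @ (lam + rho * (A[:, s] @ x_prev[s] + r_other))
        x[s] = x_prev[s] - grad / eta                    # prox-linear block step
    lam = lam + rho * (A @ x - b)                        # dual (multiplier) update

print("constraint violation:", np.linalg.norm(A @ x - b))
print("objective:", 0.5 * np.linalg.norm(x - c) ** 2)

With theta = 1 each block sees the newest values of the already-updated blocks (Gauss-Seidel-like behavior), while theta = 0 freezes all other blocks at the previous iterate (Jacobian-like behavior); a matrix-valued mixing, as chosen by the paper's SDP, allows this trade-off to vary across blocks.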


Related research:

- Accelerated Primal-Dual Proximal Block Coordinate Updating Methods for Constrained Convex Optimization (02/17/2017)
- Asynchronous parallel primal-dual block update methods (05/18/2017)
- Coordinate Linear Variance Reduction for Generalized Linear Programming (11/02/2021)
- Accelerated first-order primal-dual proximal methods for linearly constrained composite convex programming (06/29/2016)
- Randomized Primal-Dual Proximal Block Coordinate Updates (05/19/2016)
- A Randomized Nonmonotone Block Proximal Gradient Method for a Class of Structured Nonlinear Programming (06/25/2013)
- The Stochastic Gradient Descent for the Primal L1-SVM Optimization Revisited (04/23/2013)
