A Randomized Nonmonotone Block Proximal Gradient Method for a Class of Structured Nonlinear Programming

06/25/2013
by Zhaosong Lu, et al.

We propose a randomized nonmonotone block proximal gradient (RNBPG) method for minimizing the sum of a smooth (possibly nonconvex) function and a block-separable (possibly nonconvex, nonsmooth) function. At each iteration, the method randomly picks a block according to any prescribed probability distribution and typically solves several associated proximal subproblems, which usually admit closed-form solutions, until sufficient progress on the objective value is achieved. In contrast to the usual randomized block coordinate descent method [23,20], our method has a nonmonotone flavor and uses variable stepsizes that can partially exploit the local curvature information of the smooth component of the objective function. We show that any accumulation point of the solution sequence is almost surely a stationary point of the problem, and that the method can find an approximate stationary point with high probability. We also establish a sublinear rate of convergence for the method in terms of the minimal expected squared norm of certain proximal gradients over the iterations. When the problem under consideration is convex, we show that the expected objective values generated by RNBPG converge to the optimal value of the problem. Under some assumptions, we further establish sublinear and linear rates of convergence on the expected objective values generated by a monotone version of RNBPG. Finally, we conduct preliminary experiments to test the performance of RNBPG on the ℓ_1-regularized least-squares problem and a dual SVM problem in machine learning. The computational results demonstrate that our method substantially outperforms the randomized block coordinate descent method with fixed or variable stepsizes.
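To make the iteration concrete, below is a minimal Python sketch of how such a scheme can look on the ℓ_1-regularized least-squares problem mentioned in the abstract. It is not the authors' exact algorithm: the uniform block sampling, the doubling backtracking on a per-block stepsize estimate, the acceptance constant, and the helper names (rnbpg_l1_ls, soft_threshold) are simplifying assumptions made for illustration. Only the overall pattern follows the abstract: a random block choice, closed-form proximal steps on that block, and a nonmonotone acceptance test against the largest of the last few objective values.

```python
import numpy as np

def soft_threshold(v, t):
    """Proximal operator of t*||.||_1 (closed-form for the l1 block term)."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def rnbpg_l1_ls(A, b, lam, n_blocks=10, max_iter=1000, memory=5, seed=0):
    """Illustrative sketch (not the paper's exact method) of a randomized
    nonmonotone block proximal gradient loop for
        min_x 0.5*||Ax - b||^2 + lam*||x||_1.
    Assumptions: uniform block sampling, doubling backtracking on a per-block
    Lipschitz estimate, acceptance against the max of the last `memory`
    objective values."""
    rng = np.random.default_rng(seed)
    n = A.shape[1]
    x = np.zeros(n)
    blocks = np.array_split(np.arange(n), n_blocks)

    def f(z):
        r = A @ z - b
        return 0.5 * (r @ r) + lam * np.abs(z).sum()

    hist = [f(x)]                 # recent objective values for the nonmonotone test
    L_est = np.ones(n_blocks)     # per-block inverse-stepsize estimates
    for _ in range(max_iter):
        i = rng.integers(n_blocks)            # random block (any distribution works)
        idx = blocks[i]
        g = A[:, idx].T @ (A @ x - b)         # partial gradient of the smooth part
        f_ref = max(hist[-memory:])           # nonmonotone reference value
        while True:
            x_new = x.copy()
            # proximal step on the block: closed-form soft-thresholding
            x_new[idx] = soft_threshold(x[idx] - g / L_est[i], lam / L_est[i])
            d = x_new[idx] - x[idx]
            f_new = f(x_new)
            # accept once sufficient decrease below the nonmonotone reference holds
            if f_new <= f_ref - 1e-4 * L_est[i] * (d @ d):
                break
            L_est[i] *= 2.0                   # otherwise shrink the stepsize and retry
        x = x_new
        hist.append(f_new)
        L_est[i] = max(L_est[i] / 2.0, 1e-4)  # let the stepsize grow again later
    return x
```

In practice one would stop on the norm of the proximal gradient step rather than a fixed iteration count; the doubling/halving of L_est is what lets the stepsize adapt to local curvature, in the spirit of the variable stepsizes described above.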


Related research

05/21/2013  On the Complexity Analysis of Randomized Block-Coordinate Descent Methods
In this paper we analyze the randomized block-coordinate descent (RBCD) ...

01/14/2022  Convergence of an Asynchronous Block-Coordinate Forward-Backward Algorithm for Convex Composite Optimization
In this paper, we study the convergence properties of a randomized block...

09/03/2019  Efficiency of Coordinate Descent Methods For Structured Nonconvex Optimization
Novel coordinate descent (CD) methods are proposed for minimizing noncon...

06/04/2023  Complexity of Block Coordinate Descent with Proximal Regularization and Applications to Wasserstein CP-dictionary Learning
We consider the block coordinate descent methods of Gauss-Seidel type wi...

01/06/2023  A Levenberg-Marquardt Method for Nonsmooth Regularized Least Squares
We develop a Levenberg-Marquardt method for minimizing the sum of a smoo...

11/18/2017  Proximal Gradient Method with Extrapolation and Line Search for a Class of Nonconvex and Nonsmooth Problems
In this paper, we consider a class of possibly nonconvex, nonsmooth and ...

08/13/2016  Hybrid Jacobian and Gauss-Seidel proximal block coordinate update methods for linearly constrained convex programming
Recent years have witnessed the rapid development of block coordinate up...
