On Faster Convergence of Cyclic Block Coordinate Descent-type Methods for Strongly Convex Minimization

07/10/2016
by Xingguo Li, et al.

The cyclic block coordinate descent-type (CBCD-type) methods, which perform iterative updates for a few coordinates (a block) at a time throughout the procedure, have shown remarkable computational performance for solving strongly convex minimization problems. Typical applications include many popular statistical machine learning methods such as elastic-net regression, ridge penalized logistic regression, and sparse additive regression. Existing optimization literature has shown that for strongly convex minimization, the CBCD-type methods attain an iteration complexity of O(p·log(1/ϵ)), where ϵ is a pre-specified accuracy of the objective value and p is the number of blocks. However, this iteration complexity explicitly depends on p, and is therefore at least p times worse than the O(log(1/ϵ)) complexity of gradient descent (GD) methods. To bridge this theoretical gap, we propose an improved convergence analysis for the CBCD-type methods. In particular, we first show that for a family of quadratic minimization problems, the iteration complexity O(log^2(p)·log(1/ϵ)) of the CBCD-type methods matches that of the GD methods in terms of dependency on p, up to a log^2(p) factor. Thus our complexity bounds are sharper than the existing bounds by at least a factor of p/log^2(p). We also provide a lower bound to confirm that our improved complexity bounds are tight (up to a log^2(p) factor), under the assumption that the largest and smallest eigenvalues of the Hessian matrix do not scale with p. Finally, we generalize our analysis to other strongly convex minimization problems beyond quadratic ones.
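
To make the update pattern analyzed above concrete, the snippet below is a minimal, self-contained sketch of cyclic block coordinate descent with exact per-block minimization applied to ridge regression, one of the quadratic problems the abstract mentions. It is not the authors' implementation; the function name, block partition, and fixed number of sweeps are illustrative assumptions.

```python
import numpy as np

def cyclic_bcd_ridge(A, b, lam, block_size=10, n_sweeps=50):
    """Cyclic block coordinate descent for ridge regression:
        min_x 0.5 * ||A x - b||^2 + 0.5 * lam * ||x||^2
    Each sweep visits the coordinate blocks in a fixed cyclic order and
    minimizes the objective exactly over one block while the others stay fixed.
    """
    n, d = A.shape
    x = np.zeros(d)
    r = b - A @ x                      # running residual b - A x
    blocks = [np.arange(s, min(s + block_size, d))
              for s in range(0, d, block_size)]
    for _ in range(n_sweeps):
        for idx in blocks:
            Aj = A[:, idx]
            # Exact block subproblem: (Aj^T Aj + lam I) x_j = Aj^T (r + Aj x_j_old)
            G = Aj.T @ Aj + lam * np.eye(len(idx))
            rhs = Aj.T @ (r + Aj @ x[idx])
            x_new = np.linalg.solve(G, rhs)
            r -= Aj @ (x_new - x[idx])  # keep the residual consistent with x
            x[idx] = x_new
    return x

if __name__ == "__main__":
    # Tiny synthetic check: the objective should decrease across sweeps.
    rng = np.random.default_rng(0)
    A = rng.standard_normal((200, 60))
    b = rng.standard_normal(200)
    x_hat = cyclic_bcd_ridge(A, b, lam=0.1, block_size=10, n_sweeps=30)
    obj = 0.5 * np.sum((A @ x_hat - b) ** 2) + 0.05 * np.sum(x_hat ** 2)
    print("final objective:", obj)
```

Exact per-block minimization is used here because each block subproblem of a quadratic objective has a closed-form solution; CBCD-type analyses also cover inexact variants, such as taking a single block gradient step per visit.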


Related research

07/01/2016 · Randomized block proximal damped Newton method for composite self-concordant minimization
In this paper we consider the composite self-concordant (CSC) minimizati...

07/14/2011 · Iteration Complexity of Randomized Block-Coordinate Descent Methods for Minimizing a Composite Function
In this paper we develop a randomized block-coordinate descent method fo...

02/19/2020 · A Unified Convergence Analysis for Shuffling-Type Gradient Methods
In this paper, we provide a unified convergence analysis for a class of ...

06/21/2023 · Empirical Risk Minimization with Shuffled SGD: A Primal-Dual Perspective and Improved Bounds
Stochastic gradient descent (SGD) is perhaps the most prevalent optimiza...

10/21/2019 · Relative Interior Rule in Block-Coordinate Minimization
(Block-)coordinate minimization is an iterative optimization method whic...

07/31/2023 · Line Search for Convex Minimization
Golden-section search and bisection search are the two main principled a...

10/08/2013 · Distributed Coordinate Descent Method for Learning with Big Data
In this paper we develop and analyze Hydra: HYbriD cooRdinAte descent me...
