Inexact Block Coordinate Descent Algorithms for Nonsmooth Nonconvex Optimization

05/10/2019
by Yang Yang, et al.

In this paper, we propose an inexact block coordinate descent algorithm for large-scale nonsmooth nonconvex optimization problems. At each iteration, a particular block variable is selected and updated by solving the original optimization problem with respect to that block variable inexactly: instead of the exact blockwise problem, a local approximation of it is solved. The proposed algorithm has several attractive features, namely, i) high flexibility, as the approximation function only needs to be strictly convex and does not have to be a global upper bound of the original function; ii) fast convergence, as the approximation function can be designed to exploit the problem structure at hand and the stepsize is determined by a line search; iii) low complexity, as the approximation subproblems are much easier to solve and the line search is carried out over a properly constructed differentiable function; iv) guaranteed convergence to a stationary point, even when the objective function does not have a Lipschitz continuous gradient. Interestingly, when the approximation subproblem is solved by a descent algorithm, convergence to a stationary point is still guaranteed even if the approximation subproblem is solved inexactly by terminating that descent algorithm after a finite number of iterations. These features make the proposed algorithm suitable for large-scale problems where the dimension exceeds the memory and/or processing capability of the existing hardware. They are also illustrated by several applications in signal processing and machine learning, for instance, network anomaly detection and phase retrieval.
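To illustrate the blockwise update pattern the abstract describes — select a block, then minimize a strictly convex local approximation of the objective over that block — the sketch below runs cyclic block coordinate descent on a LASSO-type problem, 0.5||Ax − b||² + λ||x||₁. The problem instance, block partition, and closed-form soft-thresholding subproblem solver are illustrative assumptions, not the paper's exact scheme; in particular, the per-coordinate quadratic approximation here is minimized in closed form, so no separate line-search step is needed in this toy setting.

```python
import numpy as np

def soft_threshold(v, t):
    """Proximal operator of t*|.| (closed-form l1 subproblem solution)."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def bcd_lasso(A, b, lam, n_blocks=4, n_iter=50):
    """Cyclic block coordinate descent for 0.5*||Ax - b||^2 + lam*||x||_1.

    Each block update solves a strictly convex quadratic approximation of
    the objective with respect to that block (coordinate-wise, in closed
    form via soft-thresholding). Illustrative sketch only.
    """
    m, n = A.shape
    x = np.zeros(n)
    blocks = np.array_split(np.arange(n), n_blocks)
    col_norms = np.sum(A**2, axis=0)   # per-coordinate curvature constants
    r = A @ x - b                      # residual, maintained incrementally
    for _ in range(n_iter):
        for blk in blocks:
            for j in blk:
                g = A[:, j] @ r        # gradient of smooth part w.r.t. x_j
                x_old = x[j]
                # exact minimizer of the 1-D strictly convex approximation
                x[j] = soft_threshold(x[j] - g / col_norms[j],
                                      lam / col_norms[j])
                r += A[:, j] * (x[j] - x_old)
    return x
```

Maintaining the residual `r` incrementally keeps each coordinate update at O(m) cost, which is what makes block schemes of this kind attractive when the full problem dimension is too large to handle at once.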
