Coordinate Descent Methods for DC Minimization

09/09/2021
by Ganzhao Yuan et al.

Difference-of-Convex (DC) minimization, the problem of minimizing the difference of two convex functions, has found rich applications in statistical learning and has been studied extensively for decades. However, existing methods are primarily based on multi-stage convex relaxation, which only guarantees the weak optimality of critical points. This paper proposes a coordinate descent method for minimizing DC functions based on sequential nonconvex approximation. Our approach iteratively solves a nonconvex one-dimensional subproblem globally, and it is guaranteed to converge to a coordinate-wise stationary point. We prove that this new optimality condition is always stronger than the critical point condition and the directional point condition when the objective function is weakly convex. For comparison, we also include in our study a naive variant of coordinate descent based on sequential convex approximation. When the objective function satisfies an additional regularity condition called sharpness, coordinate descent methods with an appropriate initialization converge linearly to the optimal solution set. Moreover, for many applications of interest, we show that the nonconvex one-dimensional subproblem can be solved exactly and efficiently using a breakpoint searching method. We present some discussions and extensions of our proposed method. Finally, we conduct extensive experiments on several statistical learning tasks to demonstrate the superiority of our approach.

Keywords: Coordinate Descent, DC Minimization, DC Programming, Difference-of-Convex Programs, Nonconvex Optimization, Sparse Optimization, Binary Optimization.
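To make the idea concrete, here is a minimal sketch (not the paper's algorithm) of coordinate descent on a DC objective where each one-dimensional subproblem is solved globally. We use the toy weakly convex function f(x) = 0.5·Σ(x_i − c_i)² − λ·Σ|x_i|, whose coordinate subproblem is a piecewise quadratic with a single breakpoint at zero, so the global minimizer can be found by comparing the candidate stationary points of each piece; the paper's breakpoint searching method targets far more general subproblems.

```python
def coord_descent_dc(c, lam, iters=5):
    """Minimize 0.5*sum((x_i - c_i)^2) - lam*sum(|x_i|) coordinate-wise.

    Each 1-D subproblem min_t 0.5*(t - c_i)^2 - lam*|t| is solved
    *globally*: the |t| term creates a breakpoint at t = 0, and the
    minimizer of each quadratic piece is c_i + lam (for t >= 0) or
    c_i - lam (for t <= 0), so it suffices to compare these candidates.
    """
    f1d = lambda t, ci: 0.5 * (t - ci) ** 2 - lam * abs(t)
    x = [0.0] * len(c)
    for _ in range(iters):
        for i in range(len(x)):
            # Global 1-D minimization over the two breakpoint-induced pieces.
            x[i] = min([c[i] + lam, c[i] - lam], key=lambda t: f1d(t, c[i]))
    return x
```

For this separable toy objective the update decouples across coordinates (x_i = c_i + λ if c_i > 0, else c_i − λ), so one sweep already reaches a coordinate-wise stationary point; the sketch only illustrates the exact-1-D-subproblem mechanism, not the convergence analysis.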


