A refined convergence analysis of pDCA_e with applications to simultaneous sparse recovery and outlier detection

04/19/2018
by   Tianxiang Liu, et al.

We consider the problem of minimizing a difference-of-convex (DC) function, which can be written as the sum of a smooth convex function with Lipschitz gradient, a proper closed convex function, and a continuous, possibly nonsmooth, concave function. We refine the convergence analysis in [38] for the proximal DC algorithm with extrapolation (pDCA_e) and show that the whole sequence generated by the algorithm is convergent when the objective is level-bounded, without imposing differentiability assumptions on the concave part. Our analysis is based on a new potential function, which we assume to be a Kurdyka-Łojasiewicz (KL) function, and we establish a relationship between this KL assumption and the one used in [38]. Finally, we demonstrate how the pDCA_e can be applied to a class of simultaneous sparse recovery and outlier detection problems arising from robust compressed sensing in signal processing and least trimmed squares regression in statistics. Specifically, we show that the objectives of these problems can be written as level-bounded DC functions whose concave parts are typically nonsmooth. Moreover, for a large class of loss functions and regularizers, the KL exponent of the corresponding potential function is shown to be 1/2, which implies that the pDCA_e is locally linearly convergent when applied to these problems. Our numerical experiments show that the pDCA_e usually outperforms the proximal DC algorithm with nonmonotone linesearch [24, Appendix A] in both CPU time and solution quality for this particular application.
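To make the algorithmic structure concrete, the following is a minimal sketch of a pDCA_e-style iteration on one hypothetical DC instance: g(x) = 0.5||Ax - b||^2 (smooth convex, Lipschitz gradient), h(x) = lam*||x||_1 (proper closed convex, prox-friendly), and concave part -lam*||x||_2 (nonsmooth at the origin, matching the "possibly nonsmooth concave part" above). The extrapolation schedule, stepsize, iteration count, and this particular problem instance are illustrative assumptions, not the authors' exact setup.

```python
import numpy as np

def pdca_e(A, b, lam, iters=500):
    """Sketch of a proximal DC algorithm with extrapolation (pDCA_e) for
    min_x 0.5*||Ax - b||^2 + lam*||x||_1 - lam*||x||_2  (illustrative instance)."""
    L = np.linalg.norm(A, 2) ** 2          # Lipschitz constant of grad g
    n = A.shape[1]
    x = x_prev = np.zeros(n)
    for k in range(iters):
        # extrapolation step; a capped FISTA-like schedule (assumed, not the paper's exact rule)
        beta = min(k / (k + 3.0), 0.98)
        y = x + beta * (x - x_prev)
        # a subgradient of the convex function psi(x) = lam*||x||_2, taken at x^k
        nrm = np.linalg.norm(x)
        xi = lam * x / nrm if nrm > 0 else np.zeros(n)
        # forward step on g - psi, then prox of (1/L)*h, i.e. soft-thresholding
        u = y - (A.T @ (A @ y - b) - xi) / L
        x_prev, x = x, np.sign(u) * np.maximum(np.abs(u) - lam / L, 0.0)
    return x
```

The key feature distinguishing pDCA_e from a plain proximal gradient step is the momentum point y combined with a subgradient of the concave part evaluated at the current iterate, which is what the refined analysis above tracks through its potential function.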

