
Convergence guarantees for a class of non-convex and non-smooth optimization problems

by Koulik Khamaru, et al.

We consider the problem of finding critical points of functions that are non-convex and non-smooth. Studying a fairly broad class of such problems, we analyze the behavior of three gradient-based methods (gradient descent, proximal update, and Frank-Wolfe update). For each of these methods, we establish rates of convergence for general problems, and also prove faster rates for continuous sub-analytic functions. We also show that our algorithms can escape strict saddle points for a class of non-smooth functions, thereby generalizing known results for smooth functions. Our analysis leads to a simplification of the popular CCCP algorithm, used for optimizing functions that can be written as a difference of two convex functions. Our simplified algorithm retains all the convergence properties of CCCP, along with a significantly lower cost per iteration. We illustrate our methods and theory via applications to the problems of best subset selection, robust estimation, mixture density estimation, and shape-from-shading reconstruction.
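The simplification of CCCP mentioned above can be illustrated on a difference-of-convex objective phi(x) = g(x) - h(x), with g smooth convex and h convex: classical CCCP solves a full convex subproblem at each iteration, whereas the cheaper variant linearizes the concave part -h and takes a single gradient step. The concrete choices below (g(x) = 0.5*(x - 1)^2 and h(x) = 0.5*|x|) are illustrative assumptions, not functions taken from the paper.

```python
def grad_g(x, b=1.0):
    # gradient of the smooth convex part g(x) = 0.5 * (x - b)^2
    return x - b

def subgrad_h(x, lam=0.5):
    # a subgradient of the convex part h(x) = lam * |x|
    return lam * (1.0 if x >= 0 else -1.0)

def simplified_cccp(x0, step=0.5, iters=100):
    """One-gradient-step-per-iteration sketch of a CCCP-style method
    for phi(x) = g(x) - h(x): linearize -h at the current point and
    take a single gradient step instead of solving a full subproblem."""
    x = x0
    for _ in range(iters):
        x = x - step * (grad_g(x) - subgrad_h(x))
    return x

# For these example functions the stationarity condition
# (x - 1) - 0.5 * sign(x) = 0 gives x = 1.5 on the branch x > 0.
x_star = simplified_cccp(x0=2.0)
print(round(x_star, 4))  # → 1.5
```

Each iteration here costs one gradient and one subgradient evaluation, which is the source of the lower per-iteration cost compared with solving a convex program at every CCCP step.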
