Regularized asymptotic descents for nonconvex optimization

04/05/2020
by   Xiaopeng Luo, et al.

In this paper we propose regularized asymptotic descent (RAD) methods for solving nonconvex optimization problems. Our approach first applies a regularized iteration and then uses an explicit asymptotic formula to approximate the solution of each regularized minimization. We consider a class of possibly nonconvex, nonsmooth, or even discontinuous objectives extended from strongly convex functions with Lipschitz-continuous gradients, each of which has a unique global minimum and is continuously differentiable at the global minimizer. Our main theoretical result shows that the RAD method enjoys global linear convergence with high probability for this class of nonconvex objectives; that is, the method will not be trapped at saddle points, local minima, or even discontinuities. Moreover, the method is derivative-free and its per-iteration cost, i.e., the number of function evaluations, is bounded, so it has a complexity bound of O(log(1/ϵ)) for finding a point whose optimality gap is less than ϵ>0.
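The abstract does not give the paper's explicit asymptotic formula, but the overall scheme it describes can be illustrated with a hedged sketch: an outer regularized iteration whose step x_{k+1} ≈ argmin_x f(x) + (1/(2δ))‖x − x_k‖² is approximated derivative-free by a sampling-based soft-min average. The soft-min weighting, the inverse temperature `beta`, and all parameter values below are illustrative assumptions, not the authors' method.

```python
import numpy as np

def rad_sketch(f, x0, delta=0.5, beta=20.0, n_samples=200, n_iters=50, seed=0):
    """Illustrative sketch of a derivative-free regularized descent.

    NOTE: This is NOT the RAD formula from the paper (the abstract does not
    state it); it only mimics the described structure: each step approximates
    the regularized minimizer argmin_x f(x) + ||x - x_k||^2 / (2*delta)
    using function evaluations only, with a bounded number per iteration.
    """
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    for _ in range(n_iters):
        # Gaussian candidates centered at the current iterate; the variance
        # plays the role of the regularization parameter delta.
        xi = x + np.sqrt(delta) * rng.standard_normal((n_samples, x.size))
        fx = np.array([f(z) for z in xi])
        # Soft-min weighted average approximates the regularized minimizer;
        # beta is a tuning constant (an assumption, not from the paper).
        w = np.exp(-beta * (fx - fx.min()))
        x = (w[:, None] * xi).sum(axis=0) / w.sum()
    return x

# Usage: a nonsmooth, nonconvex test objective with a unique global minimum
# near z = 1; no gradients are ever evaluated.
f = lambda z: np.abs(z[0] - 1.0) ** 1.5 + 0.1 * np.sin(10 * z[0]) ** 2
x_star = rad_sketch(f, x0=[5.0])
```

Because each iteration uses a fixed number of function evaluations, the total evaluation count grows linearly with the iteration count, which is consistent with the O(log(1/ϵ)) complexity claim when the iterates converge linearly.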


Related research

- 04/05/2020: Regularized asymptotic descents for a class of nonconvex optimization problems ("We propose and analyze regularized asymptotic descent (RAD) methods for ...")
- 06/15/2020: Derivative-free global minimization for a class of multiple minima problems ("We prove that the finite-difference based derivative-free descent (FD-DF...")
- 08/27/2017: A Conservation Law Method in Optimization ("We propose some algorithms to find local minima in nonconvex optimizatio...")
- 05/25/2022: Complexity-Optimal and Curvature-Free First-Order Methods for Finding Stationary Points of Composite Optimization Problems ("This paper develops and analyzes an accelerated proximal descent method ...")
- 10/27/2014: A Greedy Homotopy Method for Regression with Nonconvex Constraints ("Constrained least squares regression is an essential tool for high-dimen...")
- 10/22/2017: Iteratively reweighted ℓ_1 algorithms with extrapolation ("Iteratively reweighted ℓ_1 algorithm is a popular algorithm for solving ...")
- 10/29/2020: Sparse Signal Reconstruction for Nonlinear Models via Piecewise Rational Optimization ("We propose a method to reconstruct sparse signals degraded by a nonlinea...")
