SNAP: Finding Approximate Second-Order Stationary Solutions Efficiently for Non-convex Linearly Constrained Problems

07/09/2019
by Songtao Lu, et al.

This paper proposes low-complexity algorithms for finding approximate second-order stationary points (SOSPs) of problems with a smooth non-convex objective and linear constraints. While finding (approximate) SOSPs is computationally intractable in general, we first show that generic instances of the problem can be solved efficiently. More specifically, for a generic problem instance, a certain strict complementarity (SC) condition holds for all Karush-Kuhn-Tucker (KKT) solutions (with probability one). The SC condition is then used to establish an equivalence between two different notions of SOSPs, one of which is computationally easy to verify. Based on this particular notion of SOSP, we design an algorithm named Successive Negative-curvature grAdient Projection (SNAP), which successively performs either conventional gradient-projection steps or negative-curvature-based projection steps to find SOSPs. SNAP and its first-order extension SNAP^+ require O(1/ϵ^2.5) iterations to compute an (ϵ, √ϵ)-SOSP, and their per-iteration computational complexities are polynomial in the number of constraints and the problem dimension. To our knowledge, this is the first time that first-order algorithms with polynomial per-iteration complexity and a global sublinear rate have been designed to find SOSPs of the important class of non-convex problems with linear constraints.
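To make the gradient-projection / negative-curvature alternation concrete, below is a minimal sketch in Python. It is an assumption-laden simplification, not the paper's algorithm: it handles only linear equality constraints Ax = b, uses a full eigendecomposition of the projected Hessian rather than the first-order negative-curvature estimation used by SNAP^+, and the step size and stopping thresholds are illustrative choices.

```python
import numpy as np

def snap_like_sketch(grad, hess, A, b, x0, eps=1e-3, max_iter=1000, step=1e-2):
    """Illustrative SNAP-style loop for min f(x) s.t. A x = b (equality case only).

    Alternates projected-gradient steps with steps along a negative-curvature
    direction restricted to the null space of A. Hypothetical simplification;
    the paper also covers general linear (inequality) constraints.
    """
    # Projector onto the null space of A (the set of feasible directions).
    P = np.eye(A.shape[1]) - A.T @ np.linalg.solve(A @ A.T, A)
    # Projection onto the affine feasible set {x : A x = b}.
    proj_affine = lambda y: y - A.T @ np.linalg.solve(A @ A.T, A @ y - b)

    x = proj_affine(x0)
    for _ in range(max_iter):
        g = P @ grad(x)                      # gradient restricted to feasible directions
        if np.linalg.norm(g) > eps:
            x = proj_affine(x - step * g)    # conventional gradient-projection step
            continue
        # Projected gradient is small: check curvature of the Hessian on the feasible subspace.
        H = P @ hess(x) @ P
        H = 0.5 * (H + H.T)                  # symmetrize against round-off
        lam, V = np.linalg.eigh(H)
        if lam[0] >= -np.sqrt(eps):
            return x                         # approximate (eps, sqrt(eps))-SOSP
        d = V[:, 0]                          # direction of most negative curvature
        d = d if d @ grad(x) <= 0 else -d    # orient it as a descent direction
        x = proj_affine(x + step * d)        # negative-curvature projection step
    return x
```

In this sketch, the negative-curvature step is only taken once the projected gradient is already small, which mirrors the "successive" structure described in the abstract; all numerical parameters above are placeholders rather than the constants analyzed in the paper.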


Related research

12/04/2017 · NEON+: Accelerated Gradient Methods for Extracting Negative Curvature for Non-Convex Optimization
Accelerated gradient (AG) methods are breakthroughs in convex optimizati...

10/04/2022 · Zeroth-Order Negative Curvature Finding: Escaping Saddle Points without Gradients
We consider escaping saddle points of nonconvex problems where only the ...

09/25/2017 · On Noisy Negative Curvature Descent: Competing with Gradient Descent for Faster Non-convex Optimization
The Hessian-vector product has been utilized to find a second-order stat...

10/29/2019 · Efficiently avoiding saddle points with zero order methods: No gradients required
We consider the case of derivative-free algorithms for non-convex optimi...

10/25/2021 · On the Second-order Convergence Properties of Random Search Methods
We study the theoretical convergence properties of random-search methods...

09/17/2019 · Quantum algorithm for finding the negative curvature direction in non-convex optimization
We present an efficient quantum algorithm aiming to find the negative cu...

08/04/2021 · Stochastic Subgradient Descent Escapes Active Strict Saddles
In non-smooth stochastic optimization, we establish the non-convergence ...
