Neon2: Finding Local Minima via First-Order Oracles

11/17/2017
by Zeyuan Allen-Zhu, et al.

We propose a reduction for non-convex optimization that can (1) turn a stationary-point finding algorithm into a local-minimum finding one, and (2) replace Hessian-vector product computations with gradient computations only. It works in both the stochastic and the deterministic settings, without hurting the algorithm's performance. As applications, our reduction turns Natasha2 into a first-order method without hurting its performance, and it converts SGD, GD, SCSG, and SVRG into local-minimum finding algorithms that outperform some of the best known results.
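
The abstract leaves the mechanism implicit, but the key fact that lets a first-order oracle stand in for Hessian-vector products is a finite-difference identity: for a twice-smooth f, the product ∇²f(x)·v is well approximated by (∇f(x + q·v) − ∇f(x)) / q for a small step q. The sketch below illustrates that trick only; it is not the paper's actual Neon2 procedure. It runs a shifted power iteration built from gradient differences to look for a direction of negative curvature. The function names, the smoothness bound L, and all parameter values are assumptions chosen for the example.

```python
import numpy as np

def hvp_via_gradients(grad, x, v, q=1e-5):
    # Finite-difference Hessian-vector product:
    #   H(x) @ v  ~=  (grad(x + q*v) - grad(x)) / q
    # Two gradient calls replace one Hessian-vector oracle call.
    return (grad(x + q * v) - grad(x)) / q

def negative_curvature_direction(grad, x, L, eps, iters=200, q=1e-5, seed=0):
    """Search for a unit vector v with v^T H(x) v <= -eps using only gradients.

    Power iteration on the shifted matrix L*I - H(x): when L upper-bounds the
    largest eigenvalue of H(x), the leading eigenvector of the shifted matrix
    corresponds to the most negative eigenvalue of H(x).
    Returns v on success, None if no sufficiently negative curvature is found.
    """
    rng = np.random.default_rng(seed)
    v = rng.standard_normal(x.shape)
    v /= np.linalg.norm(v)
    for _ in range(iters):
        w = L * v - hvp_via_gradients(grad, x, v, q)
        norm_w = np.linalg.norm(w)
        if norm_w == 0.0:
            return None
        v = w / norm_w
    curvature = v @ hvp_via_gradients(grad, x, v, q)  # ~ v^T H(x) v
    return v if curvature <= -eps else None

# Toy usage: a saddle point of f(x) = 0.5 * x^T diag(1, -0.5) x at x = 0.
H = np.diag([1.0, -0.5])
grad_f = lambda x: H @ x
v = negative_curvature_direction(grad_f, x=np.zeros(2), L=1.0, eps=0.1)
# v aligns (up to sign) with the eigenvector of the -0.5 eigenvalue, i.e. (0, 1).
```

Roughly, reductions of this kind invoke such a negative-curvature search whenever the base algorithm reaches an approximate stationary point: if a direction is found, the iterate moves along it to escape the saddle; if none is found, the point is reported as an approximate local minimum.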

Related research

08/29/2017
Natasha 2: Faster Non-Convex Optimization Than SGD
We design a stochastic algorithm to train any smooth neural network to ε...

10/04/2022
Zeroth-Order Negative Curvature Finding: Escaping Saddle Points without Gradients
We consider escaping saddle points of nonconvex problems where only the ...

12/18/2017
Third-order Smoothness Helps: Even Faster Stochastic Optimization Algorithms for Finding Local Minima
We propose stochastic optimization algorithms that can find local minima...

12/19/2018
Breaking Reversibility Accelerates Langevin Dynamics for Global Non-Convex Optimization
Langevin dynamics (LD) has been proven to be a powerful technique for op...

03/04/2019
A Stochastic Trust Region Method for Non-convex Minimization
We target the problem of finding a local minimum in non-convex finite-su...

12/11/2017
Saving Gradient and Negative Curvature Computations: Finding Local Minima More Efficiently
We propose a family of nonconvex optimization algorithms that are able t...

10/30/2019
Linear Speedup in Saddle-Point Escape for Decentralized Non-Convex Optimization
Under appropriate cooperation protocols and parameter choices, fully dec...