Neon2: Finding Local Minima via First-Order Oracles

11/17/2017
by Zeyuan Allen-Zhu, et al.

We propose a reduction for non-convex optimization that can (1) turn a stationary-point-finding algorithm into a local-minimum-finding one, and (2) replace Hessian-vector product computations with gradient computations only. It works in both the stochastic and the deterministic settings, without hurting the algorithm's performance. As applications, our reduction turns Natasha2 into a first-order method without hurting its performance. It also converts SGD, GD, SCSG, and SVRG into local-minimum-finding algorithms that outperform some of the best known results.
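The reduction rests on the observation that a Hessian-vector product ∇²f(x)·v can be approximated by a finite difference of two gradients, (∇f(x + εv) − ∇f(x))/ε, so a negative-curvature (saddle-escaping) direction can be searched for using a first-order oracle alone. Below is a minimal NumPy sketch of that idea on a toy quadratic saddle; the helper names (hvp_via_gradients, neg_curvature_search), the shift constant, and the step sizes are illustrative assumptions and not the paper's algorithm or its guarantees.

```python
# Sketch: approximate Hessian-vector products with gradient differences and use
# power iteration to find a negative-curvature direction at a saddle point.
import numpy as np

def quadratic_grad(x, A, b):
    """Gradient of f(x) = 0.5 * x^T A x + b^T x (a saddle if A has a negative eigenvalue)."""
    return A @ x + b

def hvp_via_gradients(grad_fn, x, v, eps=1e-5):
    """Approximate H(x) v using only gradients: H(x) v ~ (grad(x + eps*v) - grad(x)) / eps."""
    return (grad_fn(x + eps * v) - grad_fn(x)) / eps

def neg_curvature_search(grad_fn, x, dim, iters=200, seed=0):
    """Power iteration on (shift*I - H) to estimate the most negative curvature
    direction around x, using gradient calls only."""
    rng = np.random.default_rng(seed)
    v = rng.standard_normal(dim)
    v /= np.linalg.norm(v)
    shift = 10.0  # assumed upper bound on the Hessian's eigenvalue magnitudes
    for _ in range(iters):
        hv = hvp_via_gradients(grad_fn, x, v)
        v = shift * v - hv          # apply (shift*I - H) to v
        v /= np.linalg.norm(v)
    curvature = v @ hvp_via_gradients(grad_fn, x, v)  # Rayleigh quotient v^T H v
    return v, curvature

if __name__ == "__main__":
    # f(x) = 0.5 x^T A x has a saddle at x = 0, with one negative eigenvalue (-0.5).
    A = np.diag([1.0, 2.0, -0.5])
    b = np.zeros(3)
    grad_fn = lambda x: quadratic_grad(x, A, b)
    x = np.zeros(3)  # stationary point (a saddle)
    v, curv = neg_curvature_search(grad_fn, x, dim=3)
    print("estimated curvature along v:", curv)  # roughly -0.5, so moving along v escapes the saddle
```

In this sketch the finite difference is exact because the objective is quadratic; for a general smooth objective the same two-gradient approximation incurs an O(ε) error, which is the kind of error a reduction like this must control.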



Related research:

- 08/29/2017, Natasha 2: Faster Non-Convex Optimization Than SGD. We design a stochastic algorithm to train any smooth neural network to ε...
- 10/04/2022, Zeroth-Order Negative Curvature Finding: Escaping Saddle Points without Gradients. We consider escaping saddle points of nonconvex problems where only the ...
- 12/18/2017, Third-order Smoothness Helps: Even Faster Stochastic Optimization Algorithms for Finding Local Minima. We propose stochastic optimization algorithms that can find local minima...
- 12/19/2018, Breaking Reversibility Accelerates Langevin Dynamics for Global Non-Convex Optimization. Langevin dynamics (LD) has been proven to be a powerful technique for op...
- 03/04/2019, A Stochastic Trust Region Method for Non-convex Minimization. We target the problem of finding a local minimum in non-convex finite-su...
- 12/11/2017, Saving Gradient and Negative Curvature Computations: Finding Local Minima More Efficiently. We propose a family of nonconvex optimization algorithms that are able t...
- 06/14/2023, Noise Stability Optimization for Flat Minima with Optimal Convergence Rates. We consider finding flat, local minimizers by adding average weight pert...
