Extending the step-size restriction for gradient descent to avoid strict saddle points

08/05/2019
by Hayden Schaeffer, et al.

We provide larger step-size restrictions under which gradient-descent-based algorithms almost surely avoid strict saddle points. In particular, consider a twice-differentiable (non-convex) objective function whose gradient has Lipschitz constant L and whose Hessian is well-behaved. We prove that, for a single uniformly random initialization, the probability that gradient descent with step size up to 2/L converges to a strict saddle point is zero. This extends previous results up to the sharp limit imposed by the convex case. In addition, the arguments hold when a learning-rate schedule is given, with either a continuously decaying rate or a piecewise-constant schedule.
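The phenomenon the abstract describes can be illustrated on a toy quadratic saddle. The sketch below is an assumption-laden illustration, not the paper's proof technique: it runs plain gradient descent on f(x, y) = (x² - y²)/2, whose gradient is 1-Lipschitz (L = 1) and whose origin is a strict saddle, with a step size of 1.9 — close to the 2/L = 2 limit — and shows that a random initialization escapes the saddle while the stable coordinate still contracts.

```python
import numpy as np

# Toy strict saddle: f(x, y) = (x^2 - y^2) / 2.
# grad f = (x, -y); the Hessian is diag(1, -1), so the gradient is
# 1-Lipschitz (L = 1) and the origin is a strict saddle point
# (the Hessian has a strictly negative eigenvalue).
def grad(p):
    x, y = p
    return np.array([x, -y])

rng = np.random.default_rng(0)
p = rng.uniform(-1.0, 1.0, size=2)   # one uniformly random initialization
eta = 1.9                            # step size near the 2/L = 2 limit

for _ in range(100):
    p = p - eta * grad(p)

# Per step, the stable coordinate contracts by |1 - eta| = 0.9 while the
# unstable one expands by |1 + eta| = 2.9, so almost every initialization
# is repelled from the saddle.
print(abs(p[0]) < 1e-3, abs(p[1]) > 1.0)  # → True True
```

On this quadratic the iteration is exactly linear, so the almost-sure escape is visible directly: only initializations with the unstable coordinate exactly zero (a measure-zero set) converge to the saddle.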


