On the global convergence of randomized coordinate gradient descent for non-convex optimization

01/05/2021
by Ziang Chen, et al.

In this work, we analyze the global convergence properties of coordinate gradient descent with random choices of coordinates and stepsizes for non-convex optimization problems. Under generic assumptions, we prove that the algorithm iterates almost surely escape strict saddle points of the objective function. As a result, the algorithm is guaranteed to converge to local minima if all saddle points are strict. Our proof is based on viewing the coordinate descent algorithm as a nonlinear random dynamical system and on a quantitative finite-block analysis of its linearization around saddle points.
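As an illustration of the algorithm analyzed here, below is a minimal Python sketch of coordinate gradient descent with a randomly chosen coordinate and stepsize at each iteration, applied to a toy objective with a strict saddle point at the origin. The objective, stepsize range, and iteration count are illustrative choices and are not taken from the paper.

```python
import numpy as np

def f(x):
    # Toy non-convex objective: strict saddle at (0, 0), minima at (0, +/-1).
    return x[0] ** 2 + 0.25 * (x[1] ** 2 - 1.0) ** 2

def grad_f(x):
    # Gradient of f; the Hessian at the origin is diag(2, -1),
    # so the origin is a strict saddle point.
    return np.array([2.0 * x[0], x[1] * (x[1] ** 2 - 1.0)])

def randomized_coordinate_gd(x0, n_iters=2000, eta_range=(0.01, 0.1), seed=0):
    """Coordinate gradient descent with random coordinates and stepsizes:
    at each step, pick a coordinate i and a stepsize eta at random and
    update only that coordinate along its partial derivative."""
    rng = np.random.default_rng(seed)
    x = np.array(x0, dtype=float)
    for _ in range(n_iters):
        i = rng.integers(x.size)          # uniformly random coordinate
        eta = rng.uniform(*eta_range)     # random stepsize
        x[i] -= eta * grad_f(x)[i]        # single-coordinate gradient step
    return x

if __name__ == "__main__":
    # Starting near (but not exactly at) the saddle point, the iterates
    # typically escape and approach one of the minima (0, +1) or (0, -1).
    print(randomized_coordinate_gd([0.5, 1e-3]))
```

The random coordinate and stepsize choices at each step are the source of randomness that the random-dynamical-system viewpoint in the paper exploits; the sketch simply demonstrates the update rule, not the paper's analysis.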


