Exact asymptotic characterisation of running time for approximate gradient descent on random graphs

05/10/2021
by Matthieu Jonckheere, et al.

In this work we study the time complexity of finding local minima in random graphs whose vertices carry i.i.d. cost values. We show that, for Erdős-Rényi graphs with connection probability λ/n^α (with λ > 0 and 0 < α < 1), a family of local algorithms that approximate gradient descent finds local minima faster than full gradient descent. Furthermore, we derive a probabilistic representation of the running time of these algorithms, which leads to asymptotic estimates of the mean running times.
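To make the setting concrete, here is a minimal Python sketch, not the paper's actual construction: it samples an Erdős-Rényi graph with p = λ/n^α, draws i.i.d. uniform vertex costs, and contrasts full gradient descent (move to the cheapest neighbour) with a hypothetical approximate variant that inspects only k randomly chosen neighbours per step. The function names, the stopping rule of the approximate variant, and the parameter choices (n, λ, α, k) are all illustrative assumptions, not the family of algorithms analysed in the paper.

```python
import random

def erdos_renyi(n, lam, alpha, rng):
    """Sample G(n, p) with connection probability p = lam / n**alpha."""
    p = lam / n ** alpha
    adj = [[] for _ in range(n)]
    for i in range(n):
        for j in range(i + 1, n):
            if rng.random() < p:
                adj[i].append(j)
                adj[j].append(i)
    return adj

def full_descent(adj, cost, start):
    """Full gradient descent: inspect every neighbour and move to the
    cheapest one; stop when no neighbour has strictly smaller cost.
    Returns the local minimum found and the number of cost queries."""
    v, queries = start, 0
    while True:
        queries += len(adj[v])
        best = min(adj[v], key=lambda u: cost[u], default=v)
        if cost[best] >= cost[v]:
            return v, queries  # no improving neighbour: local minimum
        v = best

def approx_descent(adj, cost, start, k, rng):
    """Hypothetical approximate variant (an assumption, not the paper's
    algorithm): inspect at most k random neighbours per step; stop when
    none of the sampled neighbours improves on the current cost."""
    v, queries = start, 0
    while True:
        sample = rng.sample(adj[v], min(k, len(adj[v])))
        queries += len(sample)
        best = min(sample, key=lambda u: cost[u], default=v)
        if cost[best] >= cost[v]:
            return v, queries  # may stop early at a near-local minimum
        v = best

rng = random.Random(0)
n, lam, alpha, k = 1000, 1.0, 0.5, 3      # illustrative parameter choices
adj = erdos_renyi(n, lam, alpha, rng)
cost = [rng.random() for _ in range(n)]   # i.i.d. uniform vertex costs
print("full  :", full_descent(adj, cost, 0))
print("approx:", approx_descent(adj, cost, 0, k, rng))
```

In this toy setup the approximate variant makes far fewer cost queries per step at the price of sometimes stopping at a vertex that is not a true local minimum; the paper's result concerns the exact asymptotic trade-off for such local approximations.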


Related research:

09/14/2018 · Secondary gradient descent in higher codimension
In this paper, we analyze discrete gradient descent and ϵ-noisy gradient...

01/09/2019 · The Lingering of Gradients: How to Reuse Gradients over Time
Classically, the time complexity of a first-order method is estimated by...

11/03/2016 · Finding Approximate Local Minima Faster than Gradient Descent
We design a non-convex second-order optimization algorithm that is guara...

09/05/2021 · On the dependence between a Wiener process and its running maxima and running minima processes
We study a triple of stochastic processes: a Wiener process W_t, t ≥ 0, ...

12/05/2021 · A Novel Sequential Coreset Method for Gradient Descent Algorithms
A wide range of optimization problems arising in machine learning can be...

10/26/2018 · An Acceleration Scheme to The Local Directional Pattern
This study seeks to improve the running time of the Local Directional Pa...

08/03/2018 · Coordinate Methods for Accelerating ℓ_∞ Regression and Faster Approximate Maximum Flow
In this paper we provide faster algorithms for approximately solving ℓ_∞...
