Potential-Function Proofs for First-Order Methods

12/13/2017
by Nikhil Bansal et al.

This note discusses proofs for convergence of first-order methods based on simple potential-function arguments. We cover methods like gradient descent (for both smooth and non-smooth settings), mirror descent, and some accelerated variants.
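As an illustration of the style of argument (a minimal numerical sketch, not code from the note itself), the snippet below checks the standard potential-function proof for gradient descent on an L-smooth convex function: with step size 1/L, the potential Phi_k = k*(f(x_k) - f*) + (L/2)*||x_k - x*||^2 is non-increasing, which immediately yields the rate f(x_k) - f* <= L*||x_0 - x*||^2 / (2k). The quadratic objective, dimension, and random seed are arbitrary choices made only for this demo.

```python
# Sketch: verify numerically that the gradient-descent potential
#   Phi_k = k*(f(x_k) - f*) + (L/2)*||x_k - x*||^2
# is non-increasing for an L-smooth convex f with step size 1/L.
# (Illustrative assumptions: f is a random convex quadratic, minimizer x* = 0.)
import numpy as np

rng = np.random.default_rng(0)

# Convex quadratic f(x) = 0.5 * x^T A x with PSD Hessian A; minimizer x* = 0.
M = rng.standard_normal((5, 5))
A = M.T @ M
L = np.linalg.eigvalsh(A).max()   # smoothness constant = largest eigenvalue of A

def f(x):
    return 0.5 * x @ A @ x

def grad(x):
    return A @ x

x_star = np.zeros(5)
f_star = 0.0

x = rng.standard_normal(5)                          # x_0
phi_0 = 0.5 * L * np.linalg.norm(x - x_star) ** 2   # Phi_0 (potential at k = 0)
phi_prev = phi_0

for k in range(1, 51):
    x = x - grad(x) / L                             # gradient step, step size 1/L
    phi = k * (f(x) - f_star) + 0.5 * L * np.linalg.norm(x - x_star) ** 2
    assert phi <= phi_prev + 1e-9, "potential increased"
    phi_prev = phi

# A non-increasing potential implies f(x_k) - f* <= Phi_0 / k.
print("f(x_50) - f* =", f(x) - f_star)
print("within the potential bound Phi_0 / 50:", f(x) - f_star <= phi_0 / 50)
```

The same template (pick a potential, show it decreases by at least the per-step progress) is what the note applies to the non-smooth, mirror-descent, and accelerated settings.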


Related research

- 06/20/2022: A Note on the Convergence of Mirrored Stein Variational Gradient Descent under (L_0,L_1)-Smoothness Condition
  In this note, we establish a descent lemma for the population limit Mirr...
- 08/02/2022: A Note on Zeroth-Order Optimization on the Simplex
  We construct a zeroth-order gradient estimator for a smooth function def...
- 07/06/2014: Linear Coupling: An Ultimate Unification of Gradient and Mirror Descent
  First-order methods play a central role in large-scale machine learning....
- 02/05/2020: Combinatorial proofs of two theorems of Lutz and Stull
  The purpose of this note is to give combinatorial-geometric proofs for t...
- 02/03/2020: Complexity Guarantees for Polyak Steps with Momentum
  In smooth strongly convex optimization, or in the presence of Hölderian ...
- 01/27/2022: Restarted Nonconvex Accelerated Gradient Descent: No More Polylogarithmic Factor in the O(ε^-7/4) Complexity
  This paper studies the accelerated gradient descent for general nonconve...
- 08/12/2020: Graph Drawing via Gradient Descent, (GD)^2
  Readability criteria, such as distance or neighborhood preservation, are...
