Linear Coupling: An Ultimate Unification of Gradient and Mirror Descent

07/06/2014
by Zeyuan Allen-Zhu, et al.

First-order methods play a central role in large-scale machine learning. Even though many variations exist, each suited to a particular problem, almost all such methods fundamentally rely on two types of algorithmic steps: gradient descent, which yields primal progress, and mirror descent, which yields dual progress. We observe that the performances of gradient and mirror descent are complementary, so that faster algorithms can be designed by linearly coupling the two. We show how to reconstruct Nesterov's accelerated gradient methods using linear coupling, which gives a cleaner interpretation than Nesterov's original proofs. We also illustrate the power of linear coupling by extending it to many other settings to which Nesterov's methods do not apply.
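To make the coupling idea concrete, below is a minimal Python sketch of one way a gradient step and a mirror step can be linearly coupled, restricted to the unconstrained Euclidean setting where the mirror step reduces to a simple dual-averaging-style update. The quadratic test problem, the function names, and the particular step-size schedule are illustrative assumptions, not the paper's general mirror-map formulation.

```python
import numpy as np

def linear_coupling(f_grad, x0, L, n_iters=100):
    """Illustrative sketch: linearly couple a gradient step (primal progress)
    with a Euclidean mirror step (dual progress). Parameter schedule is one
    standard accelerated choice, assumed here for illustration.

    f_grad  -- callable returning the gradient of a smooth convex f
    x0      -- starting point (numpy array)
    L       -- smoothness (gradient Lipschitz) constant of f
    """
    y = x0.copy()  # iterate updated by gradient-descent steps
    z = x0.copy()  # iterate updated by mirror-descent steps
    for k in range(n_iters):
        alpha = (k + 2) / (2.0 * L)    # mirror step size
        tau = 1.0 / (alpha * L)        # coupling weight, here tau = 2/(k+2)
        x = tau * z + (1.0 - tau) * y  # the linear coupling of the two iterates
        g = f_grad(x)
        y = x - g / L                  # gradient-descent step from x
        z = z - alpha * g              # mirror-descent step (Euclidean mirror map) from z
    return y

# Illustrative use on the assumed test problem f(x) = 0.5 * ||A x - b||^2.
rng = np.random.default_rng(0)
A = rng.standard_normal((50, 20))
b = rng.standard_normal(50)
L = np.linalg.norm(A, 2) ** 2          # spectral norm squared = smoothness of f
x_hat = linear_coupling(lambda x: A.T @ (A @ x - b), np.zeros(20), L, n_iters=300)
```

In this Euclidean special case the coupled iterates behave like an accelerated gradient method; swapping the last update for a Bregman-projection mirror step is what allows the non-Euclidean and more general settings discussed in the paper.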


Related research

Potential-Function Proofs for First-Order Methods (12/13/2017)
This note discusses proofs for convergence of first-order methods based ...

Power Gradient Descent (06/11/2019)
The development of machine learning is promoting the search for fast and...

Accelerated Extra-Gradient Descent: A Novel Accelerated First-Order Method (06/14/2017)
We provide a novel accelerated first-order method that achieves the asym...

The Instability of Accelerated Gradient Descent (02/03/2021)
We study the algorithmic stability of Nesterov's accelerated gradient me...

New Insights on Learning Rules for Hopfield Networks: Memory and Objective Function Minimisation (10/04/2020)
Hopfield neural networks are a possible basis for modelling associative ...

Interpolating Between Gradient Descent and Exponentiated Gradient Using Reparameterized Gradient Descent (02/24/2020)
Continuous-time mirror descent (CMD) can be seen as the limit case of th...