Accelerated Extra-Gradient Descent: A Novel Accelerated First-Order Method

06/14/2017
by   Jelena Diakonikolas, et al.

We provide a novel accelerated first-order method that achieves the asymptotically optimal convergence rate for smooth functions in the first-order oracle model. To date, Nesterov's Accelerated Gradient Descent (AGD) and variations thereof have been the only methods achieving acceleration in this standard blackbox model. In contrast, our algorithm differs significantly from AGD, as it relies on a predictor-corrector approach similar to that used by Mirror-Prox and Extra-Gradient Descent in the solution of convex-concave saddle-point problems. For this reason, we dub our algorithm Accelerated Extra-Gradient Descent (AXGD). Its construction is motivated by the discretization of an accelerated continuous-time dynamics using the classical implicit Euler method. Our analysis explicitly shows the effects of discretization through a conceptually novel primal-dual viewpoint. Finally, we present experiments showing that our algorithm matches the performance of Nesterov's method, while appearing more robust to noise in some cases.
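The predictor-corrector idea that AXGD builds on can be illustrated with the classical extra-gradient iteration: a predictor step probes the gradient ahead of the current point, and the corrector step then moves using the gradient evaluated at that predicted point. The sketch below shows this generic extra-gradient step for smooth unconstrained minimization — it is not the AXGD update from the paper, and the step size, iteration count, and quadratic test function are illustrative choices:

```python
import numpy as np

def extra_gradient(grad, x0, step=0.1, iters=200):
    """Generic extra-gradient (predictor-corrector) iteration.

    At each step, a predictor point is computed with the current
    gradient, and the actual update re-evaluates the gradient at
    that predicted point.
    """
    x = np.asarray(x0, dtype=float)
    for _ in range(iters):
        x_pred = x - step * grad(x)       # predictor: the "extra" gradient step
        x = x - step * grad(x_pred)       # corrector: move using grad at x_pred
    return x

# Toy example: minimize f(x) = 0.5 * ||x||^2, whose gradient is x.
x_star = extra_gradient(lambda x: x, x0=[4.0, -2.0])
```

For this quadratic the iterates contract geometrically toward the minimizer at the origin; the point of the sketch is only the two-evaluation structure of each step, which is what distinguishes extra-gradient-type methods from plain gradient descent.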


