
From Proximal Point Method to Nesterov's Acceleration

05/17/2020
by Kwangjun Ahn, et al.

The proximal point method (PPM) is a fundamental method in optimization that is often used as a building block for designing fast optimization algorithms. In this work, building on recent work by Defazio (2019), we provide a complete understanding of Nesterov's accelerated gradient method (AGM) by establishing quantitative and analytical connections between PPM and AGM. The main observation of this paper is that AGM is in fact a simple approximation of PPM, which yields an elementary derivation of AGM's seemingly mysterious updates as well as its step sizes. This connection also leads to a conceptually simple analysis of AGM based on the standard analysis of PPM. The view naturally extends to the strongly convex case and also motivates other accelerated methods for practically relevant settings.
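To make the two methods concrete, below is a minimal, self-contained sketch (not taken from the paper) that runs both on a toy convex quadratic. The test problem, step sizes, and function names are illustrative assumptions; PPM applies the standard proximal update, and AGM uses the standard Nesterov momentum schedule.

```python
import numpy as np

# Illustrative toy problem (an assumption, not from the paper):
# f(x) = 0.5 * x^T A x - b^T x, a smooth strongly convex quadratic.
A = np.diag([1.0, 10.0])
b = np.array([1.0, 1.0])

def grad(x):
    return A @ x - b

def prox_step(x, lam):
    # PPM update: x_{k+1} = argmin_z f(z) + (1/(2*lam)) * ||z - x||^2.
    # For a quadratic this has the closed form (A + I/lam) z = b + x/lam.
    return np.linalg.solve(A + np.eye(len(x)) / lam, b + x / lam)

def ppm(x0, lam=1.0, iters=50):
    x = x0.copy()
    for _ in range(iters):
        x = prox_step(x, lam)
    return x

def agm(x0, L=10.0, iters=50):
    # Standard Nesterov AGM with the t_k momentum schedule;
    # L = 10 is the largest eigenvalue of A (the smoothness constant).
    x, x_prev, t = x0.copy(), x0.copy(), 1.0
    for _ in range(iters):
        t_next = (1.0 + np.sqrt(1.0 + 4.0 * t * t)) / 2.0
        y = x + ((t - 1.0) / t_next) * (x - x_prev)   # momentum step
        x_prev, x = x, y - grad(y) / L                # gradient step
        t = t_next
    return x

x0 = np.zeros(2)
x_star = np.linalg.solve(A, b)  # exact minimizer for reference
print("PPM error:", np.linalg.norm(ppm(x0) - x_star))
print("AGM error:", np.linalg.norm(agm(x0) - x_star))
```

The paper's point is that the AGM iteration above can be derived as an approximation of the PPM iteration, so the two loops are conceptually one method at different levels of approximation.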

