
From Proximal Point Method to Nesterov's Acceleration

by Kwangjun Ahn, et al.

The proximal point method (PPM) is a fundamental method in optimization that is often used as a building block for fast optimization algorithms. In this work, building on recent work by Defazio (2019), we provide a complete understanding of Nesterov's accelerated gradient method (AGM) by establishing quantitative and analytical connections between PPM and AGM. The main observation of this paper is that AGM is in fact equal to a simple approximation of PPM, which yields an elementary derivation of the mysterious updates of AGM as well as its step sizes. This connection also leads to a conceptually simple analysis of AGM based on the standard analysis of PPM. This view extends naturally to the strongly convex case and also motivates other accelerated methods for practically relevant settings.
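To make the two methods in the abstract concrete, the following is a minimal sketch of the exact proximal point method and Nesterov's AGM side by side on a convex quadratic f(x) = ½ xᵀAx, where the prox step has a closed form. The quadratic, step size λ, and momentum schedule here are standard textbook choices for illustration, not the paper's derivation of AGM as an approximation of PPM.

```python
import numpy as np

# Illustrative convex quadratic f(x) = 0.5 * x^T A x with A positive definite.
rng = np.random.default_rng(0)
n = 20
M = rng.standard_normal((n, n))
A = M.T @ M / n + 0.1 * np.eye(n)      # symmetric positive definite
L = np.linalg.eigvalsh(A).max()        # smoothness constant (largest eigenvalue)
f = lambda x: 0.5 * x @ A @ x

x0 = rng.standard_normal(n)

# Proximal point method: x_{k+1} = argmin_y f(y) + ||y - x_k||^2 / (2*lam).
# For this quadratic the prox has the closed form x_{k+1} = (I + lam*A)^{-1} x_k.
lam = 1.0
x = x0.copy()
for _ in range(50):
    x = np.linalg.solve(np.eye(n) + lam * A, x)
ppm_val = f(x)

# Nesterov's AGM with the standard 1/L step size and momentum schedule.
x, y, t = x0.copy(), x0.copy(), 1.0
for _ in range(50):
    x_next = y - (1.0 / L) * (A @ y)   # gradient step from the extrapolated point
    t_next = (1.0 + np.sqrt(1.0 + 4.0 * t**2)) / 2.0
    y = x_next + ((t - 1.0) / t_next) * (x_next - x)
    x, t = x_next, t_next
agm_val = f(x)

print(ppm_val, agm_val)  # both far below f(x0); the minimizer is x* = 0
```

Each exact prox step requires solving a linear system, which is why PPM is usually treated as a conceptual building block; AGM achieves acceleration using only gradient evaluations.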



