A Short Note of PAGE: Optimal Convergence Rates for Nonconvex Optimization

by Zhize Li et al.

In this note, we first recall the nonconvex problem setting and introduce the optimal PAGE algorithm (Li et al., ICML'21). We then give a simple and clean convergence analysis showing that PAGE achieves the optimal convergence rates. Moreover, PAGE and its analysis can be easily adapted and generalized to other settings. We hope this note provides useful insights for future work.
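To make the algorithm concrete, here is a minimal sketch of the PAGE gradient estimator on a toy finite-sum problem: with probability p the estimator is refreshed with a full (or large-batch) gradient, and with probability 1-p it is updated cheaply using a small minibatch gradient difference. The problem instance, step size `eta`, probability `p`, and batch size `b` below are illustrative assumptions, not the paper's tuned choices.

```python
import numpy as np

# Toy finite-sum objective f(x) = (1/n) sum_i 0.5 * (a_i^T x - y_i)^2.
# (PAGE targets general nonconvex f; a least-squares instance is used here
# only to keep the sketch short and verifiable.)
rng = np.random.default_rng(0)
n, d = 100, 5
A = rng.standard_normal((n, d))
y = rng.standard_normal(n)

def grad_batch(x, idx):
    # Average gradient over the samples in idx.
    r = A[idx] @ x - y[idx]
    return A[idx].T @ r / len(idx)

def full_grad(x):
    return grad_batch(x, np.arange(n))

eta, p, b, T = 0.05, 0.1, 10, 500   # illustrative hyperparameters
x = np.zeros(d)
g = full_grad(x)                    # initialize estimator with a full gradient

for t in range(T):
    x_new = x - eta * g
    if rng.random() < p:
        # With probability p: recompute the full gradient.
        g = full_grad(x_new)
    else:
        # With probability 1-p: cheap minibatch gradient-difference update.
        idx = rng.choice(n, size=b, replace=False)
        g = g + grad_batch(x_new, idx) - grad_batch(x, idx)
    x = x_new

print(np.linalg.norm(full_grad(x)))  # gradient norm at the final iterate
```

The biased-looking recursive branch is what makes PAGE cheap: between refreshes, each step costs only two minibatch gradients, while the occasional full gradient keeps the estimator's error controlled.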



