A Short Note of PAGE: Optimal Convergence Rates for Nonconvex Optimization

06/17/2021
by Zhize Li et al.

In this note, we first recall the nonconvex problem setting and introduce the optimal PAGE algorithm (Li et al., ICML'21). We then provide a simple and clean convergence analysis showing that PAGE achieves optimal convergence rates. Moreover, PAGE and its analysis can be easily adapted and generalized to other works. We hope that this note provides useful insights and is helpful for future work.
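
For concreteness, below is a minimal sketch of the PAGE update described in Li et al. (ICML'21): with probability p the gradient estimator is recomputed from a fresh minibatch, and otherwise the previous estimator is reused and corrected with a cheap small-batch gradient difference. The `grad_batch` interface, the batch sizes, the step size, and the default switch probability are illustrative assumptions for this sketch, not prescriptions taken from the note itself.

```python
import numpy as np

def page(grad_batch, n, x0, eta=0.1, b=64, b_small=8, p=None, T=1000, seed=0):
    """Minimal PAGE sketch for minimizing f(x) = (1/n) * sum_i f_i(x).

    `grad_batch(x, idx)` is assumed to return the average gradient of the
    component functions f_i over the index array `idx` (illustrative API).
    """
    rng = np.random.default_rng(seed)
    p = b_small / (b + b_small) if p is None else p  # switch probability (one common choice)
    x = np.asarray(x0, dtype=float).copy()
    g = grad_batch(x, np.arange(n))                  # start from the full-batch gradient
    for _ in range(T):
        x_new = x - eta * g                          # plain gradient step with current estimator
        if rng.random() < p:
            idx = rng.integers(0, n, size=b)
            g = grad_batch(x_new, idx)               # fresh minibatch gradient, prob. p
        else:
            idx = rng.integers(0, n, size=b_small)
            # reuse the previous estimator plus a cheap gradient difference, prob. 1 - p
            g = g + grad_batch(x_new, idx) - grad_batch(x, idx)
        x = x_new
    return x
```

The estimator reduces to plain minibatch SGD when p = 1; the recursive branch taken with probability 1 - p only needs a small batch per step, which is what makes PAGE cheap while still attaining the optimal nonconvex rates analyzed in the note.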

