A Short Note of PAGE: Optimal Convergence Rates for Nonconvex Optimization

06/17/2021
by Zhize Li et al.

In this note, we first recall the nonconvex problem setting and introduce the optimal PAGE algorithm (Li et al., ICML'21). Then we provide a simple and clean convergence analysis showing that PAGE achieves optimal convergence rates. Moreover, PAGE and its analysis can be easily adopted and generalized in other works. We hope that this note provides useful insights and is helpful for future work.
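PAGE (ProbAbilistic Gradient Estimator) replaces the plain stochastic gradient with a recursive estimator: with probability p it recomputes a large-batch gradient, and with probability 1-p it cheaply corrects the previous estimator using small-minibatch gradients at the new and old iterates. The sketch below is a minimal illustration of this update rule, not the authors' implementation; the function name `page` and the `grad_batch(x, full=...)` oracle interface are assumptions for this example, and in a faithful implementation the two small-batch gradients in the correction step must be computed on the same sampled minibatch.

```python
import numpy as np

def page(grad_batch, x0, eta, p, T, rng=None):
    """Sketch of the PAGE update (after Li et al., ICML'21).

    grad_batch(x, full) -> a stochastic gradient at x; full=True means a
    large batch, full=False a small minibatch (interface is illustrative).
    eta: step size; p: probability of recomputing the large-batch gradient.
    """
    rng = np.random.default_rng() if rng is None else rng
    x = np.asarray(x0, dtype=float)
    g = grad_batch(x, full=True)  # initialize with a large-batch gradient
    for _ in range(T):
        x_new = x - eta * g  # plain gradient step with the current estimator
        if rng.random() < p:
            # With probability p: refresh with a large-batch gradient.
            g = grad_batch(x_new, full=True)
        else:
            # With probability 1-p: cheap recursive correction.
            # (Both small-batch gradients should share one minibatch.)
            g = g + grad_batch(x_new, full=False) - grad_batch(x, full=False)
        x = x_new
    return x
```

On a deterministic quadratic, e.g. `grad_batch = lambda x, full: x`, the iterates contract toward the minimizer at 0, which makes the sketch easy to sanity-check.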


Related research

04/05/2020 · Change Rate Estimation and Optimal Freshness in Web Page Crawling
For providing quick and accurate results, a search engine maintains a lo...

08/28/2019 · The Rise and Fall of the Note: Changing Paper Lengths in ACM CSCW, 2000-2018
In this note, I quantitatively examine various trends in the lengths of ...

08/25/2020 · PAGE: A Simple and Optimal Probabilistic Gradient Estimator for Nonconvex Optimization
In this paper, we propose a novel stochastic gradient estimator—ProbAbil...

11/12/2021 · Fully Automatic Page Turning on Real Scores
We present a prototype of an automatic page turning system that works di...

07/01/2023 · A note concerning polyhyperbolic and related splines
This note concerns the finite interpolation problem with two parametrize...

05/17/2022 · On the Convergence of Policy in Unregularized Policy Mirror Descent
In this short note, we give the convergence analysis of the policy in th...

06/10/2015 · Convergence rates for pretraining and dropout: Guiding learning parameters using network structure
Unsupervised pretraining and dropout have been well studied, especially ...
