Rivalry of Two Families of Algorithms for Memory-Restricted Streaming PCA

06/04/2015
by Chun-Liang Li, et al.

We study the problem of recovering the subspace spanned by the first k principal components of d-dimensional data in the streaming setting, under a memory bound of O(kd). Two families of algorithms are known for this problem. The first family is based on the framework of stochastic gradient descent. However, the convergence rate of this family can be seriously affected by the learning rate of the descent steps, and it deserves more careful study. The second family is based on the power method over blocks of data, but setting the block size for its existing algorithms is not an easy task. In this paper, we analyze the convergence rate of a representative algorithm with a decayed learning rate (Oja and Karhunen, 1985) from the first family for the general k>1 case. Moreover, we propose a novel algorithm for the second family that sets the block sizes automatically and dynamically, and enjoys a faster convergence rate. We then conduct empirical studies that fairly compare the two families on real-world data. The studies reveal the advantages and disadvantages of these two families.
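To make the two families concrete, the sketch below shows a minimal streaming update for each: an Oja-style stochastic gradient step with a decayed learning rate, and a block power method that applies one power iteration per block. The function names, the step-size schedule eta0/t, and the fixed block_size are illustrative assumptions for exposition; they are not the paper's exact procedures (in particular, the proposed algorithm chooses block sizes automatically rather than using a fixed block_size).

```python
# Minimal sketches of the two families for streaming k-PCA with O(kd) memory.
# All names and defaults here are assumptions for illustration only.
import numpy as np


def oja_streaming_pca(stream, d, k, eta0=1.0):
    """Oja-style SGD update with decayed learning rate eta0 / t.

    Maintains a d x k orthonormal basis Q, i.e. O(kd) memory.
    """
    rng = np.random.default_rng(0)
    Q, _ = np.linalg.qr(rng.standard_normal((d, k)))
    for t, x in enumerate(stream, start=1):
        eta = eta0 / t                       # decayed step size
        Q = Q + eta * np.outer(x, x @ Q)     # stochastic gradient step on x x^T
        Q, _ = np.linalg.qr(Q)               # re-orthonormalize the basis
    return Q


def block_power_streaming_pca(stream, d, k, block_size=256):
    """Block power method: accumulate (x x^T) Q over a block of samples,
    then replace Q by the orthonormalized accumulator once per block."""
    rng = np.random.default_rng(0)
    Q, _ = np.linalg.qr(rng.standard_normal((d, k)))
    S = np.zeros((d, k))
    count = 0
    for x in stream:
        S += np.outer(x, x @ Q)              # apply the sample covariance implicitly
        count += 1
        if count == block_size:
            Q, _ = np.linalg.qr(S)           # one power iteration per block
            S = np.zeros((d, k))
            count = 0
    return Q
```

As a toy usage, feeding both functions rows of a synthetic matrix (e.g., `rng.standard_normal((2000, 50)) @ np.diag(np.linspace(3.0, 0.1, 50))` with d=50, k=3) recovers bases whose leading directions align with the largest-variance coordinates; the contrast in how the two sketches spend their updates (per sample versus per block) is exactly the tuning trade-off, learning rate versus block size, that the paper studies.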


