Convergence of Krasulina Scheme

08/28/2018
by Jiangning Chen, et al.

Principal component analysis (PCA) is one of the most commonly used statistical procedures, with a wide range of applications. Consider points X_1, X_2, ..., X_n drawn i.i.d. from a distribution with mean zero and unknown covariance Σ. Let A_n = X_nX_n^T, so that E[A_n] = Σ. This paper considers the problem of finding the least eigenvalue and the corresponding eigenvector of Σ. A classical estimator of this type is due to Krasulina (1969). We present a convergence proof of Krasulina's scheme for the least eigenvalue and its eigenvector, and then establish their convergence rate.
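To make the setting concrete, the following is a minimal sketch of a Krasulina-type stochastic iteration, with the sign of the classical update flipped so that it descends the Rayleigh quotient toward the least eigenvalue. The per-step renormalization and the step-size schedule `1/(n + 10)` are illustrative assumptions, not necessarily the exact scheme analyzed in the paper.

```python
import numpy as np

def krasulina_least(samples, step=lambda n: 1.0 / (n + 10)):
    """Sketch of a Krasulina-type iteration targeting the least eigenvector.

    `samples` is an iterable of i.i.d. vectors X_n with E[X_n X_n^T] = Sigma.
    The classical Krasulina update ascends toward the top eigenvector; here
    the sign is flipped to descend toward the smallest eigenvalue. The
    renormalization and step sizes are illustrative choices (assumptions),
    not the paper's exact scheme.
    """
    it = iter(samples)
    w = np.asarray(next(it), dtype=float)      # initialize from the first sample
    w /= np.linalg.norm(w)
    for n, x in enumerate(it, start=1):
        Aw = x * (x @ w)                       # A_n w, with A_n = X_n X_n^T
        rayleigh = w @ Aw                      # sample Rayleigh quotient (||w|| = 1)
        w = w - step(n) * (Aw - rayleigh * w)  # flipped-sign Krasulina step
        w /= np.linalg.norm(w)                 # keep the iterate on the unit sphere
    return w
```

For example, on Gaussian samples with covariance diag(3, 2, 0.5), the iterate should align (up to sign) with the third coordinate axis, the eigenvector of the least eigenvalue 0.5.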

