Adaptive Stochastic Gradient Descent on the Grassmannian for Robust Low-Rank Subspace Recovery and Clustering

12/12/2014
by Jun He, et al.

In this paper, we present GASG21 (Grassmannian Adaptive Stochastic Gradient for L_{2,1}-norm minimization), an adaptive stochastic gradient algorithm for robustly recovering the low-rank subspace of a large matrix. In the presence of column outliers, we reformulate the batch-mode, rank-constrained matrix L_{2,1}-norm minimization problem as a stochastic optimization problem constrained to the Grassmann manifold. For each observed data vector, the low-rank subspace S is updated by taking a gradient step along a geodesic of the Grassmannian. To accelerate the convergence of the stochastic gradient method, we adaptively tune the constant step size by leveraging consecutive gradients. Furthermore, we demonstrate that with proper initialization, the K-subspaces extension, K-GASG21, can robustly cluster a large number of corrupted data vectors into a union of subspaces. Numerical experiments on synthetic and real data demonstrate the efficiency and accuracy of the proposed algorithms even under heavy column-outlier corruption.
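The core step described above, updating a subspace estimate by moving along a geodesic of the Grassmannian after seeing each data vector, can be sketched as follows. This is a minimal GROUSE-style geodesic update for illustration, not the paper's exact GASG21 L_{2,1} gradient; the fixed step multiplier `eta` and the function name are assumptions (GASG21 instead adapts the step size from consecutive gradients).

```python
import numpy as np

def grassmann_step(U, v, eta=1.0):
    """One geodesic gradient step on the Grassmannian (a GROUSE-style
    sketch; GASG21's L_{2,1} update and adaptive step size differ).

    U   : (n, d) orthonormal basis of the current subspace estimate
    v   : (n,)   newly observed data vector
    eta : step-size multiplier (a fixed eta is an assumption here)
    """
    w = U.T @ v                      # least-squares weights of v in span(U)
    p = U @ w                        # projection of v onto the subspace
    r = v - p                        # residual, orthogonal to span(U)
    nr, npj = np.linalg.norm(r), np.linalg.norm(p)
    if nr < 1e-12 or npj < 1e-12:
        return U                     # nothing to rotate toward
    t = eta * np.arctan(nr / npj)    # rotation angle along the geodesic
    # Closed-form geodesic update: rotate the projected direction toward v,
    # which preserves the orthonormality of the basis U.
    return U + np.outer((np.cos(t) - 1.0) * p / npj + np.sin(t) * r / nr,
                        w / np.linalg.norm(w))
```

Streaming data vectors through this update drives the estimate toward the true subspace; on clean data the principal-angle error decays as more vectors are observed.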


Related research

11/05/2014 · Global Convergence of Stochastic Gradient Descent for Some Non-convex Matrix Problems
Stochastic gradient descent (SGD) on a low-rank factorization is commonl...

02/18/2017 · Riemannian stochastic variance reduced gradient
Stochastic variance reduction algorithms have recently become popular fo...

11/05/2022 · Stochastic Variance Reduced Gradient for affine rank minimization problem
We develop an efficient stochastic variance reduced gradient descent alg...

07/23/2020 · Online Robust and Adaptive Learning from Data Streams
In online learning from non-stationary data streams, it is both necessar...

02/04/2019 · Adaptive stochastic gradient algorithms on Riemannian manifolds
Adaptive stochastic gradient algorithms in the Euclidean space have attr...

11/18/2016 · Robust and Scalable Column/Row Sampling from Corrupted Big Data
Conventional sampling techniques fall short of drawing descriptive sketc...

12/08/2022 · A Novel Stochastic Gradient Descent Algorithm for Learning Principal Subspaces
Many machine learning problems encode their data as a matrix with a poss...
