Convergence of a Grassmannian Gradient Descent Algorithm for Subspace Estimation From Undersampled Data

10/01/2016
by Dejiao Zhang, et al.

Subspace learning and matrix factorization problems have a great many applications in science and engineering, and efficient algorithms are critical as dataset sizes continue to grow. Many relevant problem formulations are non-convex, and in a variety of contexts it has been observed that solving the non-convex problem directly is not only efficient but reliably accurate. We discuss convergence theory for a particular method: first-order incremental gradient descent constrained to the Grassmannian. The output of the algorithm is an orthonormal basis for the d-dimensional subspace spanned by a streaming input data matrix. We study two sampling cases: one where each data vector of the streaming matrix is fully sampled, and one where it is undersampled by a sampling matrix A_t ∈ ℝ^{m×n} with m ≪ n. We propose an adaptive stepsize scheme that depends only on the sampled data and the algorithm's outputs. We prove that with fully sampled data, this stepsize scheme maximizes the improvement of our convergence metric at each iteration, and the method converges from any random initialization to the true subspace, despite the non-convex formulation and orthogonality constraints. For undersampled data, we establish monotonic improvement in the convergence metric at each iteration, with high probability.
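To make the abstract's description concrete, below is a minimal NumPy sketch of one GROUSE-style incremental gradient step on the Grassmannian, specialized to the case where undersampling selects a subset of coordinates (a special case of the measurement matrix A_t). The function name `grouse_step`, all variable names, and the particular adaptive stepsize θ = arctan(‖r‖/‖p‖) are our illustrative assumptions, chosen to be consistent with the abstract's description of a stepsize computed only from the sampled data and algorithm outputs; the paper's exact algorithm and stepsize rule may differ.

```python
import numpy as np

def grouse_step(U, v, omega):
    """One sketch of an incremental Grassmannian gradient step (GROUSE-style).

    U     : (n, d) orthonormal basis estimate of the subspace
    v     : (n,) streaming data vector; only entries indexed by omega are observed
    omega : indices of observed coordinates (entrywise undersampling)
    """
    U_om = U[omega, :]                                   # basis restricted to observed rows
    w, *_ = np.linalg.lstsq(U_om, v[omega], rcond=None)  # least-squares fit on observed entries
    p = U @ w                                            # projection of v onto current subspace
    r = np.zeros_like(v)
    r[omega] = v[omega] - p[omega]                       # residual, supported on observed entries

    p_norm, r_norm, w_norm = map(np.linalg.norm, (p, r, w))
    if r_norm < 1e-12 or w_norm < 1e-12:
        return U                                         # vector already explained; no rotation

    # Adaptive stepsize from data and algorithm outputs only (our assumed greedy choice)
    theta = np.arctan(r_norm / p_norm)

    # Rank-one geodesic update; preserves orthonormality of U in exact arithmetic
    U = U + ((np.cos(theta) - 1.0) * p / p_norm
             + np.sin(theta) * r / r_norm)[:, None] @ (w / w_norm)[None, :]
    return U
```

A small usage example under the same assumptions, streaming undersampled vectors from a planted subspace and starting from a random orthonormal initialization:

```python
rng = np.random.default_rng(0)
n, d, m = 100, 5, 30
U_true, _ = np.linalg.qr(rng.standard_normal((n, d)))    # planted subspace
U, _ = np.linalg.qr(rng.standard_normal((n, d)))         # random initialization

for _ in range(2000):
    v = U_true @ rng.standard_normal(d)                  # stream a vector from the true subspace
    omega = rng.choice(n, size=m, replace=False)         # observe m of n entries
    U = grouse_step(U, v, omega)

# residual of the estimate after projecting onto the true subspace (small when converged)
print(np.linalg.norm(U - U_true @ (U_true.T @ U)))
```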


Related research:

06/24/2015 · Global Convergence of a Grassmannian Gradient Descent Algorithm for Subspace Estimation
It has been observed in a variety of contexts that gradient descent meth...

06/13/2017 · A Well-Tempered Landscape for Non-convex Robust Subspace Recovery
We present a mathematical analysis of a non-convex energy landscape for ...

05/26/2023 · Fast and Minimax Optimal Estimation of Low-Rank Matrices via Non-Convex Gradient Descent
We study the problem of estimating a low-rank matrix from noisy measurem...

09/04/2023 · Asymmetric matrix sensing by gradient descent with small random initialization
We study matrix sensing, which is the problem of reconstructing a low-ra...

06/11/2020 · Randomized Fast Subspace Descent Methods
Randomized Fast Subspace Descent (RFASD) Methods are developed and analy...

12/12/2018 · Gradient Descent Happens in a Tiny Subspace
We show that in a variety of large-scale deep learning scenarios the gra...

06/27/2019 · High-Dimensional Optimization in Adaptive Random Subspaces
We propose a new randomized optimization method for high-dimensional pro...
