Low Rank Approximation at Sublinear Cost by Means of Subspace Sampling
Low-rank approximation (LRA) of a matrix is an active research subject, fundamental for matrix and tensor computations and for big data mining and analysis. Computations with LRA can be performed at sublinear cost, that is, by using far fewer arithmetic operations and memory cells than the input matrix has entries. Although every sublinear cost algorithm for LRA fails on worst-case inputs, we prove that our sublinear cost variations of a popular subspace sampling algorithm output accurate LRAs for a large class of inputs; namely, they do so with high probability for a random input matrix that admits a low-rank approximation. In other papers we have proposed and analyzed sublinear cost algorithms for other important matrix computations. Our numerical tests are in good agreement with our formal results.
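To illustrate the idea of subspace sampling at sublinear cost, the following sketch builds a CUR-style LRA from uniformly sampled rows and columns, so it only reads O((m+n)k) of the mn entries. This is a minimal illustration of the general technique, not the paper's specific algorithm; the function name and sampling sizes are hypothetical choices.

```python
import numpy as np

def sampled_cur_lra(A, num_rows, num_cols, seed=None):
    """CUR-style low-rank approximation from uniformly sampled rows/columns.

    Reads only the sampled rows and columns of A (sublinear in the
    number of entries), then combines them via the pseudoinverse of
    their intersection. Exact when rank(A) equals the rank of the
    sampled intersection; accurate with high probability for random
    low-rank inputs, but it can fail on adversarial (worst-case) ones.
    """
    rng = np.random.default_rng(seed)
    m, n = A.shape
    I = rng.choice(m, size=num_rows, replace=False)  # sampled row indices
    J = rng.choice(n, size=num_cols, replace=False)  # sampled column indices
    C = A[:, J]                    # m x num_cols sampled columns
    R = A[I, :]                    # num_rows x n sampled rows
    W = A[np.ix_(I, J)]            # intersection of the samples
    return C @ np.linalg.pinv(W) @ R   # rank <= min(num_rows, num_cols)
```

For a matrix of exact rank k with generic (e.g., Gaussian) factors, sampling slightly more than k rows and columns recovers the matrix essentially to machine precision, matching the "high probability for random inputs" behavior described above.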