Refinement of Low Rank Approximation of a Matrix at Sub-linear Cost

06/10/2019
by Victor Y. Pan, et al.

Low rank approximation (LRA) of a matrix is a hot subject of modern matrix computations. In applications to Big Data mining and analysis, the input matrices are so immense that one must apply sub-linear cost algorithms, which access and process only a tiny fraction of the input entries. Under this restriction one cannot compute an accurate LRA of a worst-case input matrix, or even of the matrices of some specific small families presented in our Appendix, but we have recently proved that one can do so with high probability (whp) for a random input. This agrees with our tests and with more than a decade of worldwide computation of LRA at sub-linear cost by means of Cross-Approximation algorithms. A natural challenge is to complement such computation of LRA with its refinement at sub-linear cost. We take up that challenge and propose two algorithms for this task, together with some recipes for a posteriori estimation of the errors of LRA at sub-linear cost.
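
The abstract mentions Cross-Approximation, which assembles an LRA from a small set of rows and columns of the input, and a posteriori error estimation at sub-linear cost. The sketch below illustrates these two ingredients only; it is not the paper's pair of refinement algorithms. It builds a CUR-type approximation from uniformly sampled rows and columns (a simplification of the adaptive pivoting used in practical Cross-Approximation) and estimates the Frobenius error from a random sample of entries. All function names and parameters are illustrative assumptions, not taken from the paper.

```python
import numpy as np


def cross_approximation(A, r, oversample=2, rng=None):
    """CUR-type LRA sketch built from a few sampled rows and columns.

    Reads only the sampled rows and columns of A, which is sub-linear
    in the m*n entries when r << min(m, n).  Indices are drawn
    uniformly at random here; practical Cross-Approximation replaces
    this with adaptive (e.g. maximal-volume) pivoting.
    """
    rng = np.random.default_rng(rng)
    m, n = A.shape
    k = min(oversample * r, m, n)
    I = rng.choice(m, size=k, replace=False)   # sampled row indices
    J = rng.choice(n, size=k, replace=False)   # sampled column indices
    C = A[:, J]                                # k columns: m*k entries read
    R = A[I, :]                                # k rows:    k*n entries read
    G = A[np.ix_(I, J)]                        # k-by-k intersection block
    U = np.linalg.pinv(G)                      # middle factor of A ~ C @ U @ R
    return C, U, R


def sampled_error_estimate(A, C, U, R, num_samples=1000, rng=None):
    """A posteriori Frobenius-error estimate from a random entry sample.

    Compares A with C @ U @ R on num_samples randomly chosen entries,
    so the cost stays sub-linear in the size of A.
    """
    rng = np.random.default_rng(rng)
    m, n = A.shape
    i = rng.integers(0, m, size=num_samples)
    j = rng.integers(0, n, size=num_samples)
    cur = np.einsum("sk,kl,ls->s", C[i, :], U, R[:, j])  # (C U R)[i_s, j_s]
    residual = A[i, j] - cur
    # Scale the sampled mean square up to all m*n entries.
    return np.sqrt(m * n * np.mean(residual ** 2))


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Synthetic nearly rank-5 input: rank-5 product plus small noise.
    A = rng.standard_normal((500, 5)) @ rng.standard_normal((5, 400))
    A += 1e-6 * rng.standard_normal(A.shape)
    C, U, R = cross_approximation(A, r=5, rng=1)
    print("sampled error estimate:", sampled_error_estimate(A, C, U, R, rng=2))
    # Full-matrix check; linear cost, included for comparison only.
    print("true Frobenius error:  ", np.linalg.norm(A - C @ U @ R))
```

Both routines read only O((m + n)k) entries of A plus a fixed-size random sample, so their cost is sub-linear in the m*n entries; the exact error computed at the end of the demo is the only linear-cost step and serves only as a check.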

Related research

07/21/2019 - Low Rank Approximation of a Matrix at Sub-linear Cost
06/11/2019 - Low Rank Approximation at Sublinear Cost by Means of Subspace Sampling
06/10/2019 - Low Rank Approximation Directed by Leverage Scores and Computed at Sub-linear Cost
04/13/2018 - On the detection of low rank matrices in the high-dimensional regime
02/04/2020 - Randomized Numerical Linear Algebra: Foundations and Algorithms
02/22/2019 - Streaming Low-Rank Matrix Approximation with an Application to Scientific Simulation
02/23/2020 - Sketching Transformed Matrices with Applications to Natural Language Processing
