Quantum-Inspired Algorithms from Randomized Numerical Linear Algebra

by Nadiia Chepurko et al.

We create classical (non-quantum) dynamic data structures supporting queries for recommender systems and least-squares regression that are comparable to their quantum analogues. De-quantizing such algorithms has received a flurry of attention in recent years; we obtain sharper bounds for these problems. More significantly, we achieve these improvements by arguing that the previous quantum-inspired algorithms for these problems are doing leverage or ridge leverage score sampling in disguise. With this recognition, we are able to employ the large body of work in numerical linear algebra to obtain algorithms for these problems that are simpler and faster than existing approaches.

We also consider static data structures for the above problems and obtain close-to-optimal bounds for them. To do this, we introduce a new randomized transform, the Gaussian Randomized Hadamard Transform (GRHT). It was thought in the numerical linear algebra community that, to obtain nearly-optimal bounds for various problems such as rank computation, finding a maximal linearly independent subset of columns, regression, low-rank approximation, maximum matching on general graphs, and linear matroid union, one would need to resolve the main open question of Nelson and Nguyen (FOCS, 2013) regarding the logarithmic factors in existing oblivious subspace embeddings. We bypass this question using GRHT and obtain optimal or nearly-optimal bounds for these problems. For the fundamental problems of rank computation and finding a linearly independent subset of columns, our algorithms improve upon those of Cheung, Kwok, and Lau (JACM, 2013) and are optimal to within a constant factor and a log log(n) factor, respectively. Further, for constant-factor regression and low-rank approximation, we give the first optimal algorithms for the current matrix multiplication exponent.
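The key idea above is that quantum-inspired sampling amounts to leverage score sampling. As a minimal illustrative sketch (not the paper's dynamic data structure, and with exact scores computed via a thin SVD rather than the fast approximations used in practice), the hypothetical helpers `leverage_scores` and `leverage_sample` below sample rows of a least-squares problem proportionally to their leverage scores and solve the regression on the resulting sketch:

```python
import numpy as np

def leverage_scores(A):
    # The leverage score of row i is the squared norm of the i-th row of U,
    # where A = U S V^T is a thin SVD. The scores sum to rank(A).
    U, _, _ = np.linalg.svd(A, full_matrices=False)
    return np.sum(U**2, axis=1)

def leverage_sample(A, b, s, rng):
    # Sample s rows of (A, b) with probability proportional to their
    # leverage scores, rescaling rows so the sketched problem is unbiased.
    ell = leverage_scores(A)
    p = ell / ell.sum()
    idx = rng.choice(A.shape[0], size=s, replace=True, p=p)
    scale = 1.0 / np.sqrt(s * p[idx])
    return A[idx] * scale[:, None], b[idx] * scale

# Toy regression instance (hypothetical data, for illustration only).
rng = np.random.default_rng(1)
A = rng.standard_normal((2000, 10))
b = A @ rng.standard_normal(10) + 0.01 * rng.standard_normal(2000)

# Solve least squares on a 200-row leverage-score sketch vs. the full problem.
SA, Sb = leverage_sample(A, b, s=200, rng=rng)
x_sketch, *_ = np.linalg.lstsq(SA, Sb, rcond=None)
x_exact, *_ = np.linalg.lstsq(A, b, rcond=None)
```

With high probability the sketched solution is close to the exact one, even though only a tenth of the rows were touched at solve time; the abstract's contribution is obtaining such guarantees within dynamic and static data structures with sharper bounds than prior quantum-inspired work.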




