Sublinear Time Numerical Linear Algebra for Structured Matrices

12/12/2019
by Xiaofei Shi, et al.

We show how to solve a number of problems in numerical linear algebra, such as least squares regression, ℓ_p-regression for any p ≥ 1, low rank approximation, and kernel regression, in time T(A) · poly(log(nd)), where for a given input matrix A ∈ R^{n × d}, T(A) is the time needed to compute A · y for an arbitrary vector y ∈ R^d. Since T(A) ≤ O(nnz(A)), where nnz(A) denotes the number of non-zero entries of A, this time is no worse, up to polylogarithmic factors, than that of all of the recent advances for such problems that run in input-sparsity time. However, for many applications, T(A) can be much smaller than nnz(A), yielding significantly sublinear time algorithms. For example, in the overconstrained (1+ϵ)-approximate polynomial interpolation problem, A is a Vandermonde matrix and T(A) = O(n log n); in this case our running time is n · poly(log n) + poly(d/ϵ), and we recover the results of <cit.> as a special case. For overconstrained autoregression, a common problem arising in dynamical systems, T(A) = O(n log n), and we immediately obtain n · poly(log n) + poly(d/ϵ) time. For kernel autoregression, we significantly improve the running time of prior algorithms for general kernels. For the important case of autoregression with the polynomial kernel and an arbitrary target vector b ∈ R^n, we obtain even faster algorithms. Our algorithms show that, perhaps surprisingly, most of these optimization problems do not require much more time than that of a polylogarithmic number of matrix-vector multiplications.
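To illustrate the theme of accessing A only through matrix-vector products, here is a minimal sketch (not the paper's algorithm) for the overconstrained autoregression example: the design matrix A has entries A[t, j] = x[t + j] built from a time series x, so A · y and A^T · z can each be applied in O(n log n) time by FFT-based correlation, and an iterative least-squares solver such as SciPy's LSQR then touches A only through these products. The variable names and the use of LSQR are assumptions for illustration; LSQR's iteration count is not the polylogarithmic matvec bound claimed in the abstract, which the paper obtains by different, sketching-based means.

    import numpy as np
    from scipy.signal import fftconvolve
    from scipy.sparse.linalg import LinearOperator, lsqr

    rng = np.random.default_rng(0)
    n, d = 10_000, 8                 # n observations, order-d autoregression
    x = rng.standard_normal(n + d)   # underlying time series
    b = x[d:]                        # target: predict x[t + d] from the d previous values

    def matvec(y):
        # (A @ y)[t] = sum_j x[t + j] * y[j], an FFT-based correlation in O(n log n)
        return fftconvolve(x[:-1], y[::-1], mode="valid")

    def rmatvec(z):
        # (A^T @ z)[j] = sum_t x[t + j] * z[t], the same correlation with roles swapped
        return fftconvolve(x[:-1], z[::-1], mode="valid")

    A_op = LinearOperator((n, d), matvec=matvec, rmatvec=rmatvec, dtype=np.float64)

    # LSQR only ever calls matvec/rmatvec, so its cost is (#iterations) * T(A),
    # without ever forming A explicitly.
    coeffs = lsqr(A_op, b, atol=1e-12, btol=1e-12)[0]

    # Sanity check against an explicit dense solve (feasible here because d is small).
    A_dense = np.column_stack([x[j:j + n] for j in range(d)])
    coeffs_dense, *_ = np.linalg.lstsq(A_dense, b, rcond=None)
    print(np.allclose(coeffs, coeffs_dense, atol=1e-6))

The same pattern applies to the Vandermonde (polynomial interpolation) case, where the O(n log n) matvec would come from fast multipoint evaluation rather than an FFT correlation.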


