# Optimal Algorithms for Linear Algebra in the Current Matrix Multiplication Time

We study fundamental problems in linear algebra, such as finding a maximal linearly independent subset of rows or columns (a basis), solving linear regression, and computing a subspace embedding. For these problems, we consider input matrices 𝐀 ∈ ℝ^{n×d} with n > d. The input can be read in nnz(𝐀) time, where nnz(𝐀) denotes the number of nonzero entries of 𝐀. In this paper, we show that beyond the time required to read the input matrix, these fundamental linear algebra problems can be solved in d^ω time, where ω ≈ 2.37 is the current matrix multiplication exponent. To do so, we introduce a constant-factor subspace embedding with the optimal m = 𝒪(d) number of rows, which can be applied in time 𝒪(nnz(𝐀)/α) + d^{2+α} poly(log d) for any trade-off parameter α > 0, tightening a recent result by Chepurko et al. [SODA 2022] that achieves an exp(poly(log log n)) distortion with m = d·poly(log log d) rows in 𝒪(nnz(𝐀)/α + d^{2+α+o(1)}) time. Our subspace embedding uses a recently shown property of stacked Subsampled Randomized Hadamard Transforms (SRHT), which actually increase the input dimension, to "spread" the mass of an input vector among a large number of coordinates, followed by random sampling. To control the effects of random sampling, we use fast semidefinite programming to reweight the rows. We then use our constant-factor subspace embedding to give the first optimal runtime algorithms for finding a maximal linearly independent subset of columns, regression, and leverage score sampling. To do so, we also introduce a novel subroutine that iteratively grows a set of independent rows, which may be of independent interest.
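To illustrate the kind of object the abstract describes, the following is a minimal sketch of a *classical* SRHT subspace embedding S𝐀 = √(n/m)·PHD𝐀 (random sign flips D, Hadamard transform H to spread mass, then uniform row sampling P). This is the standard single SRHT, not the paper's stacked, dimension-increasing variant with SDP reweighting; the function name and parameters are illustrative only.

```python
import numpy as np
from scipy.linalg import hadamard

def srht_embed(A, m, rng):
    """Apply a classical SRHT sketch S = sqrt(n/m) * P * H * D to A.

    D flips signs randomly, H (normalized Hadamard) spreads each vector's
    mass across coordinates, and P samples m rows uniformly at random.
    """
    n, d = A.shape
    n_pad = 1 << (n - 1).bit_length()          # pad row count to a power of two
    A_pad = np.vstack([A, np.zeros((n_pad - n, d))])
    D = rng.choice([-1.0, 1.0], size=n_pad)    # random sign diagonal
    HDA = hadamard(n_pad) @ (D[:, None] * A_pad) / np.sqrt(n_pad)
    rows = rng.choice(n_pad, size=m, replace=False)
    return np.sqrt(n_pad / m) * HDA[rows]

rng = np.random.default_rng(0)
A = rng.standard_normal((1024, 20))
SA = srht_embed(A, m=400, rng=rng)
# If S is a good subspace embedding, every singular value of SA is close
# to the corresponding singular value of A (distortion near 1).
ratios = np.linalg.svd(SA, compute_uv=False) / np.linalg.svd(A, compute_uv=False)
print(ratios)
```

A dense `hadamard(n_pad)` multiply is used here for brevity; a real implementation would use the fast Walsh–Hadamard transform in O(n log n) time, and the paper's construction replaces the uniform sampling step with SDP-based row reweighting to achieve constant distortion with only m = 𝒪(d) rows.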
