The Fast Cauchy Transform and Faster Robust Linear Regression

07/19/2012
by   Kenneth L. Clarkson, et al.

We provide fast algorithms for overconstrained ℓ_p regression and related problems: for an n×d input matrix A and vector b∈R^n, in O(nd log n) time we reduce the problem min_{x∈R^d} ‖Ax−b‖_p to the same problem with input matrix Ã of dimension s×d and corresponding b̃ of dimension s×1. Here, Ã and b̃ are a coreset for the problem, consisting of sampled and rescaled rows of A and b; and s is independent of n and polynomial in d. Our results improve on the best previous algorithms when n≫d, for all p∈[1,∞) except p=2. We also provide a suite of improved results for finding well-conditioned bases via ellipsoidal rounding, illustrating tradeoffs between running time and conditioning quality, including a one-pass conditioning algorithm for general ℓ_p problems. We also provide an empirical evaluation of implementations of our algorithms for p=1, comparing them with related algorithms. Our empirical results show that, in the asymptotic regime, the theory is a very good guide to the practical performance of these algorithms. Our algorithms use our faster constructions of well-conditioned bases for ℓ_p spaces and, for p=1, a fast subspace embedding of independent interest that we call the Fast Cauchy Transform: a distribution over matrices Π: R^n → R^{O(d log d)}, chosen obliviously to A, that approximately preserves the ℓ_1 norms: that is, with large probability, simultaneously for all x, ‖Ax‖_1 ≈ ‖ΠAx‖_1, with distortion O(d^{2+η}), for an arbitrarily small constant η>0; and, moreover, ΠA can be computed in O(nd log d) time. The techniques underlying our Fast Cauchy Transform include fast Johnson–Lindenstrauss transforms, low-coherence matrices, and rescaling by Cauchy random variables.


