Faster Least Squares Optimization

11/06/2019
by Jonathan Lacotte, et al.

We investigate randomized methods for solving overdetermined linear least-squares problems, where the Hessian is approximated based on a random projection of the data matrix. We consider a random subspace embedding which is either drawn once at the beginning and then fixed, or refreshed at each iteration. We provide an exact finite-time analysis of the refreshed embeddings method for a broad class of random matrices, an exact asymptotic analysis of the fixed embedding method with a Gaussian matrix, and a non-asymptotic analysis of the fixed embedding method for Gaussian and SRHT matrices, with and without momentum acceleration.

Surprisingly, we show that, for Gaussian matrices, the refreshed sketching method with no momentum yields the same asymptotic rate of convergence as the fixed embedding method accelerated with momentum. Furthermore, we characterize optimal step sizes and prove that, for a broad class of random matrices including the Gaussian ensemble, momentum does not accelerate the refreshed embeddings method. Hence, among the class of randomized algorithms we consider, a fixed subspace embedding with momentum yields the fastest rate of convergence, along with the lowest computational complexity.

Then, picking the accelerated, fixed embedding method as the algorithm of choice, we obtain a faster algorithm by optimizing over the choice of the sketching dimension. Our choice of the sketch size yields an algorithm, for solving overdetermined least-squares problems, with a lower computational complexity than current state-of-the-art iterative least-squares methods based on randomized preconditioners. In particular, given the sketched data matrix, as the sample size grows, the resulting computational complexity becomes sub-linear in the problem dimensions. We numerically validate our guarantees on datasets with large sample sizes, for both Gaussian and SRHT embeddings.
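To make the setup concrete, the following is a minimal numpy sketch of an iterative Hessian sketch scheme of the kind the abstract studies: the sketched Hessian (SA)ᵀ(SA) replaces AᵀA in a preconditioned gradient step, and the embedding S is either fixed once or refreshed at every iteration. The function name, parameters, and step sizes are illustrative, not the paper's exact algorithm.

```python
import numpy as np

def iterative_hessian_sketch(A, b, m, n_iter=20, step=1.0, refresh=True, seed=0):
    """Solve min_x ||Ax - b||^2 via sketched-Hessian iterations (illustrative).

    A       : (n, d) data matrix, overdetermined (n >> d)
    b       : (n,) right-hand side
    m       : sketch size, d < m << n
    refresh : if True, draw a fresh Gaussian embedding every iteration;
              if False, keep the embedding drawn at the start (fixed case).
    """
    rng = np.random.default_rng(seed)
    n, d = A.shape
    x = np.zeros(d)
    S = rng.normal(size=(m, n)) / np.sqrt(m)   # initial Gaussian embedding
    for _ in range(n_iter):
        if refresh:
            S = rng.normal(size=(m, n)) / np.sqrt(m)
        SA = S @ A
        H = SA.T @ SA                  # sketched approximation of A^T A
        g = A.T @ (A @ x - b)          # exact gradient of 0.5 * ||Ax - b||^2
        x = x - step * np.linalg.solve(H, g)
    return x
```

Each iteration costs O(mnd) for the sketch plus O(nd + d³) for the gradient and solve, versus O(nd²) to form AᵀA exactly; the fixed-embedding variant pays the sketching cost only once, which is one reason the abstract singles it out.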


Related research:

- 02/21/2020: Optimal Randomized First-Order Methods for Least-Squares Problems. "We provide an exact analysis of a class of randomized algorithms for sol..."
- 06/10/2020: Effective Dimension Adaptive Sketching Methods for Faster Regularized Least-Squares Optimization. "We propose a new randomized algorithm for solving L2-regularized least-s..."
- 02/03/2020: Limiting Spectrum of Randomized Hadamard Transform and Optimal Iterative Sketching Methods. "We provide an exact analysis of the limiting spectrum of matrices random..."
- 07/28/2023: Minimal error momentum Bregman-Kaczmarz. "The Bregman-Kaczmarz method is an iterative method which can solve stron..."
- 07/15/2021: Newton-LESS: Sparsification without Trade-offs for the Sketched Newton Update. "In second-order optimization, a potential bottleneck can be computing th..."
- 02/20/2019: Adaptive Iterative Hessian Sketch via A-Optimal Subsampling. "Iterative Hessian sketch (IHS) is an effective sketching method for mode..."
- 01/03/2022: On randomized sketching algorithms and the Tracy-Widom law. "There is an increasing body of work exploring the integration of random ..."
