Effective Dimension Adaptive Sketching Methods for Faster Regularized Least-Squares Optimization

06/10/2020
by   Jonathan Lacotte, et al.

We propose a new randomized algorithm for solving L2-regularized least-squares problems based on sketching. We consider two of the most popular random embeddings, namely, Gaussian embeddings and the Subsampled Randomized Hadamard Transform (SRHT). While current randomized solvers for least-squares optimization prescribe an embedding dimension at least as large as the data dimension, we show that the embedding dimension can be reduced to the effective dimension of the optimization problem while still preserving high-probability convergence guarantees. To this end, we derive sharp matrix deviation inequalities over ellipsoids for both Gaussian and SRHT embeddings. Specifically, we improve the constant of a classical Gaussian concentration bound, whereas, for SRHT embeddings, our deviation inequality relies on a novel technical approach. Leveraging these bounds, we design a practical and adaptive algorithm that does not require knowing the effective dimension beforehand. Our method starts with an embedding dimension equal to 1 and, over iterations, increases it up to the effective dimension. Finally, we prove that our algorithm improves the state-of-the-art computational complexity of solving regularized least-squares problems. Further, we show numerically that it outperforms standard least-squares solvers such as the conjugate gradient method and its preconditioned variant on several standard machine learning datasets.
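
To make the idea concrete, below is a minimal illustrative sketch in NumPy of an iterative Hessian-sketch-style solver whose embedding dimension starts at 1 and grows adaptively. It is not the authors' exact algorithm: the function name ridge_sketch_solve, the geometric growth factor, the stall-based growth test, and the unit step size are all assumptions made for illustration, and a Gaussian embedding is used for brevity (the paper also analyzes SRHT).

    import numpy as np

    def ridge_sketch_solve(A, b, lam, tol=1e-8, max_iters=500, growth=2):
        # Illustrative adaptive-sketching solver for
        #     min_x ||A x - b||^2 + lam * ||x||^2,
        # in the spirit of the paper: the embedding dimension m starts at 1
        # and grows over iterations. The growth test and unit step below are
        # illustrative assumptions, not the authors' exact adaptivity rule.
        n, d = A.shape
        m = 1                                   # initial embedding dimension
        x = np.zeros(d)
        grad = lambda v: A.T @ (A @ v - b) + lam * v
        g = grad(x)
        for _ in range(max_iters):
            # Gaussian embedding S of shape (m, n); the paper also covers SRHT.
            S = np.random.randn(m, n) / np.sqrt(m)
            SA = S @ A
            # Sketched regularized Hessian: (SA)^T (SA) + lam * I  (d x d, PD).
            H = SA.T @ SA + lam * np.eye(d)
            x_new = x - np.linalg.solve(H, g)
            g_new = grad(x_new)
            # If progress stalls, the embedding is too coarse: grow m geometrically.
            if np.linalg.norm(g_new) > 0.9 * np.linalg.norm(g):
                m = min(growth * m, n)
            x, g = x_new, g_new
            if np.linalg.norm(g) <= tol * np.linalg.norm(A.T @ b):
                break
        return x

For example, on a synthetic overdetermined problem one would call x_hat = ridge_sketch_solve(np.random.randn(2000, 100), np.random.randn(2000), lam=1.0). The point of the growth rule is that the sketch size never needs to be set to the data dimension up front; it only grows until the sketched Hessian is accurate enough for the iterates to converge.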


Related research

02/21/2020 · Optimal Randomized First-Order Methods for Least-Squares Problems
We provide an exact analysis of a class of randomized algorithms for sol...

11/06/2019 · Faster Least Squares Optimization
We investigate randomized methods for solving overdetermined linear leas...

04/29/2021 · Fast Convex Quadratic Optimization Solvers with Adaptive Sketching-based Preconditioners
We consider least-squares problems with quadratic regularization and pro...

01/21/2022 · Extended Randomized Kaczmarz Method for Sparse Least Squares and Impulsive Noise Problems
The Extended Randomized Kaczmarz method is a well known iterative scheme...

12/07/2019 · Regularized Momentum Iterative Hessian Sketch for Large Scale Linear System of Equations
In this article, Momentum Iterative Hessian Sketch (M-IHS) techniques, a...

10/16/2015 · Robust Partially-Compressed Least-Squares
Randomized matrix compression techniques, such as the Johnson-Lindenstra...

02/04/2021 · Concentration of Non-Isotropic Random Tensors with Applications to Learning and Empirical Risk Minimization
Dimension is an inherent bottleneck to some modern learning tasks, where...
