Iterative Hessian Sketch in Input Sparsity Time

10/30/2019
by Graham Cormode, et al.

Scalable algorithms that solve optimization and regression tasks, even approximately, are needed to work with large datasets. In this paper we study efficient techniques from matrix sketching for solving a variety of convex constrained regression problems. We adopt "Iterative Hessian Sketching" (IHS) and show that the fast CountSketch and sparse Johnson-Lindenstrauss transforms yield state-of-the-art accuracy guarantees under IHS, while drastically improving the time cost. As a result, we obtain significantly faster algorithms for constrained regression, for both sparse and dense inputs. Our empirical results show that we can summarize data roughly 100x faster for sparse data and, surprisingly, 10x faster on dense data. Consequently, solutions accurate to within machine precision of the optimum can be found much faster than with the previous state of the art.
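To illustrate the approach, here is a minimal NumPy sketch of IHS for the unconstrained least-squares case, using CountSketch to compress the data in input-sparsity time. Function names, the sketch size m, and the iteration count are illustrative choices, not the paper's exact parameters; the constrained variants in the paper replace the linear solve with a constrained quadratic subproblem.

```python
import numpy as np

def countsketch(A, m, rng):
    """Apply a CountSketch matrix S (m rows) to A in input-sparsity time:
    each row of A is multiplied by a random sign and added to one
    uniformly random row of the output."""
    n = A.shape[0]
    h = rng.integers(0, m, size=n)       # target row for each input row
    s = rng.choice([-1.0, 1.0], size=n)  # random signs
    SA = np.zeros((m, A.shape[1]))
    np.add.at(SA, h, s[:, None] * A)     # unbuffered scatter-add of signed rows
    return SA

def ihs_lstsq(A, b, m, iters=30, seed=0):
    """Iterative Hessian Sketch for min_x ||Ax - b||_2 (unconstrained case).
    Each iteration draws a fresh sketch, forms the sketched Hessian
    (SA)^T (SA), and takes a Newton-type step using the exact gradient."""
    rng = np.random.default_rng(seed)
    n, d = A.shape
    x = np.zeros(d)
    for _ in range(iters):
        SA = countsketch(A, m, rng)   # fresh sketch each iteration
        H = SA.T @ SA                 # sketched Hessian A^T S^T S A
        g = A.T @ (b - A @ x)         # exact gradient of the residual
        x = x + np.linalg.solve(H, g) # approximate Newton update
    return x
```

Because the sketched Hessian is only a constant-factor approximation, each iteration contracts the error geometrically, so a modest number of iterations reaches machine-precision agreement with the exact solution while only ever factoring a d-by-d matrix.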


Related research

02/09/2018  Large Scale Constrained Linear Regression Revisited: Faster Algorithms via Preconditioning
In this paper, we revisit the large-scale constrained linear regression ...

11/03/2014  Iterative Hessian Sketch: Fast and Accurate Solution Approximation for Constrained Least-Squares
We study randomized sketching methods for approximately solving least-sq...

10/29/2019  Efficient Computation for Centered Linear Regression with Sparse Inputs
Regression with sparse inputs is a common theme for large scale models. ...

10/16/2021  Fast Projection onto the Capped Simplex with Applications to Sparse Regression in Bioinformatics
We consider the problem of projecting a vector onto the so-called k-capp...

04/04/2023  The Bit Complexity of Efficient Continuous Optimization
We analyze the bit complexity of efficient algorithms for fundamental op...

09/28/2017  Sparse High-Dimensional Regression: Exact Scalable Algorithms and Phase Transitions
We present a novel binary convex reformulation of the sparse regression ...

07/15/2021  Lockout: Sparse Regularization of Neural Networks
Many regression and classification procedures fit a parameterized functi...
