Quantum Algorithms and Lower Bounds for Linear Regression with Norm Constraints

10/25/2021
by Yanlin Chen et al.

Lasso and Ridge are important minimization problems in machine learning and statistics. They are versions of linear regression with squared loss where the vector θ∈ℝ^d of coefficients is constrained in ℓ_1-norm (Lasso) or in ℓ_2-norm (Ridge). We study the complexity of quantum algorithms for finding ε-minimizers of these problems. For Lasso we obtain a quadratic quantum speedup in terms of d by speeding up the cost per iteration of the Frank-Wolfe algorithm, while for Ridge the best quantum algorithms are linear in d, as are the best classical algorithms.
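To make the setting concrete, here is a minimal classical sketch of the Frank-Wolfe (conditional gradient) method for ℓ_1-constrained least squares. This is an illustration of the standard algorithm, not the paper's quantum procedure; the function name, step-size rule, and synthetic data are our own choices. The argmax over the d coordinates of the gradient is the per-iteration cost that the paper speeds up quadratically in d on a quantum computer; here it is the classical O(d) scan.

```python
import numpy as np

def frank_wolfe_lasso(X, y, radius, n_iters=500):
    """Frank-Wolfe for: minimize ||X theta - y||^2 s.t. ||theta||_1 <= radius.

    Each iteration solves a linear subproblem over the l1 ball, whose
    minimizer is a vertex: +/- radius along the coordinate where the
    gradient is largest in absolute value.
    """
    n, d = X.shape
    theta = np.zeros(d)
    for t in range(n_iters):
        grad = 2 * X.T @ (X @ theta - y)          # gradient of squared loss
        i = np.argmax(np.abs(grad))               # most-correlated coordinate
        s = np.zeros(d)
        s[i] = -radius * np.sign(grad[i])         # vertex of the l1 ball
        gamma = 2.0 / (t + 2)                     # standard FW step size
        theta = (1 - gamma) * theta + gamma * s   # convex combo stays feasible
    return theta

# Usage on synthetic sparse data (hypothetical example, not from the paper)
rng = np.random.default_rng(0)
X = rng.standard_normal((100, 20))
true_theta = np.zeros(20)
true_theta[:3] = [1.0, -0.5, 0.25]
y = X @ true_theta
theta_hat = frank_wolfe_lasso(X, y, radius=1.75)
```

Every iterate is a convex combination of ℓ_1-ball vertices, so the constraint is maintained for free, and after t iterations the iterate is t-sparse, which is one reason Frank-Wolfe is a natural fit for Lasso.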


Related research

06/18/2012 · Linear Regression with Limited Observation
We consider the most common variants of linear regression, including Rid...

10/04/2022 · Quantum communication complexity of linear regression
Dequantized algorithms show that quantum computers do not have exponenti...

06/04/2021 · Adiabatic Quantum Feature Selection for Sparse Linear Regression
Linear regression is a popular machine learning approach to learn and pr...

01/31/2022 · Sparse Signal Reconstruction with QUBO Formulation in l0-regularized Linear Regression
An l0-regularized linear regression for a sparse signal reconstruction i...

09/01/2017 · Sparse Regularization in Marketing and Economics
Sparse alpha-norm regularization has many data-rich applications in mark...

05/07/2020 · Fractional ridge regression: a fast, interpretable reparameterization of ridge regression
Ridge regression (RR) is a regularization technique that penalizes the L...

06/27/2022 · Quantum Regularized Least Squares
Linear regression is a widely used technique to fit linear models and fi...
