Linear Regression with Limited Observation

06/18/2012
by Elad Hazan, et al.

We consider the most common variants of linear regression, including Ridge, Lasso, and Support-vector regression, in a setting where the learner may observe only a fixed number of attributes of each example at training time. We present simple and efficient algorithms for these problems: for Lasso and Ridge regression, our algorithms need the same total number of attributes (up to constants) as full-information algorithms to reach a given accuracy. For Support-vector regression, we require exponentially fewer attributes than the state of the art. This resolves an open problem recently posed by Cesa-Bianchi et al. (2010). Experiments confirm the theoretical bounds, showing superior performance compared to the state of the art.
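The setting described above can be illustrated with a small sketch: run stochastic gradient descent for Ridge regression while touching only a budget of attributes per example, using two independent coordinate samples to keep the gradient estimate unbiased. This is a minimal illustration of the limited-observation idea, not the paper's exact algorithm; the function name, step size, and the particular two-sample estimator are assumptions made for the example.

```python
import numpy as np

def attribute_efficient_ridge(X, y, k, lam=0.1, eta=0.01, epochs=20, seed=0):
    """SGD for Ridge regression observing only 2*k attributes per example.

    Two independent uniform samples of k coordinates give an unbiased
    estimate of the full gradient (<w, x> - y) * x + lam * w, because the
    prediction estimate and the sparse estimate of x are independent.
    (Illustrative sketch only, not the algorithm from the paper.)
    """
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w = np.zeros(d)
    for _ in range(epochs):
        for i in rng.permutation(n):
            x, target = X[i], y[i]
            # Sample 1: estimate the prediction <w, x> from k coordinates.
            s1 = rng.choice(d, size=k, replace=False)
            pred_hat = (d / k) * np.dot(w[s1], x[s1])
            # Sample 2 (independent): sparse unbiased estimate of x.
            s2 = rng.choice(d, size=k, replace=False)
            x_hat = np.zeros(d)
            x_hat[s2] = (d / k) * x[s2]
            # Unbiased estimate of the regularized squared-loss gradient.
            g = (pred_hat - target) * x_hat + lam * w
            w -= eta * g
    return w

# Demo on synthetic data: 200 examples with 10 attributes each,
# of which only 2*k = 8 are observed per example during training.
rng = np.random.default_rng(1)
X = rng.normal(size=(200, 10))
w_true = rng.normal(size=10)
y = X @ w_true + 0.01 * rng.normal(size=200)
w_learned = attribute_efficient_ridge(X, y, k=4)
```

Sampling the prediction and the gradient direction from two independent coordinate sets is what keeps the product unbiased; using a single sample for both would correlate the factors and bias the update.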

