Communication-efficient sparse regression: a one-shot approach

03/14/2015
by Jason D. Lee, et al.

We devise a one-shot approach to distributed sparse regression in the high-dimensional setting. The key idea is to average "debiased" or "desparsified" lasso estimators computed independently on each machine's local data. We show the approach converges at the same rate as the lasso on the full dataset, as long as the dataset is not split across too many machines. We also extend the approach to generalized linear models.
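The one-shot scheme described above can be sketched in a few lines: each machine fits a lasso on its local split, applies a debiasing correction, and the corrected estimates are averaged once. This is a minimal illustration, not the authors' implementation; in particular, the ridge-regularized inverse sample covariance below is a hypothetical stand-in for the nodewise-lasso precision estimate used in the debiased-lasso literature.

```python
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(0)
p, k, n_per, m = 50, 5, 200, 4            # dimension, sparsity, samples per machine, machines
theta = np.zeros(p)
theta[:k] = 1.0                            # sparse ground-truth coefficient vector

def debiased_lasso(X, y, alpha=0.1):
    """Local lasso fit followed by a one-step debiasing correction."""
    n = X.shape[0]
    th = Lasso(alpha=alpha).fit(X, y).coef_
    # Hypothetical stand-in for the nodewise-lasso precision matrix:
    # a ridge-regularized inverse of the local sample covariance.
    M = np.linalg.inv(X.T @ X / n + 0.1 * np.eye(p))
    return th + M @ X.T @ (y - X @ th) / n

# One-shot protocol: each machine debiases locally, then a single average
# is taken -- one round of communication of p numbers per machine.
estimates = []
for _ in range(m):
    X = rng.standard_normal((n_per, p))
    y = X @ theta + 0.5 * rng.standard_normal(n_per)
    estimates.append(debiased_lasso(X, y))
theta_avg = np.mean(estimates, axis=0)

print(np.linalg.norm(theta_avg - theta))   # averaged estimate tracks the truth
```

Averaging works here precisely because debiasing removes the lasso's shrinkage bias: plain averaged lasso estimates would retain a bias that does not shrink with the number of machines.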


