Compressed Regression

06/04/2007
by Shuheng Zhou, et al.

Recent research has studied the role of sparsity in high dimensional regression and signal reconstruction, establishing theoretical limits for recovering sparse models from sparse data. This line of work shows that ℓ_1-regularized least squares regression can accurately estimate a sparse linear model from n noisy examples in p dimensions, even if p is much larger than n. In this paper we study a variant of this problem where the original n input variables are compressed by a random linear transformation to m ≪ n examples in p dimensions, and establish conditions under which a sparse linear model can be successfully recovered from the compressed data. A primary motivation for this compression procedure is to anonymize the data and preserve privacy by revealing little information about the original data. We characterize the number of random projections that are required for ℓ_1-regularized compressed regression to identify the nonzero coefficients in the true model with probability approaching one, a property called "sparsistence." In addition, we show that ℓ_1-regularized compressed regression asymptotically predicts as well as an oracle linear model, a property called "persistence." Finally, we characterize the privacy properties of the compression procedure in information-theoretic terms, establishing upper bounds on the mutual information between the compressed and uncompressed data that decay to zero.
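The compress-then-regress procedure described above can be sketched in a few lines. This is a minimal illustration, not the paper's exact construction: the dimensions, noise level, regularization strength `alpha`, and the choice of an i.i.d. Gaussian projection matrix are all illustrative assumptions.

```python
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(0)

# Illustrative sizes (not from the paper): n examples, p features,
# s nonzero coefficients, compressed down to m < n rows.
n, p, s, m = 200, 50, 5, 80

# Sparse ground-truth linear model.
beta = np.zeros(p)
beta[:s] = 3.0

X = rng.standard_normal((n, p))
y = X @ beta + 0.1 * rng.standard_normal(n)

# Random linear compression: Phi has i.i.d. N(0, 1/m) entries and is
# applied to X and y jointly, so only (Phi X, Phi y) need be released.
Phi = rng.standard_normal((m, n)) / np.sqrt(m)
X_c, y_c = Phi @ X, Phi @ y

# l1-regularized least squares (Lasso) fit on the compressed data only.
lasso = Lasso(alpha=0.3, fit_intercept=False)
lasso.fit(X_c, y_c)

# Estimated support: indices with coefficients bounded away from zero.
support = np.flatnonzero(np.abs(lasso.coef_) > 1e-3)
print("recovered support:", support.tolist())
```

In this well-conditioned regime the compressed Lasso typically recovers the true support (the first s indices), which is the "sparsistence" phenomenon; the paper's contribution is to characterize how many random projections m are required for such recovery to succeed with probability approaching one.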


