Sparse Signal Processing with Linear and Nonlinear Observations: A Unified Shannon-Theoretic Approach

by Cem Aksoylar, et al.

We derive fundamental sample complexity bounds for recovering sparse and structured signals under linear and nonlinear observation models, including sparse regression, group testing, multivariate regression, and problems with missing features. In general, sparse signal processing problems can be characterized by the following Markovian property. We are given a set of N variables X_1, X_2, ..., X_N, and there is an unknown subset of variables S ⊂ {1,...,N} that are relevant for predicting outcomes Y. More specifically, when Y is conditioned on {X_n}_{n∈S}, it is conditionally independent of the remaining variables {X_n}_{n∉S}. Our goal is to identify the set S from samples of the variables X and the associated outcomes Y. We characterize this problem as a version of the noisy channel coding problem and, using asymptotic information-theoretic analyses, establish mutual information formulas that provide necessary and sufficient conditions on the number of samples required to successfully recover the salient variables. These mutual information expressions unify the conditions for both linear and nonlinear observations. We then compute sample complexity bounds for the aforementioned models based on these mutual information expressions, demonstrating the applicability and flexibility of our results to general sparse signal processing models.
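The salient-set recovery problem described above can be illustrated with a toy simulation. The sketch below is a hypothetical instance (all variable names and parameter values are assumptions for illustration, not taken from the paper): it draws N candidate variables, generates outcomes Y from a sparse linear model that depends only on a hidden support S, and then recovers S by "channel decoding" over candidate supports, i.e., picking the K-subset whose least-squares fit best explains Y (maximum likelihood under Gaussian noise).

```python
import itertools
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy instance: N variables, hidden support of size K, T samples.
N, K, T = 8, 2, 200
S_true = (1, 5)                      # the unknown salient set (fixed for the demo)
beta = np.array([1.0, -1.0])         # coefficients on the salient variables

X = rng.standard_normal((T, N))
# Y depends only on X_S: conditioned on {X_n : n in S}, it is
# independent of the remaining variables {X_n : n not in S}.
Y = X[:, list(S_true)] @ beta + 0.1 * rng.standard_normal(T)

def residual(support):
    """Sum of squared residuals of the least-squares fit on a candidate support."""
    Xs = X[:, list(support)]
    coef, *_ = np.linalg.lstsq(Xs, Y, rcond=None)
    return np.sum((Y - Xs @ coef) ** 2)

# Exhaustive decoding over all K-subsets, analogous to ML decoding of a codeword.
S_hat = min(itertools.combinations(range(N), K), key=residual)
print(S_hat)
```

For this small N the exhaustive search over all K-subsets is feasible; the paper's mutual information conditions characterize how many samples T such a decoder needs in the asymptotic regime, without prescribing this particular brute-force implementation.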



Approximations of Shannon Mutual Information for Discrete Variables with Applications to Neural Population Coding

Although Shannon mutual information has been widely used, its effective ...

Compressed Regression

Recent research has studied the role of sparsity in high dimensional reg...

Limits on Support Recovery with Probabilistic Models: An Information-Theoretic Framework

The support recovery problem consists of determining a sparse subset of ...

Information-theoretic limits of Bayesian network structure learning

In this paper, we study the information-theoretic limits of learning the...

A Mutual Contamination Analysis of Mixed Membership and Partial Label Models

Many machine learning problems can be characterized by mutual contaminat...

Testing noisy linear functions for sparsity

We consider the following basic inference problem: there is an unknown h...

Information Recovery from Pairwise Measurements

This paper is concerned with jointly recovering n node-variables { x_i}_...
