Information Theoretic Limits for Linear Prediction with Graph-Structured Sparsity

01/26/2017
by Adarsh Barik, et al.

We analyze the number of samples necessary for sparse vector recovery in a noisy linear prediction setup, a model that includes problems such as linear regression and classification. We focus on structured graph sparsity models. In particular, we prove that the number of samples known to be sufficient for the weighted graph model proposed by Hegde et al. is also necessary. Our main tool in establishing information-theoretic lower bounds is Fano's inequality applied to carefully constructed ensembles.
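The full text spells out the construction, but the generic shape of a Fano-based lower bound is worth recalling (a sketch only; the ensemble size M, the per-sample information bound \beta, and the target error \delta below are placeholder symbols, not the paper's notation):

% Fano's inequality for an M-ary hypothesis test: W uniform over M
% candidate sparse vectors, \hat{W} any estimator computed from Y.
\[
  \Pr[\hat{W} \neq W] \;\ge\; 1 - \frac{I(W;Y) + \log 2}{\log M}.
\]
% If n i.i.d. samples give I(W;Y) \le n\beta, then achieving error
% probability at most \delta forces the sample-complexity lower bound
\[
  n \;\ge\; \frac{(1-\delta)\log M - \log 2}{\beta}.
\]

The work in such proofs lies in building an ensemble that is simultaneously large (so \log M is big) and statistically hard to distinguish (so \beta is small), here additionally constrained to respect the weighted graph sparsity model.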

Related research

11/16/2018
Information Theoretic Limits for Standard and One-Bit Compressed Sensing with Graph-Structured Sparsity
In this paper, we analyze the information theoretic lower bound on the n...

01/27/2016
Information-theoretic limits of Bayesian network structure learning
In this paper, we study the information-theoretic limits of learning the...

03/31/2020
Information-Theoretic Lower Bounds for Zero-Order Stochastic Gradient Estimation
In this paper we analyze the necessary number of samples to estimate the...

02/12/2018
Region Detection in Markov Random Fields: Gaussian Case
In this work we consider the problem of model selection in Gaussian Mark...

04/06/2017
On the Statistical Efficiency of Compositional Nonparametric Prediction
In this paper, we propose a compositional nonparametric method in which ...

07/18/2012
On the Statistical Efficiency of ℓ_1,p Multi-Task Learning of Gaussian Graphical Models
In this paper, we present ℓ_1,p multi-task structure learning for Gaussi...

02/10/2023
Graph-Theoretic Analyses and Model Reduction for an Open Jackson Queueing Network
A graph-theoretic analysis of the steady-state behavior of an open Jacks...
