
Sparse Volterra and Polynomial Regression Models: Recoverability and Estimation

03/03/2011
by Vassilis Kekatos, et al.

Volterra and polynomial regression models play a major role in nonlinear system identification and inference tasks. Exciting applications ranging from neuroscience to genome-wide association analysis build on these models with the additional requirement of parsimony. This requirement has high interpretative value, but unfortunately it cannot be met by least-squares-based or kernel regression methods. To this end, compressed sampling (CS) approaches, already successful in linear regression settings, can offer a viable alternative. The viability of CS for sparse Volterra and polynomial models is the core theme of this work. A common sparse regression task is initially posed for the two models. Building on (weighted) Lasso-based schemes, an adaptive RLS-type algorithm is developed for sparse polynomial regressions. The identifiability of polynomial models is critically challenged by dimensionality; however, following the CS principle, sparse models can be recovered from far fewer measurements. To quantify the number of measurements sufficient for a given level of sparsity, restricted isometry properties (RIP) are investigated in commonly met polynomial regression settings, generalizing known results for their linear counterparts. The merits of the novel (weighted) adaptive CS algorithms for sparse polynomial modeling are verified through synthetic as well as real data tests for genotype-phenotype analysis.
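To make the sparse regression task concrete, the sketch below is a rough illustration only, not the paper's weighted-Lasso or adaptive RLS-type algorithm: it expands the inputs into second-order polynomial features and fits an l1-penalized (Lasso) regression using fewer measurements than unknown coefficients. The use of scikit-learn, the problem sizes, and the regularization weight are illustrative choices of this sketch, not taken from the paper.

# Minimal sketch: recover a sparse quadratic (second-order Volterra-like) model
# from fewer measurements than candidate coefficients, via l1-penalized regression.
import numpy as np
from sklearn.preprocessing import PolynomialFeatures
from sklearn.linear_model import Lasso

rng = np.random.default_rng(0)

p, n = 20, 120                          # 20 inputs -> 231 quadratic monomials, only 120 samples
X = rng.standard_normal((n, p))

# Expand inputs into all monomials up to degree 2 (constant, linear, and cross terms).
poly = PolynomialFeatures(degree=2, include_bias=True)
Phi = poly.fit_transform(X)             # shape (n, 231)

# Ground truth: only a handful of the 231 coefficients are nonzero (parsimony).
true_coef = np.zeros(Phi.shape[1])
support = rng.choice(Phi.shape[1], size=5, replace=False)
true_coef[support] = rng.standard_normal(5)
y = Phi @ true_coef + 0.01 * rng.standard_normal(n)

# l1-penalized least squares promotes a sparse coefficient vector.
lasso = Lasso(alpha=0.01, fit_intercept=False, max_iter=10000)
lasso.fit(Phi, y)

print("nonzero coefficients found:", np.flatnonzero(np.abs(lasso.coef_) > 1e-3))
print("true support:              ", np.sort(support))

With n well below the 231 candidate monomials, the l1 penalty can still identify the few active terms, which illustrates the CS-style recoverability the abstract refers to; the RIP analysis in the paper is what quantifies how many measurements suffice for a given sparsity level.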

Related research

07/31/2020 ∙ On regularization methods based on Rényi's pseudodistances for sparse high-dimensional linear regression models
Several regularization methods have been considered over the last decade...

03/02/2022 ∙ Are Latent Factor Regression and Sparse Regression Adequate?
We propose the Factor Augmented sparse linear Regression Model (FARM) th...

06/18/2013 ∙ On the Fundamental Limits of Recovering Tree Sparse Vectors from Noisy Linear Measurements
Recent breakthrough results in compressive sensing (CS) have established...

01/09/2016 ∙ On Computationally Tractable Selection of Experiments in Measurement-Constrained Regression Models
We derive computationally tractable methods to select a small subset of ...

08/18/2019 ∙ Greedy Algorithms for Hybrid Compressed Sensing
Compressed sensing (CS) is a technique which uses fewer measurements tha...

02/04/2020 ∙ Sparse Polynomial Chaos Expansions: Literature Survey and Benchmark
Sparse polynomial chaos expansions are a popular surrogate modelling met...

06/13/2018 ∙ Polynomial Regression As an Alternative to Neural Nets
Despite the success of neural networks (NNs), there is still a concern a...