
Sparse Volterra and Polynomial Regression Models: Recoverability and Estimation

by Vassilis Kekatos, et al.

Volterra and polynomial regression models play a major role in nonlinear system identification and inference tasks. Exciting applications ranging from neuroscience to genome-wide association analysis build on these models with the additional requirement of parsimony. This requirement has high interpretative value, but unfortunately cannot be met by least-squares-based or kernel regression methods. To this end, compressed sampling (CS) approaches, already successful in linear regression settings, can offer a viable alternative. The viability of CS for sparse Volterra and polynomial models is the core theme of this work. A common sparse regression task is initially posed for the two models. Building on (weighted) Lasso-based schemes, an adaptive RLS-type algorithm is developed for sparse polynomial regressions. The identifiability of polynomial models is critically challenged by dimensionality. However, following the CS principle, when these models are sparse, they can be recovered from far fewer measurements. To quantify the number of measurements sufficient for a given level of sparsity, restricted isometry properties (RIP) are investigated in commonly met polynomial regression settings, generalizing known results for their linear counterparts. The merits of the novel (weighted) adaptive CS algorithms for sparse polynomial modeling are verified through synthetic as well as real data tests for genotype-phenotype analysis.
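The abstract's central idea, recovering a sparse polynomial (second-order Volterra-like) model from fewer measurements than candidate regressors via a Lasso estimator, can be sketched as follows. This is a minimal illustrative sketch, not the paper's actual algorithm: the monomial feature layout, the ISTA solver, and all parameter choices (sample size, sparsity pattern, regularization weight) are assumptions made for the example.

```python
import numpy as np

def poly_features(X):
    """Second-order expansion: intercept, linear terms x_i, and products x_i*x_j."""
    n, p = X.shape
    cols = [np.ones(n)] + [X[:, i] for i in range(p)]
    for i in range(p):
        for j in range(i, p):
            cols.append(X[:, i] * X[:, j])
    return np.column_stack(cols)

def lasso_ista(Phi, y, lam=0.5, iters=5000):
    """Lasso via iterative soft-thresholding (ISTA) on 0.5*||Phi w - y||^2 + lam*||w||_1."""
    L = np.linalg.norm(Phi, 2) ** 2          # Lipschitz constant of the gradient
    w = np.zeros(Phi.shape[1])
    for _ in range(iters):
        z = w - Phi.T @ (Phi @ w - y) / L    # gradient step
        w = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)  # soft threshold
    return w

# Hypothetical demo: a 2-sparse quadratic model recovered from n = 40
# noiseless measurements of m = 45 candidate regressors (underdetermined).
rng = np.random.default_rng(0)
n, p = 40, 8
X = rng.standard_normal((n, p))
Phi = poly_features(X)                        # 1 + 8 + 36 = 45 columns
w_true = np.zeros(Phi.shape[1])
w_true[3], w_true[10] = 2.0, -1.5             # one linear term, one bilinear term
y = Phi @ w_true
w_hat = lasso_ista(Phi, y)
```

Although least squares has no unique solution here (45 unknowns, 40 equations), the ℓ1 penalty recovers the sparse coefficient vector, which is the CS recoverability phenomenon whose sufficient measurement counts the paper quantifies through RIP analysis of polynomial regression matrices.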
