Lasso for hierarchical polynomial models

01/21/2020
by Hugo Maruri-Aguilar, et al.

In a polynomial regression model, the divisibility conditions implicit in the polynomial hierarchy give rise to a natural construction of constraints on the model parameters. We use this principle to derive versions of strong and weak hierarchy and to extend existing work in the literature, which so far has been concerned only with models of degree two. We discuss how to estimate the lasso parameters using standard quadratic programming techniques, and we apply our proposal both to simulated data and to examples from the literature. The proposed methodology compares favorably with existing techniques in terms of validation error and model size.
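To illustrate the idea, here is a hypothetical sketch (not the authors' code) of a degree-two lasso with a weak-hierarchy constraint on an interaction term, solved as a constrained program with SciPy's SLSQP solver. The lasso penalty is made smooth by splitting each coefficient into positive and negative parts, which turns the problem into a quadratic program with linear constraints; the constraint `|beta_12| <= |beta_1| + |beta_2|` is one common form of weak hierarchy. All variable names and the data-generating setup are illustrative assumptions.

```python
# Hypothetical sketch: weak-hierarchy lasso for a degree-2 model,
# solved as a quadratic program via scipy's SLSQP.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
n = 200
x1, x2 = rng.normal(size=n), rng.normal(size=n)
X = np.column_stack([x1, x2, x1 * x2])        # main effects + interaction
y = 1.5 * x1 + 0.8 * x2 + 0.5 * x1 * x2 + 0.1 * rng.normal(size=n)

lam = 1.0                                      # lasso penalty weight
p = X.shape[1]

def objective(z):
    # z = [beta_plus, beta_minus]; beta = beta_plus - beta_minus,
    # so the l1 penalty becomes the linear term lam * sum(bp + bm).
    bp, bm = z[:p], z[p:]
    resid = y - X @ (bp - bm)
    return 0.5 * resid @ resid + lam * np.sum(bp + bm)

def hierarchy(z):
    # Weak hierarchy: |beta_12| <= |beta_1| + |beta_2|, written with the
    # split variables as absolute-value surrogates; must be >= 0.
    a = z[:p] + z[p:]
    return a[0] + a[1] - a[2]

res = minimize(objective, np.zeros(2 * p),
               bounds=[(0, None)] * (2 * p),
               constraints=[{"type": "ineq", "fun": hierarchy}],
               method="SLSQP")
beta_hat = res.x[:p] - res.x[p:]
```

With a moderate penalty the fit recovers coefficients close to the generating values while the interaction stays dominated by its parent main effects, which is the behavior the hierarchy constraint enforces.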


