Inertial Regularization and Selection (IRS): Sequential Regression in High-Dimension and Sparsity

10/23/2016
by Chitta Ranjan, et al.

In this paper, we develop a new sequential regression modeling approach for data streams. Data streams are commonly found around us; for example, in a retail enterprise, sales data are collected continuously every day. A demand forecasting model is an important outcome from these data, and it needs to be continuously updated as new data arrive. The main challenges in such modeling arise when there is a) high dimensionality and sparsity, b) a need for adaptive use of prior knowledge, and/or c) structural change in the system. The proposed approach addresses these challenges by incorporating an adaptive L1-penalty and an inertia term in the loss function, and is thus called Inertial Regularization and Selection (IRS). The former term performs model selection to handle the first challenge, while the latter is shown to address the last two. A recursive estimation algorithm is developed and shown to outperform commonly used state-space models, such as Kalman Filters, in experimental studies and on real data.
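To make the idea concrete, a sequential objective of this general shape is sketched below. This is only an illustration under assumed notation (design matrix X_t, response y_t, penalty weights lambda_t and mu_t); the exact IRS loss, its adaptive weighting, and the precise form of the inertia term are defined in the paper.

% Illustrative sketch only, not the paper's exact objective.
% First term: fit to the newest data batch.
% Second term: adaptive L1 (lasso) penalty, performing variable selection under sparsity.
% Third term: inertia penalty pulling the new estimate toward the previous one.
\[
\hat{\beta}_t \;=\; \arg\min_{\beta}\;
\lVert y_t - X_t \beta \rVert_2^2
\;+\; \lambda_t \lVert \beta \rVert_1
\;+\; \mu_t \lVert \beta - \hat{\beta}_{t-1} \rVert_2^2
\]

In this sketch, the inertia term encodes prior knowledge by discouraging large departures from the previous estimate, and a smaller inertia weight would let the model adapt when the system undergoes structural change; the weights and their adaptation rules here are placeholders for the paper's actual construction.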


research · 03/24/2016
Pathway Lasso: Estimate and Select Sparse Mediation Pathways with High Dimensional Mediators
In many scientific studies, it becomes increasingly important to delinea...

research · 04/11/2020
Robust adaptive variable selection in ultra-high dimensional regression models based on the density power divergence loss
We consider the problem of simultaneous model selection and the estimati...

research · 09/06/2018
Sequential Model Selection Method for Nonparametric Autoregression
In this paper for the first time the nonparametric autoregression estima...

research · 12/06/2014
A Likelihood Ratio Framework for High Dimensional Semiparametric Regression
We propose a likelihood ratio based inferential framework for high dimen...

research · 07/07/2021
Variable selection in convex quantile regression: L1-norm or L0-norm regularization?
The curse of dimensionality is a recognized challenge in nonparametric e...

research · 06/27/2012
A Dantzig Selector Approach to Temporal Difference Learning
LSTD is a popular algorithm for value function approximation. Whenever t...

research · 04/04/2022
Low Tree-Rank Bayesian Vector Autoregression Model
Vector autoregressions have been widely used for modeling and analysis o...
