On Optimal and Feasible Regularization in Linear Models in Time Series

07/22/2020
by Erez Buchweitz, et al.

We discuss predictive linear modeling in the presence of: (i) stochastic regressors that are contemporaneously, but not strictly, exogenous to the residual noise; (ii) autocorrelation and heteroscedasticity of unknown structure in the regressors and residual noise. Both are prevalent in time series data. In such settings, ordinary least squares (OLS) is often the preferred estimator in practice due to its guaranteed consistency. However, it exhibits instability tied to distinctive features of the covariates, such as autocorrelation and association with random effects. In this paper, we attempt to mitigate this drawback of OLS through well-informed regularization. We show that, given ideal knowledge of the covariance of an estimator, a maximum a posteriori (MAP) regularized estimator can be devised that retains desirable properties even when the underlying estimator is misspecified. We give particular consideration to ridge regularization, although the method applies more widely, and generalizations are discussed. For the well-regularized estimator to be employed in practice, we detail a three-stage method for estimating the OLS covariance, comprising estimation, shrinkage, and normalization. The estimated covariance is then used to form a feasible regularized estimator that is both well-adjusted to the data and consistent.
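The abstract only outlines the pipeline, so the following numpy sketch shows one plausible reading of it, not the paper's exact construction. The Newey-West (HAC) covariance in stage one, the identity-target shrinkage in stage two, the unit-trace normalization in stage three, and the generalized-ridge penalty weighted by the inverse covariance are all illustrative assumptions; every function name and parameter below (e.g. `lam`, `max_lag`, `alpha`) is hypothetical.

```python
# Sketch: three-stage OLS covariance pipeline (estimate -> shrink -> normalize)
# feeding a covariance-informed generalized-ridge estimator. Illustrative only.
import numpy as np

def ols_hac_covariance(X, y, max_lag=5):
    """Stage 1 (assumed): HAC (Newey-West) estimate of the OLS coefficient
    covariance, robust to autocorrelation and heteroscedasticity of
    unknown structure."""
    XtX_inv = np.linalg.inv(X.T @ X)
    beta_ols = XtX_inv @ X.T @ y
    u = y - X @ beta_ols                         # OLS residuals
    Xu = X * u[:, None]                          # score contributions x_t * u_t
    S = Xu.T @ Xu                                # lag-0 term
    for k in range(1, max_lag + 1):
        w = 1.0 - k / (max_lag + 1.0)            # Bartlett kernel weight
        Gk = Xu[k:].T @ Xu[:-k]                  # lag-k autocovariance of scores
        S += w * (Gk + Gk.T)
    return beta_ols, XtX_inv @ S @ XtX_inv       # sandwich covariance

def shrink(Sigma, alpha=0.2):
    """Stage 2 (assumed): shrink toward a scaled identity to stabilize
    the noisy covariance estimate."""
    p = Sigma.shape[0]
    target = (np.trace(Sigma) / p) * np.eye(p)
    return (1.0 - alpha) * Sigma + alpha * target

def normalize(Sigma):
    """Stage 3 (assumed): rescale to unit trace so the overall penalty
    strength is controlled by lam alone."""
    return Sigma / np.trace(Sigma)

def feasible_regularized_estimator(X, y, lam=1.0, max_lag=5, alpha=0.2):
    """Generalized ridge: argmin_b ||y - Xb||^2 + lam * b' Sigma^{-1} b,
    with Sigma the processed OLS covariance estimate."""
    _, Sigma = ols_hac_covariance(X, y, max_lag)
    Sigma = normalize(shrink(Sigma, alpha))
    return np.linalg.solve(X.T @ X + lam * np.linalg.inv(Sigma), X.T @ y)
```

Under a zero-mean Gaussian prior with covariance proportional to Sigma, this generalized-ridge solution coincides with the MAP estimate, which is the connection the abstract draws between covariance knowledge and regularization.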
