Ridge Regularized Estimation of VAR Models for Inference and Sieve Approximation

05/03/2021
by   Giovanni Ballarin, et al.

Developments in statistical learning have fueled the analysis of high-dimensional time series. However, even in low-dimensional contexts the issues arising from ill-conditioned regression problems are well known. Because linear time series modeling is naturally prone to such issues, I propose to apply ridge regression to the estimation of dense VAR models. Theoretical non-asymptotic results concerning the addition of a ridge-type penalty to the least squares estimator are discussed, while standard asymptotic and inference techniques are proven to be valid under mild conditions on the regularizer. The proposed estimator is then applied to the problem of sieve approximation of VAR(∞) processes under moderately harsh sample sizes. Simulation evidence is used to discuss the small sample properties of the ridge estimator (RLS) when compared to least squares and local projection approaches: I use a Monte Carlo exercise to argue that RLS with a lag-adapted cross-validated regularizer achieves meaningfully better performance in recovering impulse response functions and asymptotic confidence intervals than other common approaches.
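To make the core idea concrete, the ridge-regularized least squares (RLS) estimator of a VAR(p) has a closed form: stacking the p lags of the series into a design matrix Z, the coefficient estimate is (Z'Z + λI)⁻¹Z'Y, which reduces to ordinary least squares at λ = 0. The sketch below is an illustrative implementation under my own assumptions (the function name `ridge_var`, a simulated stable VAR(1), and a single scalar penalty rather than the paper's lag-adapted cross-validated regularizer); it is not the author's code.

```python
import numpy as np

def ridge_var(Y, p, lam):
    """Ridge-regularized least squares (RLS) estimate of a VAR(p).

    Y   : (T, k) array of observations
    p   : lag order
    lam : ridge penalty (lam = 0 recovers ordinary least squares)

    Returns a (k*p, k) coefficient matrix stacking [A_1; ...; A_p],
    so that y_t ≈ [y_{t-1}, ..., y_{t-p}] @ B.
    """
    T, k = Y.shape
    # Lagged design matrix: row t holds [y_{t-1}, ..., y_{t-p}]
    Z = np.hstack([Y[p - j - 1 : T - j - 1] for j in range(p)])
    # Closed-form ridge solution: (Z'Z + lam*I)^{-1} Z'Y
    G = Z.T @ Z + lam * np.eye(k * p)
    return np.linalg.solve(G, Z.T @ Y[p:])

# Simulate a stable bivariate VAR(1) as a toy data set
rng = np.random.default_rng(0)
A = np.array([[0.5, 0.1],
              [0.0, 0.4]])  # spectral radius < 1, so the process is stable
T, k = 200, 2
Y = np.zeros((T, k))
for t in range(1, T):
    Y[t] = Y[t - 1] @ A.T + rng.standard_normal(k)

B_ols   = ridge_var(Y, p=1, lam=0.0)   # plain least squares
B_ridge = ridge_var(Y, p=1, lam=5.0)   # shrunk toward zero
```

In practice λ would be chosen by cross-validation, and the paper's lag-adapted variant penalizes coefficients at different lags differently; the scalar penalty here is only the simplest case.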
