Bayesian inference on the order of stationary vector autoregressions

07/11/2023
by Rachel L. Binks, et al.

Vector autoregressions (VARs) have an associated order p: conditional on observations at the preceding p time points, the variable at time t is conditionally independent of all earlier history. Learning the order of the model is therefore vital for its characterisation and subsequent use in forecasting. It is common to assume that a VAR is stationary. This prevents the predictive variance of the process from increasing without bound as the forecast horizon grows and facilitates interpretation of the relationships between variables. A VAR is stable, and hence admits a stationary solution, if and only if the roots of its characteristic equation lie outside the unit circle; enforcing stationarity therefore constrains the autoregressive coefficient matrices to lie in the stationary region. Unfortunately, the geometry of the stationary region is very complicated, which impedes specification of a prior. In this work, the autoregressive coefficients are mapped to a set of transformed partial autocorrelation matrices which are unconstrained, allowing for straightforward prior specification, routine computational inference, and meaningful interpretation of the magnitude of the elements in the matrix. The multiplicative gamma process is used to build a prior for the unconstrained matrices which encourages increasing shrinkage of the partial autocorrelation parameters as the lag increases. Identifying the lag beyond which the partial autocorrelations become equal to zero then determines the order of the process. Posterior inference is performed using Hamiltonian Monte Carlo via Stan. A truncation criterion is used to determine whether a partial autocorrelation matrix has been effectively shrunk to zero; the value of the truncation threshold is motivated by classical theory on the sampling distribution of the partial autocorrelation function. The approach is applied to neural activity data in order to investigate ultradian rhythms in the brain.
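
To make the reparameterisation and the truncation rule concrete, the sketch below works through a simplified scalar analogue in Python; the paper itself works with matrix-valued partial autocorrelations and fits the model with Hamiltonian Monte Carlo in Stan. The function names, the tanh link, the multiplicative gamma hyperparameters and the 1.96/sqrt(n) threshold are illustrative assumptions rather than details taken from the paper.

```python
import numpy as np


def unconstrained_to_pacf(z):
    """Map unconstrained reals to (-1, 1); the paper applies an analogous
    transformation to square matrices so that each partial autocorrelation
    matrix is unconstrained on the transformed scale."""
    return np.tanh(z)


def pacf_to_ar(partials):
    """Scalar Durbin-Levinson recursion: partial autocorrelations
    r_1, ..., r_p in (-1, 1) are mapped to the coefficients of an AR(p)
    model that is stationary by construction."""
    phi = np.array([])
    for r in partials:
        phi = np.concatenate([phi - r * phi[::-1], [r]])
    return phi


def mgp_prior_sd(p, a1=2.0, a2=3.0, rng=None):
    """Simplified multiplicative gamma process: cumulative products of
    gamma variables give precisions tau_h that tend to grow with lag h,
    so the prior standard deviations 1/sqrt(tau_h) shrink towards zero."""
    rng = np.random.default_rng(rng)
    delta = rng.gamma(shape=np.r_[a1, np.full(p - 1, a2)], scale=1.0)
    tau = np.cumprod(delta)
    return 1.0 / np.sqrt(tau)


def effective_order(partials, n, z=1.96):
    """Truncation rule: beyond the true order, sample partial
    autocorrelations are approximately N(0, 1/n), so lags with
    |r_h| < z / sqrt(n) are treated as effectively zero."""
    keep = np.abs(np.asarray(partials)) >= z / np.sqrt(n)
    return int(np.nonzero(keep)[0].max()) + 1 if keep.any() else 0


# Hypothetical usage: draw lag-shrinking partials, map to a stationary AR model.
rng = np.random.default_rng(1)
p_max, n_obs = 6, 200
r = unconstrained_to_pacf(rng.normal(size=p_max) * mgp_prior_sd(p_max, rng=rng))
print(pacf_to_ar(r))              # stationary AR coefficients
print(effective_order(r, n_obs))  # inferred order after truncation
```

In this scalar sketch the stationarity constraint reduces to each partial autocorrelation lying in (-1, 1); the multivariate version used in the paper replaces that interval with a matrix constraint and a corresponding recursion between partial autocorrelation matrices and autoregressive coefficient matrices.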
