A Large Deviation Approach to Posterior Consistency in Dynamical Systems
In this paper, we provide asymptotic results concerning (generalized) Bayesian inference for certain dynamical systems based on a large deviation approach. Given a sequence of observations y, a class of model processes parameterized by θ ∈ Θ, each of which can be characterized as a stochastic process X^θ or as a measure μ_θ, and a loss function L that measures the error between y and a realization of X^θ, we specify the generalized posterior distribution π_t(θ | y). The goal of this paper is to study the asymptotic behavior of π_t(θ | y) as t → ∞. In particular, we state conditions on the model family {μ_θ}_{θ∈Θ} and the loss function L under which the posterior distribution converges. We require two conditions: (1) a conditional large deviation behavior for a single X^θ, and (2) an exponential continuity condition, over the model family, for the map from the parameter θ to the loss incurred between X^θ and the observation sequence y. The proposed framework is quite general; we apply it to two very different classes of dynamical systems: continuous-time hypermixing processes and Gibbs processes on shifts of finite type. We also show that the generalized posterior distribution concentrates asymptotically on those parameters that minimize the expected loss plus a divergence term, thereby establishing posterior consistency.
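As a sketch of the object under study: the abstract does not display the posterior explicitly, but generalized posteriors of this kind typically follow the standard Gibbs-posterior construction. The notation L_t (accumulated loss up to time t) and the prior π_0 below are assumptions for illustration, not taken verbatim from the text.

```latex
% Hedged sketch of a generalized (Gibbs) posterior at time t.
% \pi_0 is a prior on \Theta; L_t(\theta, y) denotes the loss
% accrued between the observation path y and the model process
% X^\theta up to time t (notation assumed, not fixed by the abstract).
\pi_t(d\theta \mid y)
  \;=\; \frac{\exp\!\bigl(-t\, L_t(\theta, y)\bigr)\, \pi_0(d\theta)}
             {\int_\Theta \exp\!\bigl(-t\, L_t(\theta', y)\bigr)\, \pi_0(d\theta')}.
```

Under this form, the two stated conditions (a conditional large deviation principle for each X^θ and exponential continuity of θ ↦ loss) are what control the exponential decay of posterior mass away from the loss-minimizing parameters as t → ∞.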