ACID: A Low Dimensional Characterization of Markov-Modulated and Self-Exciting Counting Processes

by   Mark Sinzger-D'Angelo, et al.

The conditional intensity (CI) of a counting process Y_t is based on the minimal knowledge ℱ_t^Y, i.e., on the observation of Y_t alone. Prominently, the mutual information rate of a signal and its Poisson channel output is a difference functional between the CI and the intensity that has full knowledge about the input. While the CI of Markov-modulated Poisson processes evolves according to Snyder's filter, self-exciting processes, e.g., Hawkes processes, specify the CI via the history of Y_t. The emergence of the CI as a self-contained stochastic process prompts us to bring its statistical ensemble into focus. We investigate the asymptotic conditional intensity distribution (ACID) and emphasize its rich information content. We consider the case in which the CI is determined by a sufficient statistic that progresses as a Markov process. We present a simulation-free method to compute the ACID when the dimension of the sufficient statistic is low. The method is made possible by introducing a backward recurrence time parametrization, which has the advantage of aligning all probability inflow in a boundary condition for the master equation. Case studies illustrate the use of ACID for three primary examples: 1) the Poisson channel with binary Markovian input (as an example of a Markov-modulated Poisson process), 2) the standard Hawkes process with exponential kernel (as an example of a self-exciting counting process), and 3) the Gamma filter (as an example of an approximate filter for a Markov-modulated Poisson process).
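To make the second case study concrete: for a standard Hawkes process with exponential kernel, the CI is λ(t) = μ + Σ_{t_i < t} α e^{−β(t−t_i)}, which is itself a Markov process (it jumps by α at each event and decays toward μ at rate β in between). The sketch below, which is illustrative rather than the paper's simulation-free method, simulates such a process by Ogata's thinning algorithm and samples the CI on a time grid, giving a Monte Carlo stand-in for the ACID; the parameter values are arbitrary choices, not taken from the paper.

```python
import math
import random

def simulate_hawkes(mu, alpha, beta, t_end, seed=3):
    """Simulate a Hawkes process with exponential kernel by Ogata's thinning.

    Conditional intensity: lambda(t) = mu + sum_{t_i < t} alpha*exp(-beta*(t - t_i)).
    Requires alpha < beta for a stationary regime. Returns the accepted event times.
    """
    rng = random.Random(seed)
    t, lam = 0.0, mu          # current time and intensity right-limit at t
    events = []
    while True:
        lam_bar = lam          # upper bound: intensity only decays until next event
        t_cand = t + rng.expovariate(lam_bar)
        if t_cand >= t_end:
            break
        # decay the intensity to the candidate time
        lam_cand = mu + (lam - mu) * math.exp(-beta * (t_cand - t))
        t = t_cand
        if rng.random() <= lam_cand / lam_bar:  # accept with prob. lam_cand/lam_bar
            lam = lam_cand + alpha              # self-excitation jump
            events.append(t)
        else:
            lam = lam_cand
    return events

def intensity_on_grid(events, mu, alpha, beta, times):
    """Evaluate the CI on a sorted time grid; the histogram of these samples
    approximates the asymptotic conditional intensity distribution (ACID)."""
    lams, excite, last_t, j = [], 0.0, 0.0, 0
    for t in times:
        excite *= math.exp(-beta * (t - last_t))   # decay of older contributions
        while j < len(events) and events[j] < t:   # add events since last grid point
            excite += alpha * math.exp(-beta * (t - events[j]))
            j += 1
        last_t = t
        lams.append(mu + excite)
    return lams

if __name__ == "__main__":
    mu, alpha, beta = 1.0, 0.5, 1.0
    events = simulate_hawkes(mu, alpha, beta, t_end=2000.0)
    grid = [0.5 * k for k in range(1, 4000)]
    lams = intensity_on_grid(events, mu, alpha, beta, grid)
    # Stationary mean intensity is mu*beta/(beta - alpha) = 2 for these values.
    print(sum(lams) / len(lams))
```

By construction every sampled CI value is at least μ, and for α < β the empirical mean of the samples should be close to the stationary mean μβ/(β−α); the ACID concentrates its mass on [μ, ∞).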

