Flexible Triggering Kernels for Hawkes Process Modeling

02/03/2022
by Yamac Alican Isik, et al.

Recently proposed encoder-decoder structures for modeling Hawkes processes use transformer-inspired architectures, which encode the history of events via embeddings and self-attention mechanisms. These models deliver better prediction and goodness-of-fit than their RNN-based counterparts. However, they often incur high computational and memory costs and sometimes fail to adequately capture the triggering function of the underlying process. So motivated, we introduce an efficient and general encoding of the historical event sequence by replacing the complex (multilayered) attention structures with triggering kernels of the observed data. Noting the similarity between the triggering kernels of a point process and attention scores, we use a triggering kernel to replace the weights used to build history representations. Our estimate of the triggering function is equipped with a sigmoid gating mechanism that captures local-in-time triggering effects that are otherwise challenging to model with standard decaying-over-time kernels. Further, taking both event-type representations and temporal embeddings as inputs, the model learns the underlying type-time triggering kernel parameters for each pair of event types. We present experiments on synthetic and real data sets widely used by competing models, and further include a COVID-19 dataset to illustrate a scenario where longitudinal covariates are available. Results show the proposed model outperforms existing approaches while being more efficient in terms of computational complexity and yielding interpretable results via direct application of the newly introduced kernel.
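To make the idea concrete, below is a minimal sketch of a sigmoid-gated triggering kernel used inside a Hawkes conditional intensity. It is not the paper's parameterization (the paper learns kernel parameters from event-type and temporal embeddings); the function names and parameter values here (`gated_triggering_kernel`, `gate_center`, `gate_scale`, etc.) are illustrative assumptions only.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def gated_triggering_kernel(dt, alpha=1.0, beta=1.0, gate_center=0.5, gate_scale=10.0):
    """Hypothetical sigmoid-gated triggering kernel phi(dt), dt >= 0.

    An exponential decay alpha * beta * exp(-beta * dt) is modulated by a
    sigmoid gate, so excitation can peak some time after an event
    (a local-in-time bump) instead of decaying monotonically from dt = 0.
    All names and defaults are illustrative, not the paper's.
    """
    decay = alpha * beta * np.exp(-beta * dt)
    gate = sigmoid(gate_scale * (dt - gate_center))  # suppresses very small dt
    return gate * decay

def intensity(t, event_times, mu=0.1, **kernel_kwargs):
    """Conditional intensity lambda(t) = mu + sum_i phi(t - t_i) over past events."""
    past = np.asarray([ti for ti in event_times if ti < t])
    if past.size == 0:
        return mu
    return mu + gated_triggering_kernel(t - past, **kernel_kwargs).sum()

# Usage: evaluate the intensity shortly after a burst of events.
history = [0.2, 0.9, 1.1]
print(intensity(1.5, history))
```

In the paper's setting, the analogue of `gated_triggering_kernel` plays the role of the attention weights: the kernel value for each (event-type pair, time gap) determines how strongly a past event contributes to the history representation.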
