Transformer Hawkes Process

02/21/2020
by Simiao Zuo, et al.

Modern data acquisition routinely produces massive amounts of event sequence data in various domains, such as social media, healthcare, and financial markets. These data often exhibit complicated short-term and long-term temporal dependencies. However, most existing recurrent neural network-based point process models fail to capture such dependencies and yield unreliable prediction performance. To address this issue, we propose a Transformer Hawkes Process (THP) model, which leverages the self-attention mechanism to capture long-term dependencies while remaining computationally efficient. Numerical experiments on various datasets show that THP outperforms existing models in terms of both likelihood and event prediction accuracy by a notable margin. Moreover, THP is quite general and can incorporate additional structural knowledge. We provide a concrete example in which THP achieves improved prediction performance for learning multiple point processes when incorporating their relational information.
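For readers unfamiliar with the two ingredients the abstract combines, the sketch below shows (1) the conditional intensity of a classical univariate Hawkes process with an exponential kernel, and (2) causally masked self-attention over a sequence of event embeddings, which is the mechanism that lets every event attend directly to all earlier events rather than passing information through a recurrent state. This is background illustration only, not the paper's THP architecture (which adds temporal encodings, multi-head attention, and a learned intensity head); all function names and parameter values here are illustrative assumptions.

```python
import numpy as np

def hawkes_intensity(t, events, mu=0.2, alpha=0.8, beta=1.0):
    # Classical Hawkes conditional intensity:
    #   lambda(t) = mu + alpha * sum_{t_i < t} exp(-beta * (t - t_i))
    # Each past event excites the process; the excitation decays exponentially.
    past = np.asarray([ti for ti in events if ti < t])
    return mu + alpha * np.exp(-beta * (t - past)).sum()

def causal_self_attention(X):
    # X: (n_events, d) array of event embeddings.
    # Scaled dot-product attention with a causal mask, so event i
    # attends only to events 1..i (no peeking at the future).
    n, d = X.shape
    scores = (X @ X.T) / np.sqrt(d)
    future = np.triu(np.ones((n, n), dtype=bool), k=1)
    scores[future] = -np.inf
    # Numerically stable row-wise softmax.
    weights = np.exp(scores - scores.max(axis=1, keepdims=True))
    weights /= weights.sum(axis=1, keepdims=True)
    return weights @ X

events = [1.0, 2.5, 3.0]
print(hawkes_intensity(4.0, events))  # baseline mu plus decayed excitation
```

Note how the causal mask gives the first event nothing to attend to but itself, while the last event mixes information from the entire history in a single step; an RNN would instead need to propagate that history through its hidden state, which is where long-range dependencies tend to get lost.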

Related research

- R-Transformer: Recurrent Neural Network Enhanced Transformer (07/12/2019) — "Recurrent Neural Networks have long been the dominating choice for seque..."
- Spatial-Temporal Self-Attention Network for Flow Prediction (12/13/2019) — "Flow prediction (e.g., crowd flow, traffic flow) with features of spatia..."
- TCTN: A 3D-Temporal Convolutional Transformer Network for Spatiotemporal Predictive Learning (12/02/2021) — "Spatiotemporal predictive learning is to generate future frames given a ..."
- Sequence-to-Sequence Model with Transformer-based Attention Mechanism and Temporal Pooling for Non-Intrusive Load Monitoring (06/08/2023) — "This paper presents a novel Sequence-to-Sequence (Seq2Seq) model based o..."
- A Bi-directional Transformer for Musical Chord Recognition (07/05/2019) — "Chord recognition is an important task since chords are highly abstract ..."
- Complex Event Forecasting with Prediction Suffix Trees: Extended Technical Report (09/01/2021) — "Complex Event Recognition (CER) systems have become popular in the past ..."
- ARISE: ApeRIodic SEmi-parametric Process for Efficient Markets without Periodogram and Gaussianity Assumptions (11/08/2021) — "Mimicking and learning the long-term memory of efficient markets is a fu..."
