Meta Temporal Point Processes

01/27/2023
by Wonho Bae et al.

A temporal point process (TPP) is a stochastic process whose realization is a sequence of discrete events in time. Recent work on TPPs models the process with a neural network in a supervised learning framework, where the training set is a collection of all the sequences. In this work, we propose to train TPPs in a meta learning framework, where each sequence is treated as a different task, via a novel framing of TPPs as neural processes (NPs). We introduce context sets to model TPPs as an instantiation of NPs. Motivated by attentive NPs, we also introduce local history matching to help learn more informative features. We demonstrate the potential of the proposed method on popular public benchmark datasets and tasks, and compare it with state-of-the-art TPP methods.
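To make the neural-process framing concrete, below is a minimal, hypothetical PyTorch sketch of the general idea: one event sequence is treated as a task, a context subset of its events is encoded into a permutation-invariant representation, and an intensity is decoded for target times and trained with the standard TPP negative log-likelihood. The names (ContextEncoder, IntensityDecoder, tpp_nll) and the mean-pooling encoder are illustrative assumptions, not the paper's implementation, which additionally uses attentive-NP components and local history matching.

```python
# A minimal, hypothetical sketch (not the authors' implementation) of the
# neural-process framing of a TPP: one event sequence is one task, split into
# a context set (conditioning events) and a target set (events to score).
# Names such as ContextEncoder, IntensityDecoder and tpp_nll are assumptions.
import torch
import torch.nn as nn


class ContextEncoder(nn.Module):
    """Permutation-invariant encoder: embed context event times, then mean-pool."""

    def __init__(self, hidden=64):
        super().__init__()
        self.phi = nn.Sequential(
            nn.Linear(1, hidden), nn.ReLU(), nn.Linear(hidden, hidden)
        )

    def forward(self, ctx_times):               # ctx_times: (num_ctx, 1)
        return self.phi(ctx_times).mean(dim=0)  # task representation r: (hidden,)


class IntensityDecoder(nn.Module):
    """Map (task representation r, query time t) to a positive intensity lambda(t)."""

    def __init__(self, hidden=64):
        super().__init__()
        self.rho = nn.Sequential(
            nn.Linear(hidden + 1, hidden), nn.ReLU(), nn.Linear(hidden, 1)
        )

    def forward(self, r, times):                # times: (num_queries, 1)
        r = r.expand(times.shape[0], -1)
        return nn.functional.softplus(self.rho(torch.cat([r, times], dim=-1)))


def tpp_nll(decoder, r, event_times, t_start, t_end, num_mc=128):
    """TPP negative log-likelihood on (t_start, t_end]:
    -sum_i log lambda(t_i) + integral of lambda(t) dt (Monte Carlo estimate)."""
    log_term = torch.log(decoder(r, event_times)).sum()
    mc_times = t_start + torch.rand(num_mc, 1) * (t_end - t_start)
    integral = decoder(r, mc_times).mean() * (t_end - t_start)
    return integral - log_term


# One meta-learning "task": a single sequence split into context / target events.
times = torch.sort(torch.rand(20, 1) * 10.0, dim=0).values
ctx, tgt = times[:12], times[12:]

encoder, decoder = ContextEncoder(), IntensityDecoder()
r = encoder(ctx)
loss = tpp_nll(decoder, r, tgt, t_start=ctx[-1].item(), t_end=10.0)
loss.backward()
```

In an attentive-NP-style variant, the mean-pooling above would be replaced by cross-attention from target times to context events, which is the kind of place a mechanism like the paper's local history matching would plug in.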
