
Self-attention with Functional Time Representation Learning

11/28/2019
by   Da Xu, et al.
WALMART LABS

Sequential modelling with self-attention has achieved cutting-edge performance in natural language processing. With advantages in model flexibility, computational complexity, and interpretability, self-attention is gradually becoming a key component in event sequence models. However, like most other sequence models, self-attention does not account for the time spans between events and therefore captures sequential signals rather than temporal patterns. Without relying on recurrent network structures, self-attention recognizes event orderings via positional encoding. To bridge the gap between modelling time-independent and time-dependent event sequences, we introduce a functional feature map that embeds time spans into a high-dimensional space. By constructing the associated translation-invariant time kernel function, we reveal the functional forms of the feature map under classical functional analysis results, namely Bochner's Theorem and Mercer's Theorem. We propose several models to learn the functional time representation and its interactions with event representations. These methods are evaluated on real-world datasets under various continuous-time event sequence prediction tasks. The experiments show that the proposed methods compare favorably to baseline models while also capturing useful time-event interactions.
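The core idea of the abstract — embedding a time span so that inner products of embeddings realize a translation-invariant kernel, as guaranteed by Bochner's Theorem — can be illustrated with a minimal sketch. This is not the authors' implementation: the random frequencies below stand in for frequencies that would be learned, and the function name `time_feature_map` is an invented placeholder.

```python
import numpy as np

def time_feature_map(t, freqs):
    """Embed scalar time spans t into 2*d dimensions as
    sqrt(1/d) * [cos(w_1 t), ..., cos(w_d t), sin(w_1 t), ..., sin(w_d t)],
    so that Phi(t1) . Phi(t2) = (1/d) * sum_i cos(w_i * (t1 - t2)),
    i.e. a translation-invariant kernel in the spirit of Bochner's Theorem."""
    t = np.asarray(t, dtype=float).reshape(-1, 1)   # shape (n, 1)
    angles = t * np.asarray(freqs).reshape(1, -1)   # shape (n, d)
    d = angles.shape[1]
    return np.concatenate([np.cos(angles), np.sin(angles)], axis=1) / np.sqrt(d)

# Frequencies drawn at random purely for illustration; in the paper's
# setting these parameters would be learned jointly with the event model.
rng = np.random.default_rng(0)
freqs = rng.normal(size=64)

phi = time_feature_map([0.0, 1.5, 3.0], freqs)
k = phi[0] @ phi[1]   # approximates K(0 - 1.5); depends only on the gap
```

Because the inner product collapses to a sum of cosines of the time difference, shifting both timestamps by the same amount leaves the kernel value unchanged, which is exactly the translation-invariance property the paper exploits.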


Related Research

- Memory Controlled Sequential Self Attention for Sound Recognition (05/13/2020)
- Why self-attention is Natural for Sequence-to-Sequence Problems? A Perspective from Symmetries (10/13/2022)
- SHORING: Design Provable Conditional High-Order Interaction Network via Symbolic Testing (07/03/2021)
- Self-Attentive Hawkes Processes (07/17/2019)
- Event-related data conditioning for acoustic event classification (06/16/2022)
- Neural Hierarchical Factorization Machines for User's Event Sequence Analysis (12/31/2021)
- Flexible Triggering Kernels for Hawkes Process Modeling (02/03/2022)