Sequential Recommendation with Relation-Aware Kernelized Self-Attention

11/15/2019
by Mingi Ji, et al.

Recent studies have shown that sequential recommendation is improved by attention mechanisms. Following this development, we propose Relation-Aware Kernelized Self-Attention (RKSA), which adopts the self-attention mechanism of the Transformer and augments it with a probabilistic model. The original self-attention of the Transformer is a deterministic measure without relation awareness. We therefore introduce a latent space into the self-attention; the latent space models the recommendation context from relations as a multivariate skew-normal distribution with a kernelized covariance matrix built from co-occurrences, item characteristics, and user information. This work merges the self-attention of the Transformer with sequential recommendation by adding a probabilistic model of the recommendation task specifics. We evaluated RKSA on benchmark datasets, where it shows significant improvements over recent baseline models. RKSA is also able to produce a latent space model that explains the reasons behind its recommendations.
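To make the idea concrete, below is a minimal PyTorch sketch of kernelized, relation-aware attention as the abstract describes it: the deterministic scaled dot-product logits become the mean of a skew-normal distribution whose covariance is a kernel over relation features. This is not the authors' implementation; the RBF kernel choice, the convolution-representation sampler, and the names rksa_scores, skew_normal_sample, noise_scale, and alpha are all illustrative assumptions.

```python
# Minimal sketch only; assumes PyTorch. All names and hyperparameters
# here are illustrative, not the authors' implementation.
import math

import torch
import torch.nn.functional as F


def skew_normal_sample(mean, chol, alpha=1.0):
    """Draw from a multivariate skew-normal via the convolution
    representation x = mean + delta * |z0| + sqrt(1 - delta^2) * L z1,
    with delta = alpha / sqrt(1 + alpha^2) controlling skewness."""
    delta = alpha / math.sqrt(1.0 + alpha ** 2)
    z0 = torch.randn(mean.shape[:-1] + (1,))        # shared skewing variate
    z1 = torch.randn_like(mean)                     # correlated Gaussian part
    correlated = torch.einsum('bij,bqj->bqi', chol, z1)
    return mean + delta * z0.abs() + math.sqrt(1.0 - delta ** 2) * correlated


def rksa_scores(q, k, relation_feats, alpha=1.0, noise_scale=0.1):
    """Stochastic attention weights whose logit covariance comes from an
    RBF kernel over relation features (e.g. co-occurrence, item, and user
    embeddings). q, k: (batch, seq, dim); relation_feats: (batch, seq, r)."""
    mean = q @ k.transpose(-2, -1) / q.size(-1) ** 0.5   # deterministic logits
    # Kernelized covariance over key positions: an RBF Gram matrix is PSD.
    diff = relation_feats.unsqueeze(2) - relation_feats.unsqueeze(1)
    cov = torch.exp(-diff.pow(2).sum(-1))                # (batch, seq, seq)
    cov = cov + 1e-3 * torch.eye(cov.size(-1))           # jitter for Cholesky
    chol = noise_scale * torch.linalg.cholesky(cov)
    logits = skew_normal_sample(mean, chol, alpha)
    return F.softmax(logits, dim=-1)


# Toy usage: 2 users, sequences of 5 items, 16-d item and 8-d relation features.
attn = rksa_scores(torch.randn(2, 5, 16), torch.randn(2, 5, 16),
                   torch.randn(2, 5, 8))
print(attn.shape)  # torch.Size([2, 5, 5])
```

The point the sketch illustrates is that the attention logits become a random variable: the mean is the usual scaled dot-product score, while the covariance and skewness are shaped by relation features, so sampled attention weights can reflect co-occurrence and user/item context rather than being purely deterministic.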

Related research

06/08/2020
Linformer: Self-Attention with Linear Complexity
Large transformer models have shown extraordinary success in achieving s...

09/16/2022
Recursive Attentive Methods with Reused Item Representations for Sequential Recommendation
Sequential recommendation aims to recommend the next item of users' inte...

12/30/2018
Variational Self-attention Model for Sentence Representation
This paper proposes a variational self-attention model (VSAM) that emplo...

12/12/2022
Tensor-based Sequential Learning via Hankel Matrix Representation for Next Item Recommendations
Self-attentive transformer models have recently been shown to solve the ...

04/17/2020
Highway Transformer: Self-Gating Enhanced Self-Attentive Networks
Self-attention mechanisms have made striking state-of-the-art (SOTA) pro...

07/04/2022
Interpretable Fusion Analytics Framework for fMRI Connectivity: Self-Attention Mechanism and Latent Space Item-Response Model
There have been several attempts to use deep learning based on brain fMR...

03/05/2021
Non-invasive Self-attention for Side Information Fusion in Sequential Recommendation
Sequential recommender systems aim to model users' evolving interests fr...
