Sequential Recommendation via Stochastic Self-Attention

01/16/2022
by Ziwei Fan, et al.

Sequential recommendation models the dynamics of a user's previous behaviors in order to forecast the next item, and has drawn considerable attention. Transformer-based approaches, which embed items as vectors and use dot-product self-attention to measure relationships between items, demonstrate superior capability among existing sequential methods. However, users' real-world sequential behaviors are uncertain rather than deterministic, which poses a significant challenge to current techniques. We further argue that dot-product-based approaches cannot fully capture collaborative transitivity, which can be derived from item-item transitions within sequences and is particularly beneficial for cold-start items. We also argue that the BPR loss places no constraint on positive and sampled negative items, which misleads optimization. To overcome these issues, we propose a novel STOchastic Self-Attention (STOSA) model. In particular, STOSA embeds each item as a stochastic Gaussian distribution whose covariance encodes the uncertainty. We devise a novel Wasserstein self-attention module to characterize position-wise item-item relationships in sequences, which effectively incorporates uncertainty into model training. Because the Wasserstein distance satisfies the triangle inequality, the resulting attention also facilitates collaborative transitivity learning. Moreover, we introduce a novel regularization term into the ranking loss that ensures dissimilarity between positive and sampled negative items. Extensive experiments on five real-world benchmark datasets demonstrate the superiority of the proposed model over state-of-the-art baselines, especially on cold-start items. The code is available at <https://github.com/zfan20/STOSA>.
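
To make the core idea concrete, the sketch below shows how attention weights can be derived from the 2-Wasserstein distance between Gaussian item embeddings with diagonal covariances. This is a minimal illustration under stated assumptions, not the authors' implementation (see the linked repository); the tensor names, shapes, scaling, and the use of diagonal covariances are assumptions made for clarity.

```python
# Sketch of Wasserstein-based self-attention over stochastic (Gaussian) item
# embeddings. Illustrative only; not the STOSA reference code.
import torch
import torch.nn.functional as F

def wasserstein2_distance(mean_q, cov_q, mean_k, cov_k):
    """Squared 2-Wasserstein distance between diagonal Gaussians.

    mean_q, cov_q: (batch, seq_len, 1, dim) query means / diagonal covariances
    mean_k, cov_k: (batch, 1, seq_len, dim) key means / diagonal covariances
    Returns: (batch, seq_len, seq_len) pairwise distances.
    For diagonal covariances:
        W2^2 = ||mu_q - mu_k||^2 + ||sqrt(cov_q) - sqrt(cov_k)||^2
    """
    mean_term = ((mean_q - mean_k) ** 2).sum(-1)
    cov_term = ((cov_q.sqrt() - cov_k.sqrt()) ** 2).sum(-1)
    return mean_term + cov_term

def wasserstein_attention(mean_q, cov_q, mean_k, cov_k, mask=None):
    """Attention weights from negative Wasserstein distances: closer (more
    similar, more certain) item pairs receive higher weights after softmax."""
    dist = wasserstein2_distance(
        mean_q.unsqueeze(2), cov_q.unsqueeze(2),   # (B, L, 1, D)
        mean_k.unsqueeze(1), cov_k.unsqueeze(1),   # (B, 1, L, D)
    )
    scores = -dist / mean_q.size(-1) ** 0.5        # scale as in dot-product attention (assumption)
    if mask is not None:
        scores = scores.masked_fill(mask == 0, float("-inf"))
    return F.softmax(scores, dim=-1)               # (B, L, L)

# Toy usage: 2 sequences of 5 items with 8-dimensional stochastic embeddings.
B, L, D = 2, 5, 8
mean = torch.randn(B, L, D)
cov = F.elu(torch.randn(B, L, D)) + 1.0            # keep diagonal covariances positive
attn = wasserstein_attention(mean, cov, mean, cov)
print(attn.shape)  # torch.Size([2, 5, 5])
```

Since a smaller Wasserstein distance indicates a more similar pair of item distributions, the negative distance serves as the attention score before the softmax; and because the distance satisfies the triangle inequality, similarity can propagate across item-item transitions, which is the collaborative transitivity the abstract refers to.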


research
06/11/2021

Modeling Sequences as Distributions with Uncertainty for Sequential Recommendation

The sequential patterns within the user interactions are pivotal for rep...
research
08/18/2023

Attention Calibration for Transformer-based Sequential Recommendation

Transformer-based sequential recommendation (SR) has been booming in rec...
research
08/27/2023

Text Matching Improves Sequential Recommendation by Reducing Popularity Biases

This paper proposes Text mAtching based SequenTial rEcommendation model ...
research
05/02/2021

Augmenting Sequential Recommendation with Pseudo-Prior Items via Reversely Pre-training Transformer

Sequential Recommendation characterizes the evolving patterns by modelin...
research
01/28/2023

Mutual Wasserstein Discrepancy Minimization for Sequential Recommendation

Self-supervised sequential recommendation significantly improves recomme...
research
01/04/2023

Modeling Sequential Recommendation as Missing Information Imputation

Side information is being used extensively to improve the effectiveness ...
research
10/04/2021

HyperTeNet: Hypergraph and Transformer-based Neural Network for Personalized List Continuation

The personalized list continuation (PLC) task is to curate the next item...
