Denoising Self-attentive Sequential Recommendation

12/08/2022
by Huiyuan Chen et al.

Transformer-based sequential recommenders are powerful at capturing both short-term and long-term sequential item dependencies. This is mainly attributed to their self-attention networks, which exploit pairwise item-item interactions within a sequence. However, real-world item sequences are often noisy, which is particularly true for implicit feedback: a large portion of clicks do not align well with user preferences, and many products end up with negative reviews or being returned. As such, a user's next action typically depends on only a subset of the items, not the entire sequence. Many existing Transformer-based models nevertheless use full attention distributions, which inevitably assign some credit to irrelevant items. This may lead to sub-optimal performance if the Transformers are not properly regularized.
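To make the problem concrete, below is a minimal single-head self-attention sketch over an item sequence, with an optional binary mask that drops attention to items treated as noise before the softmax. This is only an illustration of how such pruning could look, not the method proposed in the paper; the function, tensor names, and the toy mask are all hypothetical.

```python
import torch
import torch.nn.functional as F

def masked_self_attention(items, W_q, W_k, W_v, keep_mask=None):
    """Single-head self-attention over an item sequence.

    items:     (seq_len, d) tensor of item embeddings
    W_q/k/v:   (d, d) query/key/value projection matrices
    keep_mask: optional (seq_len, seq_len) 0/1 tensor; a 0 removes an
               item-item link before the softmax, so attention is spread
               over a relevant subset rather than the full sequence
    """
    Q, K, V = items @ W_q, items @ W_k, items @ W_v
    scores = Q @ K.T / K.shape[-1] ** 0.5          # pairwise item-item scores
    if keep_mask is not None:
        scores = scores.masked_fill(keep_mask == 0, float("-inf"))
    attn = F.softmax(scores, dim=-1)               # full attention if no mask
    return attn @ V

# Toy usage: 5 items with 8-dim embeddings; drop all links to item 2,
# pretending it was a noisy click that should receive no credit.
d, n = 8, 5
emb = torch.randn(n, d)
W_q, W_k, W_v = (torch.randn(d, d) * d ** -0.5 for _ in range(3))
mask = torch.ones(n, n)
mask[:, 2] = 0                                     # nothing attends to item 2
out = masked_self_attention(emb, W_q, W_k, W_v, keep_mask=mask)
print(out.shape)  # torch.Size([5, 8])
```

Without the mask, the softmax spreads probability mass over every item, so even a noisy click receives some credit; the mask here is a stand-in for whatever learned or heuristic mechanism decides which item-item links to keep.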


Related research

04/04/2022  Coarse-to-Fine Sparse Sequential Recommendation
Sequential recommendation aims to model dynamic user behavior from histo...

09/20/2023  Leveraging Negative Signals with Self-Attention for Sequential Music Recommendation
Music streaming services heavily rely on their recommendation engines to...

04/17/2023  Attention Mixtures for Time-Aware Sequential Recommendation
Transformers emerged as powerful methods for sequential recommendation. ...

08/05/2023  ConvFormer: Revisiting Transformer for Sequential User Modeling
Sequential user modeling, a critical task in personalized recommender sy...

12/12/2022  Tensor-based Sequential Learning via Hankel Matrix Representation for Next Item Recommendations
Self-attentive transformer models have recently been shown to solve the ...

08/17/2021  MOI-Mixer: Improving MLP-Mixer with Multi Order Interactions in Sequential Recommendation
Successful sequential recommendation systems rely on accurately capturin...

08/18/2023  Attention Calibration for Transformer-based Sequential Recommendation
Transformer-based sequential recommendation (SR) has been booming in rec...
