Multi-Behavior Hypergraph-Enhanced Transformer for Sequential Recommendation

07/12/2022
by Yuhao Yang, et al.

Learning dynamic user preferences has become an increasingly important component for many online platforms (e.g., video-sharing sites, e-commerce systems) to make sequential recommendations. Previous works have made many efforts to model item-item transitions over user interaction sequences, based on various architectures, e.g., recurrent neural networks and self-attention mechanisms. Recently emerged graph neural networks also serve as useful backbone models to capture item dependencies in sequential recommendation scenarios. Despite their effectiveness, existing methods have thus far focused on item sequence representation with a single type of interaction, and are therefore limited in capturing the dynamic heterogeneous relational structures between users and items (e.g., page view, add-to-favorite, purchase). To tackle this challenge, we design a Multi-Behavior Hypergraph-enhanced Transformer framework (MBHT) to capture both short-term and long-term cross-type behavior dependencies. Specifically, a multi-scale Transformer is equipped with low-rank self-attention to jointly encode behavior-aware sequential patterns at fine-grained and coarse-grained levels. Additionally, we incorporate the global multi-behavior dependency into the hypergraph neural architecture to capture hierarchical long-range item correlations in a customized manner. Experimental results demonstrate the superiority of our MBHT over various state-of-the-art recommendation solutions across different settings. Further ablation studies validate the effectiveness of our model design and the benefits of the new MBHT framework. Our implementation code is released at: https://github.com/yuh-yang/MBHT-KDD22.
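To make the "low-rank self-attention" idea concrete: one common way to obtain low-rank attention is to project the length-n key/value sequence down to k << n landmark positions, reducing the quadratic attention cost to O(nk). The sketch below illustrates this Linformer-style projection with NumPy; it is a minimal illustration under that assumption, not the paper's exact formulation, and all names (`low_rank_self_attention`, `E`, etc.) are placeholders.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def low_rank_self_attention(X, Wq, Wk, Wv, E):
    """Low-rank self-attention sketch (Linformer-style projection).

    X: (n, d) behavior-aware item embeddings for one sequence.
    E: (k, n) learned projection compressing the n keys/values into
       k << n landmarks, so attention costs O(n*k) instead of O(n^2).
    """
    Q, K, V = X @ Wq, X @ Wk, X @ Wv               # (n, d) each
    K_low, V_low = E @ K, E @ V                    # (k, d): compressed keys/values
    scores = Q @ K_low.T / np.sqrt(Q.shape[-1])    # (n, k) attention logits
    return softmax(scores) @ V_low                 # (n, d) attended output

rng = np.random.default_rng(0)
n, d, k = 8, 16, 4
X = rng.normal(size=(n, d))
Wq, Wk, Wv = (rng.normal(size=(d, d)) for _ in range(3))
E = rng.normal(size=(k, n))
out = low_rank_self_attention(X, Wq, Wk, Wv, E)
print(out.shape)  # (8, 16)
```

A multi-scale variant would apply this at several sequence granularities (e.g., on the raw sequence and on pooled sub-sequences) and fuse the results.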

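The hypergraph side of the framework can likewise be illustrated with one standard hypergraph convolution step, where each hyperedge connects many items at once (e.g., items sharing a behavior type), letting messages travel item-to-hyperedge-to-item and thus capture long-range correlations. This is a generic HGNN-style sketch under that assumption, not the authors' customized layer; `H` and `Theta` are illustrative names.

```python
import numpy as np

def hypergraph_conv(X, H, Theta):
    """One hypergraph convolution layer (generic sketch).

    X:     (n, d) item embeddings.
    H:     (n, m) incidence matrix; H[i, e] = 1 if item i belongs to
           hyperedge e (e.g., items linked by the same behavior type).
    Theta: (d, d') learned transform.
    Propagates item -> hyperedge -> item with degree normalization.
    """
    Dv = H.sum(axis=1, keepdims=True)     # (n, 1) vertex degrees
    De = H.sum(axis=0, keepdims=True)     # (1, m) hyperedge degrees
    edge_msg = (H / De).T @ X             # (m, d) mean over edge members
    return (H / Dv) @ edge_msg @ Theta    # (n, d') scattered back to items

rng = np.random.default_rng(1)
# 4 items, 2 hyperedges: items 0, 1, 3 share edge 0; items 1, 2 share edge 1.
H = np.array([[1., 0.], [1., 1.], [0., 1.], [1., 0.]])
X = rng.normal(size=(4, 3))
Theta = rng.normal(size=(3, 3))
out = hypergraph_conv(X, H, Theta)
print(out.shape)  # (4, 3)
```

Note that items 0 and 3 belong only to hyperedge 0, so after one layer they receive identical messages; stacking layers and learning the incidence structure is what differentiates them.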
Related research:

- 06/06/2022: Multi-Behavior Sequential Recommendation with Temporal Graph Transformer. "Modeling time-evolving preferences of users with their sequential item i..."
- 07/12/2019: R-Transformer: Recurrent Neural Network Enhanced Transformer. "Recurrent Neural Networks have long been the dominating choice for seque..."
- 08/05/2023: ConvFormer: Revisiting Transformer for Sequential User Modeling. "Sequential user modeling, a critical task in personalized recommender sy..."
- 10/04/2021: HyperTeNet: Hypergraph and Transformer-based Neural Network for Personalized List Continuation. "The personalized list continuation (PLC) task is to curate the next item..."
- 05/15/2019: Behavior Sequence Transformer for E-commerce Recommendation in Alibaba. "Deep learning based methods have been widely used in industrial recommen..."
- 04/14/2023: Learning Graph ODE for Continuous-Time Sequential Recommendation. "Sequential recommendation aims at understanding user preference by captu..."
- 08/20/2018: Next Item Recommendation with Self-Attention. "In this paper, we propose a novel sequence-aware recommendation model. O..."