HyperTeNet: Hypergraph and Transformer-based Neural Network for Personalized List Continuation

10/04/2021
by   Vijaikumar M, et al.

The personalized list continuation (PLC) task is to curate the next items for user-generated lists (ordered sequences of items) in a personalized way. The main challenge in this task is understanding the ternary relationships among the interacting entities (users, items, and lists), which existing works do not consider. Further, existing works do not account for multi-hop relationships among entities of the same type. In addition, capturing the sequential information among the items already present in a list plays a vital role in determining the next relevant items to curate. In this work, we propose HyperTeNet, a self-attention hypergraph and Transformer-based neural network architecture for the personalized list continuation task, to address the challenges mentioned above. We use graph convolutions to learn multi-hop relationships among entities of the same type and leverage a self-attention-based hypergraph neural network to learn ternary relationships among the interacting entities via hyperlink prediction in a 3-uniform hypergraph. Further, the entity embeddings are shared with a Transformer-based architecture and learned through an alternating optimization procedure. As a result, the network also learns the sequential information needed to curate the next items to be added to a list. Experimental results demonstrate that HyperTeNet significantly outperforms state-of-the-art models on real-world datasets. Our implementation and datasets are available at https://github.com/mvijaikumar/HyperTeNet.
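To make the hyperlink-prediction idea concrete, the following is a minimal sketch of scoring a candidate 3-uniform hyperedge (user, item, list). All names and dimensions here are illustrative assumptions; a simple multiplicative interaction over random embedding tables stands in for the paper's graph-convolution and self-attention hypergraph components.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sizes -- not taken from the paper.
num_users, num_items, num_lists, dim = 5, 7, 3, 8

# One embedding table per entity type (users, items, lists).
user_emb = rng.normal(size=(num_users, dim))
item_emb = rng.normal(size=(num_items, dim))
list_emb = rng.normal(size=(num_lists, dim))

def score_hyperlink(u, i, l):
    """Score a candidate 3-uniform hyperedge (user u, item i, list l).

    An elementwise ternary interaction followed by a sigmoid is used
    here as a placeholder for the self-attention hypergraph scorer.
    """
    interaction = user_emb[u] * item_emb[i] * list_emb[l]
    logit = interaction.sum()
    return 1.0 / (1.0 + np.exp(-logit))  # probability the hyperedge exists

# Rank candidate next items for (user 0, list 1):
# a higher hyperlink score suggests a more relevant continuation.
scores = [score_hyperlink(0, i, 1) for i in range(num_items)]
top_item = int(np.argmax(scores))
```

In the full model, each query to such a scorer would use embeddings already refined by graph convolutions over entity-level graphs and shared with the Transformer that captures the list's item order.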


Related research

08/18/2023
Attention Calibration for Transformer-based Sequential Recommendation
Transformer-based sequential recommendation (SR) has been booming in rec...

05/20/2020
Context-Aware Learning to Rank with Self-Attention
In learning to rank, one is interested in optimising the global ordering...

07/12/2022
Multi-Behavior Hypergraph-Enhanced Transformer for Sequential Recommendation
Learning dynamic user preference has become an increasingly important co...

12/30/2019
A Hierarchical Self-Attentive Model for Recommending User-Generated Item Lists
User-generated item lists are a popular feature of many different platfo...

05/02/2021
Residual Enhanced Multi-Hypergraph Neural Network
Hypergraphs are a generalized data structure of graphs to model higher-o...

11/23/2022
Search Behavior Prediction: A Hypergraph Perspective
Although the bipartite shopping graphs are straightforward to model sear...

01/16/2022
Sequential Recommendation via Stochastic Self-Attention
Sequential recommendation models the dynamics of a user's previous behav...
