CERES: Pretraining of Graph-Conditioned Transformer for Semi-Structured Session Data

04/08/2022
by Rui Feng, et al.

User sessions empower many search and recommendation tasks on a daily basis. Such session data are semi-structured: they encode heterogeneous relations between queries and products, and each item is described by unstructured text. Despite recent advances in self-supervised learning for text or graphs, there is a lack of self-supervised learning models that can effectively capture both intra-item semantics and inter-item interactions in semi-structured sessions. To fill this gap, we propose CERES, a graph-based transformer model for semi-structured session data. CERES learns representations that capture both inter- and intra-item semantics with (1) a graph-conditioned masked language pretraining task that jointly learns from item text and item-item relations; and (2) a graph-conditioned transformer architecture that propagates inter-item contexts to item-level representations. We pretrained CERES on 468 million Amazon sessions and found that CERES outperforms strong pretraining baselines by up to 9...
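To make the two components in the abstract concrete, the sketch below is a rough PyTorch illustration of a graph-conditioned masked-language-modeling setup: a token-level transformer encodes each item's text, pooled item vectors exchange information along session-graph edges, and the resulting inter-item context conditions the masked-token predictions. It is not the authors' implementation; all names (GraphConditionedEncoder, item_adj, the mean pooling and additive fusion) are hypothetical simplifications.

# Hypothetical sketch of graph-conditioned masked-LM pretraining; not CERES's actual API.
import torch
import torch.nn as nn

class GraphConditionedEncoder(nn.Module):
    def __init__(self, vocab_size=30522, d_model=256, n_heads=4, n_layers=2):
        super().__init__()
        self.tok_emb = nn.Embedding(vocab_size, d_model)
        layer = nn.TransformerEncoderLayer(d_model, n_heads, batch_first=True)
        self.text_encoder = nn.TransformerEncoder(layer, n_layers)      # intra-item semantics
        self.graph_attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)  # inter-item propagation
        self.mlm_head = nn.Linear(d_model, vocab_size)                   # masked-token prediction

    def forward(self, token_ids, item_adj):
        # token_ids: (n_items, seq_len) token ids for each item/query in one session
        # item_adj:  (n_items, n_items) boolean adjacency of the session graph (self-loops included)
        tok = self.text_encoder(self.tok_emb(token_ids))                 # (n_items, seq_len, d)
        item_repr = tok.mean(dim=1).unsqueeze(0)                         # (1, n_items, d) pooled item vectors
        # propagate inter-item context along graph edges; non-neighbors are masked out
        ctx, _ = self.graph_attn(item_repr, item_repr, item_repr, attn_mask=~item_adj)
        # condition every token representation on its item's graph context
        fused = tok + ctx.squeeze(0).unsqueeze(1)                        # broadcast over token positions
        return self.mlm_head(fused)                                      # (n_items, seq_len, vocab) logits

# toy usage: 3 items in a session, 8 tokens each
model = GraphConditionedEncoder()
tokens = torch.randint(0, 30522, (3, 8))
adj = torch.tensor([[1, 1, 0], [1, 1, 1], [0, 1, 1]], dtype=torch.bool)
logits = model(tokens, adj)
# in real MLM pretraining the loss would be computed only at masked positions
loss = nn.functional.cross_entropy(logits.reshape(-1, 30522), tokens.reshape(-1))

In this toy version the graph conditioning is reduced to a single masked attention step over pooled item vectors; the point is only to show where inter-item structure enters the masked-LM objective, not how CERES actually fuses the two signals.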


Related research

10/29/2019  Balancing Multi-level Interactions for Session-based Recommendation
Predicting user actions based on anonymous sessions is a challenge to ge...

12/16/2021  Knowledge-enhanced Session-based Recommendation with Temporal Transformer
Recent research has achieved impressive progress in the session-based re...

12/12/2020  Self-Supervised Hypergraph Convolutional Networks for Session-based Recommendation
Session-based recommendation (SBR) focuses on next-item prediction at a ...

08/24/2021  Self-Supervised Graph Co-Training for Session-based Recommendation
Session-based recommendation targets next-item prediction by exploiting ...

10/08/2021  Graph-Enhanced Multi-Task Learning of Multi-Level Transition Dynamics for Session-based Recommendation
Session-based recommendation plays a central role in a wide spectrum of ...

05/25/2023  UniTRec: A Unified Text-to-Text Transformer and Joint Contrastive Learning Framework for Text-based Recommendation
Prior study has shown that pretrained language models (PLM) can boost th...

06/23/2022  Do Trajectories Encode Verb Meaning?
Distributional models learn representations of words from text, but are ...
