Contrastive Pre-training for Sequential Recommendation

10/27/2020
by Xu Xie, et al.

Sequential recommendation methods play a crucial role in modern recommender systems because of their ability to capture users' dynamic interests from their historical interactions. Despite their success, we argue that these approaches require a large number of parameters to learn a high-quality user representation model. However, they usually suffer from the data sparsity problem, which makes it difficult to collect enough supervised signal to optimize those parameters. To tackle this problem, inspired by recent advances in pre-training techniques in natural language processing, we construct training signals from unlabeled data and pre-train the user representation model on them. We propose a novel model called Contrastive Pre-training for Sequential Recommendation (CP4Rec), which utilizes a contrastive pre-training framework to extract meaningful user patterns and encode user representations more effectively. In addition, we propose three data augmentation approaches to construct the pre-training tasks and examine the effects of composing different augmentations. Comprehensive experiments on four public datasets demonstrate that CP4Rec achieves state-of-the-art performance over existing baselines, especially when only limited training data is available.
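
The abstract gives no implementation details, but the overall recipe can be sketched as follows: augment each user's interaction sequence in two different ways, encode both views with the user representation model, and train the encoder to pull the two views of the same user together while pushing apart views of different users. The sketch below is not the authors' code; it assumes PyTorch, a small Transformer encoder, an NT-Xent (InfoNCE) objective, and three illustrative sequence augmentations (crop, mask, reorder). All of these specifics are assumptions, since the abstract only states that three augmentation approaches are used.

```python
# Minimal sketch of contrastive pre-training on interaction sequences.
# The augmentations (crop, mask, reorder), encoder, and loss are assumptions,
# not the paper's exact design.
import random
import torch
import torch.nn as nn
import torch.nn.functional as F

PAD = 0  # padding item id

def crop(seq, ratio=0.6):
    """Keep a random contiguous sub-sequence."""
    n = max(1, int(len(seq) * ratio))
    start = random.randint(0, len(seq) - n)
    return seq[start:start + n]

def mask(seq, ratio=0.3):
    """Replace a random subset of items with the padding id."""
    return [PAD if random.random() < ratio else i for i in seq]

def reorder(seq, ratio=0.3):
    """Shuffle a random contiguous sub-sequence."""
    n = max(1, int(len(seq) * ratio))
    start = random.randint(0, len(seq) - n)
    segment = seq[start:start + n]
    random.shuffle(segment)
    return seq[:start] + segment + seq[start + n:]

AUGMENTATIONS = [crop, mask, reorder]

class SeqEncoder(nn.Module):
    """Toy user-representation model: item embedding + Transformer + mean pooling."""
    def __init__(self, num_items, dim=64):
        super().__init__()
        self.emb = nn.Embedding(num_items, dim, padding_idx=PAD)
        layer = nn.TransformerEncoderLayer(dim, nhead=4, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=2)

    def forward(self, seqs, max_len=50):
        padded = torch.full((len(seqs), max_len), PAD, dtype=torch.long)
        for i, s in enumerate(seqs):
            s = s[-max_len:]
            padded[i, :len(s)] = torch.tensor(s)
        h = self.encoder(self.emb(padded))   # (B, L, D)
        return h.mean(dim=1)                 # (B, D) user representation

def nt_xent(z1, z2, temperature=0.5):
    """InfoNCE loss: two views of the same user are positives,
    every other view in the batch is a negative."""
    z = F.normalize(torch.cat([z1, z2], dim=0), dim=1)          # (2B, D)
    sim = z @ z.t() / temperature
    sim = sim.masked_fill(torch.eye(z.size(0), dtype=torch.bool),
                          float("-inf"))                        # drop self-similarity
    b = z1.size(0)
    targets = torch.cat([torch.arange(b, 2 * b), torch.arange(0, b)])
    return F.cross_entropy(sim, targets)

# Usage: one pre-training step on a batch of user interaction sequences.
if __name__ == "__main__":
    encoder = SeqEncoder(num_items=1000)
    batch = [[random.randint(1, 999) for _ in range(20)] for _ in range(8)]
    view1 = [random.choice(AUGMENTATIONS)(list(s)) for s in batch]
    view2 = [random.choice(AUGMENTATIONS)(list(s)) for s in batch]
    loss = nt_xent(encoder(view1), encoder(view2))
    loss.backward()
    print(f"contrastive pre-training loss: {loss.item():.4f}")
```

After pre-training with this objective, the encoder would be fine-tuned on the downstream next-item prediction task, which is where the benefit under limited supervised data would show up.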
