Contrastive Learning with Bidirectional Transformers for Sequential Recommendation

08/08/2022
by Hanwen Du, et al.

Contrastive learning with a Transformer-based sequence encoder has gained predominance in sequential recommendation. It maximizes the agreement between paired sequence augmentations that share similar semantics. However, existing contrastive learning approaches in sequential recommendation mainly center on left-to-right unidirectional Transformers as base encoders, which are suboptimal for sequential recommendation because user behaviors may not form a rigid left-to-right sequence. To address this, we propose a novel framework named Contrastive learning with Bidirectional Transformers for sequential recommendation (CBiT). Specifically, we first apply the sliding window technique to long user sequences in bidirectional Transformers, which allows for a more fine-grained division of user sequences. Then we combine the cloze task mask and the dropout mask to generate high-quality positive samples and perform multi-pair contrastive learning, which demonstrates better performance and adaptability compared with standard one-pair contrastive learning. Moreover, we introduce a novel dynamic loss reweighting strategy to balance the cloze task loss and the contrastive loss. Experimental results on three public benchmark datasets show that our model outperforms state-of-the-art models for sequential recommendation.
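The abstract does not give the exact loss formulation, so the following is only a minimal sketch of the multi-pair contrastive idea it describes: each user sequence yields several augmented views (in CBiT, via different cloze and dropout masks), every pair of views of the same user is treated as a positive pair under an InfoNCE-style loss, and views of other users in the batch serve as negatives. The function names (`info_nce`, `multi_pair_loss`) and the plain-Python embedding representation are hypothetical illustration choices, not the authors' implementation.

```python
import math

def cosine(u, v):
    # Cosine similarity between two embedding vectors.
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv)

def info_nce(anchor, positive, negatives, temperature=0.1):
    # Standard InfoNCE: pull the anchor toward its positive view,
    # push it away from all negatives (log-softmax over similarities).
    logits = [cosine(anchor, positive) / temperature]
    logits += [cosine(anchor, n) / temperature for n in negatives]
    m = max(logits)  # subtract max for numerical stability
    exps = [math.exp(x - m) for x in logits]
    return -math.log(exps[0] / sum(exps))

def multi_pair_loss(views, temperature=0.1):
    # views[i] is a list of m augmented-view embeddings of user i.
    # Multi-pair: every ordered pair of views of the same user is a
    # positive pair; views of other users in the batch are negatives.
    total, count = 0.0, 0
    for i, user_views in enumerate(views):
        negatives = [v for j, other in enumerate(views) if j != i for v in other]
        for a in range(len(user_views)):
            for b in range(len(user_views)):
                if a == b:
                    continue
                total += info_nce(user_views[a], user_views[b],
                                  negatives, temperature)
                count += 1
    return total / count
```

In training, this contrastive term would be combined with the cloze (masked-item prediction) loss; the paper's dynamic reweighting strategy for balancing the two terms is not specified in the abstract and is not reproduced here.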

Related research

04/14/2019
BERT4Rec: Sequential Recommendation with Bidirectional Encoder Representations from Transformer
Modeling users' dynamic and evolving preferences from their historical b...

06/13/2019
Contrastive Bidirectional Transformer for Temporal Representation Learning
This paper aims at learning representations for long sequences of contin...

08/07/2023
Hierarchical Contrastive Learning with Multiple Augmentation for Sequential Recommendation
Sequential recommendation addresses the issue of preference drift by pre...

10/12/2021
Contrastive Learning for Representation Degeneration Problem in Sequential Recommendation
Recent advancements of sequential deep learning models such as Transform...

09/01/2021
Memory Augmented Multi-Instance Contrastive Predictive Coding for Sequential Recommendation
The sequential recommendation aims to recommend items, such as products,...

01/22/2023
Debiasing the Cloze Task in Sequential Recommendation with Bidirectional Transformers
Bidirectional Transformer architectures are state-of-the-art sequential ...

04/30/2022
Designing a Sequential Recommendation System for Heterogeneous Interactions Using Transformers
While many production-ready and robust algorithms are available for the ...
