Multi-Sample based Contrastive Loss for Top-k Recommendation

09/01/2021
by   Hao Tang, et al.

Top-k recommendation is a fundamental task in recommender systems and is generally learned by comparing positive and negative pairs. Contrastive Loss (CL), the key component of contrastive learning, has recently received increasing attention, and we find it well suited to top-k recommendation. However, CL treats positive and negative samples as equally important, which raises two problems. On the one hand, CL suffers from an imbalance between one positive sample and many negative samples. On the other hand, positive items are so scarce in sparse datasets that their importance deserves extra emphasis. A further issue is that these sparse positive items are still not sufficiently exploited in recommendation. We therefore propose a new data augmentation method that uses multiple positive items (or samples) simultaneously with the CL loss function, yielding a Multi-Sample based Contrastive Loss (MSCL) that addresses both problems by rebalancing the importance of positive and negative samples and by data augmentation. Built on a graph convolution network (GCN) backbone, MSCL achieves state-of-the-art performance in our experiments. The proposed MSCL is simple and can be applied in many methods. We will release our code on GitHub upon acceptance.
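
The abstract does not spell out the loss formulation. The sketch below illustrates the general idea under stated assumptions: a PyTorch setup, an InfoNCE-style contrastive loss extended to several positive items per user, and a hypothetical `neg_weight` hyperparameter standing in for the paper's positive/negative rebalancing. It is not the authors' exact MSCL.

```python
import math
import torch
import torch.nn.functional as F

def multi_sample_contrastive_loss(user_emb, pos_item_emb, neg_item_emb,
                                  temperature=0.1, neg_weight=1.0):
    """Sketch of a multi-positive contrastive loss (assumed form, not the paper's exact MSCL).

    user_emb:     (B, d)    user embeddings
    pos_item_emb: (B, P, d) P positive items per user (multi-sample augmentation)
    neg_item_emb: (B, N, d) N sampled negative items per user
    """
    # Normalize so dot products are cosine similarities.
    user_emb = F.normalize(user_emb, dim=-1)
    pos_item_emb = F.normalize(pos_item_emb, dim=-1)
    neg_item_emb = F.normalize(neg_item_emb, dim=-1)

    # Temperature-scaled similarities.
    pos_logits = torch.einsum('bd,bpd->bp', user_emb, pos_item_emb) / temperature  # (B, P)
    neg_logits = torch.einsum('bd,bnd->bn', user_emb, neg_item_emb) / temperature  # (B, N)

    # log(sum_n exp(s_n)); the negatives are shared across all P positives of a user.
    neg_lse = torch.logsumexp(neg_logits, dim=1, keepdim=True)  # (B, 1)

    # neg_weight < 1 down-weights the many negatives relative to the scarce
    # positives inside the softmax denominator (kept stable via logaddexp).
    weighted_neg = neg_lse + math.log(neg_weight)

    # InfoNCE term per positive: -log( exp(s_p) / (exp(s_p) + w * sum_n exp(s_n)) )
    per_positive = -(pos_logits - torch.logaddexp(pos_logits, weighted_neg))  # (B, P)

    # Average over the P positives (the multi-sample augmentation) and the batch.
    return per_positive.mean()

# Usage with random embeddings: 256 users, 4 positives and 64 negatives each.
B, P, N, d = 256, 4, 64, 64
loss = multi_sample_contrastive_loss(torch.randn(B, d),
                                     torch.randn(B, P, d),
                                     torch.randn(B, N, d),
                                     temperature=0.1, neg_weight=0.5)
```

Setting neg_weight below 1 shrinks the negatives' share of the softmax denominator, emphasizing the scarce positives, while averaging over P positives uses several positive items at once, matching the two ideas the abstract describes.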

Related research

03/09/2021 · Doubly Contrastive Deep Clustering
Deep clustering successfully provides more effective features than conve...

10/23/2022 · Rethinking Rotation in Self-Supervised Contrastive Learning: Adaptive Positive or Negative Data Augmentation
Rotation is frequently listed as a candidate for data augmentation in co...

04/04/2023 · PartMix: Regularization Strategy to Learn Part Discovery for Visible-Infrared Person Re-identification
Modern data augmentation using a mixture-based technique can regularize ...

04/05/2022 · Positive and Negative Critiquing for VAE-based Recommenders
Providing explanations for recommended items allows users to refine the ...

07/04/2022 · Positive-Negative Equal Contrastive Loss for Semantic Segmentation
The contextual information is critical for various computer vision tasks...

07/28/2022 · Exploiting Negative Preference in Content-based Music Recommendation with Contrastive Learning
Advanced music recommendation systems are being introduced along with th...

03/10/2023 · Self-supervised Training Sample Difficulty Balancing for Local Descriptor Learning
In the case of an imbalance between positive and negative samples, hard ...
