UFNRec: Utilizing False Negative Samples for Sequential Recommendation

08/08/2022
by Xiaoyang Liu, et al.

Sequential recommendation models are primarily optimized to distinguish positive samples from negative ones during training, where negative sampling is an essential component in learning evolving user preferences from historical records. Beyond randomly drawing negatives from a uniform distribution, many elaborate methods have been proposed to mine high-quality negative samples. However, due to the inherent randomness of negative sampling, false negative samples are inevitably collected during training. Current strategies mainly focus on removing such false negatives, which overlooks potential user interests, reduces recommendation diversity, weakens model robustness, and suffers from exposure bias. To this end, we propose a novel method that Utilizes False Negative samples for sequential Recommendation (UFNRec) to improve model performance. We first devise a simple strategy to identify false negative samples and then convert them into positive samples in the subsequent training process. Furthermore, we construct a teacher model to provide soft labels for these false negative samples and design a consistency loss to regularize the predictions of the student model against those of the teacher model. To the best of our knowledge, this is the first work to utilize false negative samples for sequential recommendation rather than simply removing them. Experiments on three public benchmark datasets are conducted with three widely applied SOTA models. The results demonstrate that UFNRec can effectively draw information from false negative samples and further improve the performance of SOTA models. The code is available at https://github.com/UFNRec-code/UFNRec.
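The two mechanisms the abstract describes, flagging likely false negatives and regularizing them with a teacher's soft labels, can be sketched roughly as follows. This is a minimal illustrative sketch, not the paper's implementation: the flagging rule (a negative ranked above the positive for several consecutive epochs), the function names, and the MSE-style consistency loss are all assumptions made for exposition.

```python
# Hypothetical sketch of UFNRec's two core ideas (all names, thresholds,
# and the flagging heuristic are illustrative assumptions, not the
# paper's exact method).

def find_false_negatives(neg_rank_history, top_rank=1, patience=2):
    """Flag sampled negatives that the model has ranked at or above
    `top_rank` for the last `patience` epochs -- a simple proxy for a
    "false negative" (an item the user may actually like). Flagged
    items would then be relabeled as positives in later training."""
    flagged = []
    for item, ranks in neg_rank_history.items():
        recent = ranks[-patience:]
        if len(recent) == patience and all(r <= top_rank for r in recent):
            flagged.append(item)
    return flagged

def consistency_loss(student_probs, teacher_probs):
    """Mean squared error between student and teacher predictions on
    the flagged samples, pulling the student toward the teacher's
    soft labels."""
    n = len(student_probs)
    return sum((s - t) ** 2 for s, t in zip(student_probs, teacher_probs)) / n

# Toy usage: item 42 was ranked first for the last two epochs, so it is
# flagged as a likely false negative; item 7 is not.
history = {42: [5, 1, 1], 7: [3, 4, 2]}
fn_items = find_false_negatives(history)          # -> [42]
loss = consistency_loss([0.9, 0.2], [0.8, 0.1])   # -> 0.01
```

In practice the teacher would be a frozen or slowly updated copy of the recommender, and the consistency term would be added to the main ranking loss with a weighting coefficient.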


