Self-Supervised Contrastive BERT Fine-tuning for Fusion-based Reviewed-Item Retrieval

As natural language interfaces enable users to express increasingly complex natural language queries, there is a parallel explosion of user review content that can allow users to better find items such as restaurants, books, or movies that match these expressive queries. While Neural Information Retrieval (IR) methods have provided state-of-the-art results for matching queries to documents, they have not been extended to the task of Reviewed-Item Retrieval (RIR), where query-review scores must be aggregated (or fused) into item-level scores for ranking. In the absence of labeled RIR datasets, we extend Neural IR methodology to RIR by leveraging self-supervised methods for contrastive learning of BERT embeddings for both queries and reviews. Specifically, contrastive learning requires a choice of positive and negative samples, where the unique two-level structure of our item-review data combined with meta-data affords us a rich structure for the selection of these samples. For contrastive learning in a Late Fusion scenario, we investigate the use of positive review samples from the same item and/or with the same rating, selection of hard positive samples by choosing the least similar reviews from the same anchor item, and selection of hard negative samples by choosing the most similar reviews from different items. We also explore anchor sub-sampling and augmenting with meta-data. For a more end-to-end Early Fusion approach, we introduce contrastive item embedding learning to fuse reviews into single item embeddings. Experimental results show that Late Fusion contrastive learning for Neural RIR outperforms all other contrastive IR configurations, Neural IR, and sparse retrieval baselines, thus demonstrating the power of exploiting the two-level structure in Neural RIR approaches as well as the importance of preserving the nuance of individual review content via Late Fusion methods.
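To make the Late Fusion idea concrete, the sketch below scores a query against individual review embeddings and fuses the per-review similarities into item-level scores for ranking. The embeddings are random stand-ins for the contrastively fine-tuned BERT outputs, and top-k mean fusion is one plausible aggregation choice for illustration, not necessarily the exact fusion function used in the paper.

```python
import numpy as np

def cosine_sims(query_emb, review_embs):
    """Cosine similarity between one query vector (d,) and a review matrix (n, d)."""
    q = query_emb / np.linalg.norm(query_emb)
    R = review_embs / np.linalg.norm(review_embs, axis=1, keepdims=True)
    return R @ q

def late_fusion_scores(query_emb, item_reviews, k=3):
    """Late Fusion: score each review against the query, then fuse review
    scores into a single item score (here: mean of the top-k similarities).
    item_reviews maps item_id -> (n_reviews, d) embedding matrix."""
    scores = {}
    for item_id, R in item_reviews.items():
        sims = cosine_sims(query_emb, R)
        top_k = np.sort(sims)[-k:]  # keep only the k best-matching reviews
        scores[item_id] = float(top_k.mean())
    return scores

# Toy example with random embeddings in place of BERT query/review encodings.
rng = np.random.default_rng(0)
d = 8
query = rng.normal(size=d)
items = {"a": rng.normal(size=(5, d)), "b": rng.normal(size=(4, d))}
ranking = sorted(late_fusion_scores(query, items, k=2).items(),
                 key=lambda kv: -kv[1])
```

Because each review is scored against the query before fusion, a single highly relevant review can dominate an item's score, which is how Late Fusion preserves the nuance of individual review content that pooled (Early Fusion) item embeddings may wash out.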

Related research:

A Review-aware Graph Contrastive Learning Framework for Recommendation (04/26/2022)
Most modern recommender systems predict users' preferences with two compo...

Supervised Contrastive Learning for Recommendation (01/10/2022)
Compared with the traditional collaborative filtering methods, the graph...

Non-Contrastive Self-Supervised Learning of Utterance-Level Speech Representations (08/10/2022)
Considering the abundance of unlabeled speech data and the high labeling...

SIFN: A Sentiment-aware Interactive Fusion Network for Review-based Item Recommendation (08/18/2021)
Recent studies in recommender systems have managed to achieve significan...

Multimodal Contrastive Learning with Hard Negative Sampling for Human Activity Recognition (09/03/2023)
Human Activity Recognition (HAR) systems have been extensively studied b...

Item Tagging for Information Retrieval: A Tripartite Graph Neural Network based Approach (08/26/2020)
Tagging has been recognized as a successful practice to boost relevance...

Prototypical Contrastive Learning and Adaptive Interest Selection for Candidate Generation in Recommendations (11/23/2022)
Deep Candidate Generation plays an important role in large-scale recomme...
