Pre-train, Prompt and Recommendation: A Comprehensive Survey of Language Modelling Paradigm Adaptations in Recommender Systems

02/07/2023
by Peng Liu, et al.

Pre-trained Language Models (PLMs) have achieved tremendous success in the field of Natural Language Processing (NLP) by learning universal representations on large corpora in a self-supervised manner. The pre-trained models and the learned representations can benefit a wide range of downstream NLP tasks. This training paradigm has recently been adapted to the recommendation domain and is considered a promising approach by both academia and industry. In this paper, we systematically investigate how to extract and transfer knowledge from pre-trained models learned by different PLM-related training paradigms to improve recommendation performance from various perspectives, such as generality, sparsity, efficiency and effectiveness. Specifically, we propose an orthogonal taxonomy that divides existing PLM-based recommender systems w.r.t. their training strategies and objectives. Then, we analyze and summarize the connections between PLM-based training paradigms and the different input data types used by recommender systems. Finally, we elaborate on open issues and future research directions in this vibrant field.
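To make the "pre-train, then adapt" paradigm described above concrete, here is a minimal toy sketch in Python. It is an illustrative assumption, not the survey's method: a random matrix stands in for item representations learned by a PLM during pre-training, and the adaptation stage is a simple dot-product recommendation head over the user's history.

```python
# Toy sketch of the pre-train -> adapt paradigm for recommendation.
# The embeddings and data here are illustrative stand-ins, not a real PLM.
import numpy as np

rng = np.random.default_rng(0)

# Stage 1 (pre-training, simulated): in practice these vectors would be
# learned self-supervised on a large corpus, e.g. from item text.
n_items, dim = 10, 8
pretrained_item_emb = rng.normal(size=(n_items, dim))

def recommend(history, k=3):
    """Stage 2 (adaptation): score all items against the mean of the
    user's interaction history in the pre-trained embedding space."""
    user_vec = pretrained_item_emb[history].mean(axis=0)
    scores = pretrained_item_emb @ user_vec   # dot-product relevance
    scores[history] = -np.inf                 # mask already-seen items
    return np.argsort(scores)[::-1][:k]      # top-k unseen items

top_k = recommend([0, 2, 5])
print(top_k)
```

The point of the separation is that the expensive representation learning happens once, upstream; the recommendation head can then be kept lightweight, which is exactly the efficiency and sparsity benefit the paradigm promises.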


Related research

03/18/2020  Pre-trained Models for Natural Language Processing: A Survey
Recently, the emergence of pre-trained models (PTMs) has brought natural...

05/31/2023  A Survey on Large Language Models for Recommendation
Large Language Models (LLMs) have emerged as powerful tools in the field...

06/09/2023  How Can Recommender Systems Benefit from Large Language Models: A Survey
Recommender systems (RS) play important roles to match users' informatio...

03/29/2022  Self-Supervised Learning for Recommender Systems: A Survey
Neural architecture-based recommender systems have achieved tremendous s...

09/26/2021  Paradigm Shift in Natural Language Processing
In the era of deep learning, modeling for most NLP tasks has converged t...

05/24/2023  Exploring Adapter-based Transfer Learning for Recommender Systems: Empirical Studies and Practical Insights
Adapters, a plug-in neural network module with some tunable parameters, ...

05/18/2023  A Survey on Time-Series Pre-Trained Models
Time-Series Mining (TSM) is an important research area since it shows gr...
