Parameter-Efficient Transfer from Sequential Behaviors for User Modeling and Recommendation

01/13/2020
by   Fajie Yuan, et al.

Inductive transfer learning has greatly impacted computer vision and NLP, but its adoption in recommender systems remains largely unexplored. Meanwhile, although there is a large body of research on making direct recommendations from user behavior sequences, few works attempt to represent and transfer these behaviors for downstream tasks. In this paper, we look in particular at the task of effectively learning a single user representation that can be applied to a diversity of tasks, from cross-domain recommendation to user profile prediction. Fine-tuning a large pre-trained network and adapting it to downstream tasks is an effective way to solve this problem. However, fine-tuning is parameter-inefficient, since an entire model must be re-trained and stored for every new task. To overcome this issue, we develop a parameter-efficient transfer learning architecture, termed PeterRec, which can be configured on the fly for various downstream tasks. Specifically, PeterRec keeps the pre-trained parameters unaltered during fine-tuning by injecting a series of re-learned neural networks, which are small but as expressive as learning the entire network. We perform extensive experimental ablations to demonstrate the effectiveness of the learned user representation on five downstream tasks. Moreover, we show that PeterRec performs efficient transfer learning across multiple domains, achieving comparable or sometimes better performance than fine-tuning all model parameters.
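The injected re-learned networks are adapter-style bottleneck modules added around frozen pre-trained layers: the hidden state is projected down to a small dimension, transformed, projected back up, and added residually, so only the small projections need training per task. The following NumPy sketch illustrates the general idea under assumed, illustrative dimensions; the function and variable names (`model_patch`, `W_down`, `W_up`) are hypothetical and not the paper's actual implementation:

```python
import numpy as np

def model_patch(h, W_down, W_up):
    """Adapter-style bottleneck with a residual connection.

    h:      hidden state from a frozen pre-trained layer, shape (batch, d)
    W_down: trainable down-projection, shape (d, k) with k << d
    W_up:   trainable up-projection, shape (k, d)
    """
    return h + np.maximum(h @ W_down, 0) @ W_up  # ReLU bottleneck, residual add

d, k = 64, 8  # illustrative hidden and bottleneck sizes
rng = np.random.default_rng(0)

h = rng.normal(size=(1, d))                    # stand-in for a frozen layer's output
W_down = rng.normal(scale=0.01, size=(d, k))   # small random init
W_up = np.zeros((k, d))                        # zero init: patch starts as the identity

out = model_patch(h, W_down, W_up)

# Per-task trainable parameters vs. one dense layer of the frozen backbone.
patch_params = W_down.size + W_up.size  # 2 * d * k
full_params = d * d                     # d * d for a comparable dense layer
```

With `W_up` initialized to zero, the patch initially passes the pre-trained representation through unchanged, and fine-tuning only has to learn the 2·d·k patch parameters (1,024 here) instead of the d·d parameters of a full layer (4,096), which is what makes the transfer parameter-efficient.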


