Collaborative Distillation for Top-N Recommendation

11/13/2019
by Jae-woong Lee, et al.

Knowledge distillation (KD) is a well-known method for reducing inference latency by compressing a cumbersome teacher model into a small student model. Despite the success of KD in classification tasks, applying KD to recommender models is challenging because of the sparsity of positive feedback, the ambiguity of missing feedback, and the ranking problem inherent in top-N recommendation. To address these issues, we propose a new KD model for the collaborative filtering approach, namely collaborative distillation (CD). Specifically, (1) we reformulate the loss function to handle the ambiguity of missing feedback, (2) we exploit probabilistic rank-aware sampling for top-N recommendation, and (3) to train the proposed model effectively, we develop two training strategies for the student model, called teacher-guided and student-guided training, which select the most useful feedback from the teacher model. Experimental results demonstrate that the proposed model outperforms the state-of-the-art method by 2.7-33.2% in hit rate (HR), with corresponding gains in normalized discounted cumulative gain (NDCG). Moreover, the proposed model achieves performance comparable to the teacher model.
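As a rough illustration of the rank-aware sampling idea, the Python sketch below draws unobserved items with a probability that decays with the teacher's rank, so items the teacher ranks highly are distilled to the student more often. The function and parameter names (rank_aware_sample, temperature) are hypothetical and only sketch the general technique under our own assumptions; they are not the paper's implementation.

    import numpy as np

    def rank_aware_sample(teacher_scores, observed_items, num_samples, temperature=10.0):
        """Sample unobserved items with probability decreasing in the teacher's rank.

        teacher_scores : 1-D array of the teacher's predicted scores for all items of one user.
        observed_items : indices of items with observed positive feedback (excluded from sampling).
        num_samples    : number of items to distill for this user.
        temperature    : controls how quickly the sampling probability decays with rank.
        """
        candidates = np.setdiff1d(np.arange(len(teacher_scores)), observed_items)
        # Rank the candidates by the teacher's score (rank 0 = highest score).
        ranks = np.argsort(-teacher_scores[candidates]).argsort()
        # Probability decays exponentially with rank, so top-ranked items dominate.
        probs = np.exp(-ranks / temperature)
        probs /= probs.sum()
        return np.random.choice(candidates, size=num_samples, replace=False, p=probs)

In a setup like this, the student would typically be trained on the observed positive items plus the sampled items, with the teacher's scores on the sampled items serving as soft targets; this is only a plausible reading of the abstract, not the authors' exact procedure.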


Related research

09/19/2018 - Ranking Distillation: Learning Compact Ranking Models With High Performance for Recommender System
We propose a novel way to train ranking models, such as recommender syst...

12/08/2020 - DE-RRD: A Knowledge Distillation Framework for Recommender System
Recent recommender systems have started to employ knowledge distillation...

09/08/2021 - Dual Correction Strategy for Ranking Distillation in Top-N Recommender System
Knowledge Distillation (KD), which transfers the knowledge of a well-tra...

06/05/2021 - Bidirectional Distillation for Top-K Recommender System
Recommender systems (RS) have started to employ knowledge distillation, ...

07/15/2021 - An Educational System for Personalized Teacher Recommendation in K-12 Online Classrooms
In this paper, we propose a simple yet effective solution to build pract...

10/13/2021 - False Negative Distillation and Contrastive Learning for Personalized Outfit Recommendation
Personalized outfit recommendation has recently been in the spotlight wi...

01/18/2022 - Emergent Instabilities in Algorithmic Feedback Loops
Algorithms that aid human tasks, such as recommendation systems, are ubi...
