Curriculum Learning for Dense Retrieval Distillation

04/28/2022
by Hansi Zeng, et al.

Recent work has shown that more effective dense retrieval models can be obtained by distilling ranking knowledge from an existing base re-ranking model. In this paper, we propose a generic curriculum learning based optimization framework called CL-DRD that controls the difficulty level of training data produced by the re-ranking (teacher) model. CL-DRD iteratively optimizes the dense retrieval (student) model by increasing the difficulty of the knowledge distillation data made available to it. In more detail, we initially provide the student model with coarse-grained preference pairs between documents in the teacher's ranking and progressively move towards finer-grained pairwise document ordering requirements. In our experiments, we apply a simple implementation of the CL-DRD framework to enhance two state-of-the-art dense retrieval models. Experiments on three public passage retrieval datasets demonstrate the effectiveness of our proposed framework.
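
To make the curriculum idea concrete, the sketch below shows one way a schedule like this could turn a teacher's ranked list into progressively harder preference pairs. The function name `make_pairs`, the group-size schedule, and the three-stage curriculum are illustrative assumptions, not the paper's actual configuration: documents are split into consecutive groups, and the student is only asked to order documents across different groups, so large groups early in training yield coarse preferences while smaller groups later demand a full pairwise ordering of the top results.

```python
# Illustrative sketch of a curriculum for distillation data, inspired by the
# CL-DRD idea described in the abstract. All names and group sizes here are
# hypothetical choices for the example, not the paper's implementation.
from itertools import combinations
from typing import List, Tuple


def make_pairs(teacher_ranking: List[str],
               group_sizes: List[int]) -> List[Tuple[str, str]]:
    """Turn a teacher-ranked document list into (preferred, non-preferred) pairs.

    The ranking is cut into consecutive groups of the given sizes; any
    documents beyond the listed sizes fall into a final "rest" group. A
    training pair is produced only when two documents belong to different
    groups, so coarse groupings yield few, easy constraints and per-document
    groups require the student to reproduce the full ordering.
    """
    # Assign each document a group index according to the current stage.
    group_of = {}
    start = 0
    for g, size in enumerate(group_sizes):
        for doc in teacher_ranking[start:start + size]:
            group_of[doc] = g
        start += size
    for doc in teacher_ranking[start:]:
        group_of[doc] = len(group_sizes)  # remaining documents: "rest" group

    pairs = []
    for hi, lo in combinations(teacher_ranking, 2):  # hi is ranked above lo
        if group_of[hi] < group_of[lo]:
            pairs.append((hi, lo))  # student should score hi above lo
    return pairs


# Example curriculum: from one coarse split (top-5 vs. the rest) to
# per-document groups that demand a full ordering of the top-5.
curriculum = [
    [5],              # stage 1: only "top-5 beats the rest"
    [2, 3],           # stage 2: finer split inside the top ranks
    [1, 1, 1, 1, 1],  # stage 3: full pairwise ordering of the top-5
]

ranking = [f"doc{i}" for i in range(8)]
for stage, sizes in enumerate(curriculum, 1):
    pairs = make_pairs(ranking, sizes)
    print(f"stage {stage}: {len(pairs)} preference pairs")
```

Running the example, each successive stage produces more (and finer-grained) ordering constraints from the same teacher ranking, which is the sense in which the distillation data gets harder over time.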

Related research

09/27/2022
PROD: Progressive Distillation for Dense Retrieval
Knowledge distillation is an effective way to transfer knowledge from a ...

09/19/2018
Ranking Distillation: Learning Compact Ranking Models With High Performance for Recommender System
We propose a novel way to train ranking models, such as recommender syst...

12/10/2022
LEAD: Liberal Feature-based Distillation for Dense Retrieval
Knowledge distillation is often used to transfer knowledge from a strong...

08/29/2022
LED: Lexicon-Enlightened Dense Retriever for Large-Scale Retrieval
Retrieval models based on dense representations in semantic space have b...

05/29/2023
GripRank: Bridging the Gap between Retrieval and Generation via the Generative Knowledge Improved Passage Ranking
Retrieval-enhanced text generation, which aims to leverage passages retr...

03/02/2023
Distillation from Heterogeneous Models for Top-K Recommendation
Recent recommender systems have shown remarkable performance by using an...

10/22/2020
Distilling Dense Representations for Ranking using Tightly-Coupled Teachers
We present an approach to ranking with dense representations that applie...
