Bridging the Training-Inference Gap for Dense Phrase Retrieval

10/25/2022
by Gyuwan Kim, et al.

Building dense retrievers requires a series of standard procedures, including training and validating neural models and creating indexes for efficient search. However, these procedures are often misaligned, in that training objectives do not exactly reflect the retrieval scenario at inference time. In this paper, we explore how the gap between training and inference in dense retrieval can be reduced, focusing on dense phrase retrieval (Lee et al., 2021), where billions of representations are indexed at inference. Since validating every dense retriever with a large-scale index is practically infeasible, we propose an efficient way of validating dense retrievers using a small subset of the entire corpus. This allows us to validate various training strategies, including unifying contrastive loss terms and using hard negatives for phrase retrieval, which largely reduces the training-inference discrepancy. As a result, we improve top-1 phrase retrieval accuracy by 2-3 points and top-20 passage retrieval accuracy by 2-4 points for open-domain question answering. Our work urges modeling dense retrievers with careful consideration of training and inference via efficient validation, while advancing phrase retrieval as a general solution for dense retrieval.
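The training strategy mentioned above can be illustrated with a minimal sketch of an InfoNCE-style contrastive loss that scores a query against one positive phrase and a set of hard negatives. This is an assumption-laden illustration of the general technique, not the paper's exact objective; the function names and toy vectors are hypothetical.

```python
import math

def dot(u, v):
    """Inner-product similarity between two dense vectors."""
    return sum(a * b for a, b in zip(u, v))

def contrastive_loss(query, positive, hard_negatives):
    """InfoNCE-style contrastive loss (a sketch, not the paper's
    exact formulation): softmax over the positive and the hard
    negatives, returning the negative log-probability of the positive."""
    scores = [dot(query, positive)] + [dot(query, n) for n in hard_negatives]
    m = max(scores)  # subtract the max for numerical stability
    exps = [math.exp(s - m) for s in scores]
    return -math.log(exps[0] / sum(exps))

# Toy example: the loss is low when the query aligns with the positive
# and high when it aligns with a hard negative instead.
good = contrastive_loss([1.0, 0.0], [1.0, 0.0], [[0.0, 1.0]])
bad = contrastive_loss([1.0, 0.0], [0.0, 1.0], [[1.0, 0.0]])
```

Training with hard negatives, as opposed to only in-batch negatives, brings the loss closer to the inference-time setting, where the index contains many near-miss phrases that the model must rank below the correct one.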


Related research:

- 12/23/2020 · Learning Dense Representations of Phrases at Scale
  Open-domain question answering can be reformulated as a phrase retrieval...
- 06/28/2023 · Confidence-Calibrated Ensemble Dense Phrase Retrieval
  In this paper, we consider the extent to which the transformer-based Den...
- 09/16/2021 · Phrase Retrieval Learns Passage Retrieval, Too
  Dense retrieval methods have shown great promise over sparse retrieval m...
- 10/13/2021 · Salient Phrase Aware Dense Retrieval: Can a Dense Retriever Imitate a Sparse One?
  Despite their recent popularity and well known advantages, dense retriev...
- 11/07/2019 · Contextualized Sparse Representation with Rectified N-Gram Attention for Open-Domain Question Answering
  A sparse representation is known to be an effective means to encode prec...
- 12/14/2021 · Boosted Dense Retriever
  We propose DrBoost, a dense retrieval ensemble inspired by boosting. DrB...
- 03/19/2022 · Clickbait Spoiling via Question Answering and Passage Retrieval
  We introduce and study the task of clickbait spoiling: generating a shor...
