Pseudo-Relevance Feedback for Multiple Representation Dense Retrieval

06/21/2021
by Xiao Wang, et al.

Pseudo-relevance feedback mechanisms, from Rocchio to the relevance models, have shown the usefulness of expanding and reweighting the users' initial queries using information occurring in an initial set of retrieved documents, known as the pseudo-relevant set. Recently, dense retrieval – through the use of neural contextual language models such as BERT for analysing the documents' and queries' contents and computing their relevance scores – has shown promising performance on several information retrieval tasks, compared to classical approaches that rely on the traditional inverted index for identifying documents relevant to a query. Two different dense retrieval families have emerged: the use of a single embedded representation for each passage and query (e.g. using BERT's [CLS] token), or the use of multiple representations (e.g. using an embedding for each token of the query and document). In this work, we conduct the first study into the potential for multiple representation dense retrieval to be enhanced using pseudo-relevance feedback. In particular, based on the pseudo-relevant set of documents identified using a first-pass dense retrieval, we extract representative feedback embeddings (using KMeans clustering) – while ensuring that these embeddings discriminate among passages (based on IDF) – which are then added to the query representation. These additional feedback embeddings are shown to enhance the effectiveness of both a reranking and an additional dense retrieval operation. Indeed, experiments on the MSMARCO passage ranking dataset show that MAP can be improved by up to 26% on the TREC 2019 query set and 10% on the TREC 2020 query set by the proposed ColBERT-PRF method on a ColBERT dense retrieval approach.
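The expansion step the abstract describes – cluster the token embeddings of the pseudo-relevant passages, keep the centroids whose nearest token is discriminative by IDF, and append them to the query embeddings – can be sketched as follows. This is a minimal numpy-only illustration, not the authors' implementation: the function name `colbert_prf_expand`, its parameters, and the plain Lloyd's-algorithm KMeans are all assumptions made for the sketch.

```python
import numpy as np

def colbert_prf_expand(query_embs, feedback_embs, feedback_token_ids,
                       idf, k=24, f_e=10, beta=1.0, n_iter=20, seed=0):
    """Hypothetical sketch of ColBERT-PRF style query expansion.

    query_embs:        (q, d) query token embeddings
    feedback_embs:     (n, d) token embeddings from the pseudo-relevant passages
    feedback_token_ids:(n,)   token ids aligned with feedback_embs
    idf:               dict mapping token id -> IDF weight
    k:                 number of KMeans clusters
    f_e:               number of feedback embeddings kept
    beta:              weight applied to the appended feedback embeddings
    """
    rng = np.random.default_rng(seed)

    # Plain KMeans (Lloyd's algorithm) over the feedback token embeddings.
    centroids = feedback_embs[rng.choice(len(feedback_embs), k, replace=False)]
    for _ in range(n_iter):
        d2 = ((feedback_embs[:, None, :] - centroids[None, :, :]) ** 2).sum(-1)
        assign = d2.argmin(1)
        for c in range(k):
            members = feedback_embs[assign == c]
            if len(members):
                centroids[c] = members.mean(0)

    # Map each centroid to its nearest feedback token and score it by IDF,
    # so that only discriminative centroids are kept.
    d2 = ((centroids[:, None, :] - feedback_embs[None, :, :]) ** 2).sum(-1)
    nearest_tok = feedback_token_ids[d2.argmin(1)]
    scores = np.array([idf.get(int(t), 0.0) for t in nearest_tok])
    keep = np.argsort(-scores)[:f_e]

    # Append the beta-weighted feedback centroids to the query representation.
    return np.vstack([query_embs, beta * centroids[keep]])
```

The expanded matrix can then be used in place of the original query embeddings, either to rerank a candidate set or to issue a second dense retrieval pass, matching the two uses of the feedback embeddings reported in the abstract.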


Related research:

- Improving Query Representations for Dense Retrieval with Pseudo Relevance Feedback (08/30/2021)
- LoL: A Comparative Regularization Loss over Query Reformulation Losses for Pseudo-Relevance Feedback (04/25/2022)
- Offline Pseudo Relevance Feedback for Efficient and Effective Single-pass Dense Retrieval (08/20/2023)
- Incorporating Relevance Feedback for Information-Seeking Retrieval using Few-Shot Document Re-Ranking (10/19/2022)
- On Single and Multiple Representations in Dense Passage Retrieval (08/13/2021)
- Isotropic Representation Can Improve Dense Retrieval (09/01/2022)
- UHD-BERT: Bucketed Ultra-High Dimensional Sparse Representations for Full Ranking (04/15/2021)
