Generative Query Reformulation for Effective Adhoc Search

08/01/2023
by Xiao Wang, et al.

Performing automatic reformulation of a user's query is a popular paradigm in information retrieval (IR) for improving effectiveness, as exemplified by pseudo-relevance feedback approaches, which expand the query to alleviate the vocabulary mismatch problem. Recent advances in generative language models have demonstrated their ability to generate responses that are relevant to a given prompt. In light of this success, we study the capacity of such models to perform query reformulation and how they compare with long-standing query reformulation methods that use pseudo-relevance feedback. In particular, we investigate two representative query reformulation frameworks, GenQR and GenPRF. GenQR directly reformulates the user's input query, while GenPRF provides additional context for the query by making use of pseudo-relevance feedback information. For each reformulation framework, we leverage different techniques, including fine-tuning and direct prompting, to harness the knowledge of language models. The reformulated queries produced by the generative models markedly benefit the effectiveness of a state-of-the-art retrieval pipeline on four TREC test collections (ranging from TREC 2004 Robust to TREC 2019 Deep Learning). Furthermore, our results indicate that the studied generative models can outperform various statistical query expansion approaches while remaining comparable to more complex neural query reformulation models, with the added benefit of being simpler to implement.
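As a rough illustration of the two frameworks described in the abstract, the sketch below prompts a generic instruction-tuned sequence-to-sequence model once with the query alone (a GenQR-style reformulation) and once with pseudo-relevance feedback text added to the prompt (a GenPRF-style reformulation). The model choice (google/flan-t5-base), the prompt templates, the top-3 passage cut-off and the step of appending the generated terms to the original query are assumptions made for illustration only; the paper also studies fine-tuned variants, which this sketch does not cover.

```python
# Minimal sketch of GenQR- and GenPRF-style query reformulation with a generic
# instruction-tuned seq2seq model via Hugging Face transformers. Model, prompt
# wording and cut-offs are illustrative assumptions, not the paper's exact setup.
from transformers import pipeline

generator = pipeline("text2text-generation", model="google/flan-t5-base")


def genqr(query: str) -> str:
    """GenQR-style reformulation: prompt the model with the query alone."""
    prompt = (
        "Improve the following search query by adding related terms.\n"
        f"Query: {query}\nImproved query:"
    )
    out = generator(prompt, max_new_tokens=32, num_beams=4)
    return out[0]["generated_text"]


def genprf(query: str, feedback_passages: list[str]) -> str:
    """GenPRF-style reformulation: additionally condition on pseudo-relevance
    feedback text, e.g. passages from the top-ranked documents of a first-pass
    retrieval run."""
    context = " ".join(feedback_passages[:3])  # top-3 passages, an arbitrary cut-off
    prompt = (
        "Using the context, suggest expansion terms for the search query.\n"
        f"Context: {context}\nQuery: {query}\nExpansion terms:"
    )
    out = generator(prompt, max_new_tokens=32, num_beams=4)
    # Append the generated terms to the original query so the original intent
    # is preserved while new expansion terms are added.
    return f"{query} {out[0]['generated_text']}"


if __name__ == "__main__":
    print(genqr("hubble telescope achievements"))
    print(genprf(
        "hubble telescope achievements",
        ["The Hubble Space Telescope has produced deep-field images...",
         "Servicing missions extended Hubble's scientific lifetime..."],
    ))
```

In a full pipeline, the reformulated query would be passed back to the retrieval system (e.g. for a second-pass ranking), and the feedback passages would come from the top-ranked documents of the initial retrieval rather than being supplied by hand as above.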

Related research

Generative Relevance Feedback with Large Language Models (04/25/2023)
Current query expansion models use pseudo-relevance feedback to improve ...

Generative and Pseudo-Relevant Feedback for Sparse, Dense and Learned Sparse Retrieval (05/12/2023)
Pseudo-relevance feedback (PRF) is a classical approach to address lexic...

GQE-PRF: Generative Query Expansion with Pseudo-Relevance Feedback (08/13/2021)
Query expansion with pseudo-relevance feedback (PRF) is a powerful appro...

A Modern Perspective on Query Likelihood with Deep Generative Retrieval Models (06/25/2021)
Existing neural ranking models follow the text matching paradigm, where ...

From Little Things Big Things Grow: A Collection with Seed Studies for Medical Systematic Review Literature Search (04/06/2022)
Medical systematic review query formulation is a highly complex task don...

LoL: A Comparative Regularization Loss over Query Reformulation Losses for Pseudo-Relevance Feedback (04/25/2022)
Pseudo-relevance feedback (PRF) has proven to be an effective query refo...

Pseudo Relevance Feedback with Deep Language Models and Dense Retrievers: Successes and Pitfalls (08/25/2021)
Pseudo Relevance Feedback (PRF) is known to improve the effectiveness of...
