BERT-QE: Contextualized Query Expansion for Document Re-ranking

by Zhi Zheng, et al.

Query expansion aims to mitigate the mismatch between the language used in a query and in a document. However, query expansion methods can introduce non-relevant information when expanding the query. To bridge this gap, and inspired by recent advances in applying contextualized models like BERT to document retrieval, this paper proposes a novel query expansion model that leverages the strength of BERT to better select relevant information for expansion. In evaluations on the standard TREC Robust04 and GOV2 test collections, the proposed BERT-QE model significantly outperforms BERT-Large models commonly used for document retrieval.
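The core idea the abstract describes, selecting relevant chunks from feedback documents and combining their evidence with the direct query-document score, can be sketched as follows. This is a simplified illustration, not the paper's implementation: `score(a, b)` is a stand-in for a BERT relevance model (any callable returning a float works here), and the interpolation weight `alpha` is an assumed hyperparameter.

```python
import math

def bert_qe_score(query, doc, feedback_chunks, score, alpha=0.4):
    """Hedged sketch of a BERT-QE-style final scoring step.

    feedback_chunks: text chunks taken from top-ranked feedback documents.
    score: placeholder for a BERT-based relevance function score(text_a, text_b).
    """
    # Weight each feedback chunk by its relevance to the query (softmax
    # over query-chunk scores, computed with a max-shift for stability).
    qc = [score(query, c) for c in feedback_chunks]
    m = max(qc)
    exps = [math.exp(s - m) for s in qc]
    z = sum(exps)
    weights = [e / z for e in exps]

    # Chunk-based evidence: weighted sum of chunk-document relevance.
    chunk_evidence = sum(
        w * score(c, doc) for w, c in zip(weights, feedback_chunks)
    )

    # Interpolate the direct query-document score with the chunk evidence.
    return (1 - alpha) * score(query, doc) + alpha * chunk_evidence
```

With a toy word-overlap `score`, a chunk that shares terms with the query dominates the softmax weights, so its agreement with the candidate document drives the expansion term.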




