BERT-QE: Contextualized Query Expansion for Document Re-ranking

09/15/2020
by Zhi Zheng, et al.

Query expansion aims to mitigate the mismatch between the language used in a query and in a document. However, query expansion methods risk introducing non-relevant information into the expanded query. To bridge this gap, and inspired by recent advances in applying contextualized models like BERT to document retrieval, this paper proposes a novel query expansion model that leverages the strength of BERT to better select relevant information for expansion. In evaluations on the standard TREC Robust04 and GOV2 test collections, the proposed BERT-QE model significantly outperforms BERT-Large models commonly used for document retrieval.
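The idea sketched in the abstract can be illustrated as a small pipeline: score feedback documents' chunks against the query, keep the best chunks as expansion evidence, and interpolate their document scores with the original query-document score. The sketch below is a minimal, hypothetical illustration: `rel` is a toy token-overlap scorer standing in for a BERT cross-encoder, and the function names, chunk size, and interpolation weight `alpha` are illustrative assumptions, not the paper's exact configuration.

```python
import math

def rel(query, text):
    # Toy relevance score: token overlap (stand-in for a BERT cross-encoder).
    q, t = set(query.lower().split()), set(text.lower().split())
    return len(q & t) / max(len(q), 1)

def chunks(text, size=5):
    # Slice a document into fixed-size token windows.
    toks = text.split()
    return [" ".join(toks[i:i + size]) for i in range(0, len(toks), size)]

def bert_qe_score(query, doc, feedback_docs, m=3, alpha=0.4):
    # Select the top-m chunks from feedback documents by query-chunk relevance.
    cands = [c for d in feedback_docs for c in chunks(d)]
    top = sorted(cands, key=lambda c: rel(query, c), reverse=True)[:m]
    # Softmax-weight the selected chunks by their query-chunk scores.
    exps = [math.exp(rel(query, c)) for c in top]
    z = sum(exps)
    # Interpolate the query-document score with chunk-document evidence.
    chunk_evidence = sum((e / z) * rel(c, doc) for e, c in zip(exps, top))
    return (1 - alpha) * rel(query, doc) + alpha * chunk_evidence
```

Documents that match both the query and the selected expansion chunks score higher than documents matching neither, which is the intended effect of filtering expansion evidence with a contextualized scorer.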

Related research

01/26/2023
BERT-Embedding and Citation Network Analysis based Query Expansion Technique for Scholarly Search
The enormous growth of research publications has made it challenging for...

03/09/2021
CEQE: Contextualized Embeddings for Query Expansion
In this work we leverage recent advances in context-sensitive language m...

04/29/2020
Expansion via Prediction of Importance with Contextualization
The identification of relevance with little textual context is a primary...

07/31/2021
Using Query Expansion in Manifold Ranking for Query-Oriented Multi-Document Summarization
Manifold ranking has been successfully applied in query-oriented multi-d...

09/15/2023
When do Generative Query and Document Expansions Fail? A Comprehensive Study Across Methods, Retrievers, and Datasets
Using large language models (LMs) for query or document expansion can im...

11/02/2019
GRAPHENE: A Precise Biomedical Literature Retrieval Engine with Graph Augmented Deep Learning and External Knowledge Empowerment
Effective biomedical literature retrieval (BLR) plays a central role in ...

07/24/2020
IR-BERT: Leveraging BERT for Semantic Search in Background Linking for News Articles
This work describes our two approaches for the background linking task o...
