Co-BERT: A Context-Aware BERT Retrieval Model Incorporating Local and Query-specific Context

04/17/2021
by Xiaoyang Chen, et al.

BERT-based text ranking models have dramatically advanced the state of the art in ad-hoc retrieval, yet most of them score individual query-document pairs independently. Meanwhile, the importance of modeling cross-document interactions and query-specific characteristics in a ranking model has been repeatedly confirmed, mostly in the context of learning to rank. BERT-based ranking models, however, have not fully incorporated these two types of ranking context, and thereby ignore both the inter-document relationships within a ranking and the differences among queries. To close this gap, this work proposes Co-BERT, an end-to-end transformer-based ranking model that exploits several BERT architectures to calibrate the query-document representations using pseudo relevance feedback before modeling the relevance of a group of documents jointly. Extensive experiments on two standard test collections confirm the effectiveness of the proposed model in improving text re-ranking over strong fine-tuned BERT-Base baselines. We plan to open-source our implementation to enable further comparisons.
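For intuition, here is a minimal sketch of the two-step idea the abstract describes: query-document representations are first calibrated against pseudo-relevance-feedback (PRF) documents, and the group of candidates is then scored jointly. It assumes PyTorch and HuggingFace transformers; the class name, the layer choices, and the PRF depth are illustrative assumptions, not the authors' implementation.

```python
import torch.nn as nn
from transformers import BertModel, BertTokenizer


class CoBERTSketch(nn.Module):
    """Groupwise re-ranker sketch: PRF-based calibration + joint scoring."""

    def __init__(self, model_name="bert-base-uncased", num_prf_docs=3):
        super().__init__()
        self.encoder = BertModel.from_pretrained(model_name)
        hidden = self.encoder.config.hidden_size
        self.num_prf_docs = num_prf_docs
        # Cross-attention: each candidate representation attends to the
        # top-ranked (pseudo-relevant) documents -- the "calibration" step.
        self.calibrate = nn.TransformerDecoderLayer(
            d_model=hidden, nhead=8, batch_first=True
        )
        # Self-attention over the whole candidate group -- joint groupwise
        # relevance modeling instead of independent per-pair scoring.
        self.groupwise = nn.TransformerEncoderLayer(
            d_model=hidden, nhead=8, batch_first=True
        )
        self.score = nn.Linear(hidden, 1)

    def forward(self, input_ids, attention_mask):
        # input_ids: (num_docs, seq_len), one "[CLS] query [SEP] doc" pair per
        # row, ordered by the first-stage ranking (rows 0..k-1 are PRF docs).
        out = self.encoder(input_ids=input_ids, attention_mask=attention_mask)
        reps = out.last_hidden_state[:, 0, :].unsqueeze(0)  # (1, num_docs, hidden)
        prf = reps[:, : self.num_prf_docs, :]               # PRF memory
        calibrated = self.calibrate(tgt=reps, memory=prf)   # calibrated reps
        joint = self.groupwise(calibrated)                  # joint interaction
        return self.score(joint).squeeze(-1)                # (1, num_docs) scores


# Usage: re-rank a small group of first-stage candidates for one query.
tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
query = "context aware bert ranking"
docs = ["first candidate text", "second candidate text",
        "third candidate text", "fourth candidate text"]
batch = tokenizer([query] * len(docs), docs, padding=True,
                  truncation=True, max_length=256, return_tensors="pt")
scores = CoBERTSketch()(batch["input_ids"], batch["attention_mask"])
```

The design choice worth noting is the split into two attention stages: cross-attention restricted to the PRF documents injects query-specific context into every candidate, while the subsequent self-attention layer lets candidates compete with one another, which is exactly the inter-document ranking context the abstract argues is missing from independent per-pair scoring.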


Related research

04/25/2022 · Groupwise Query Performance Prediction with BERT
While large-scale pre-trained language models like BERT have advanced th...

08/31/2023 · Context Aware Query Rewriting for Text Rankers using LLM
Query rewriting refers to an established family of approaches that are a...

01/12/2021 · On the Calibration and Uncertainty of Neural Learning to Rank Models
According to the Probability Ranking Principle (PRP), ranking documents ...

06/05/2019 · Context Attentive Document Ranking and Query Suggestion
We present a context-aware neural ranking model to exploit users' on-tas...

08/20/2020 · PARADE: Passage Representation Aggregation for Document Reranking
We present PARADE, an end-to-end Transformer-based model that considers ...

05/25/2023 · Enhancing the Ranking Context of Dense Retrieval Methods through Reciprocal Nearest Neighbors
Sparse annotation poses persistent challenges to training dense retrieva...
