Contextualized Word Representations for Document Re-Ranking

04/15/2019
by Sean MacAvaney, et al.

Although considerable attention has been given to neural ranking architectures recently, far less attention has been paid to the term representations that are used as input to these models. In this work, we investigate how two pretrained contextualized language models (ELMo and BERT) can be utilized for ad-hoc document ranking. Through experiments on TREC benchmarks, we find that several existing neural ranking architectures can benefit from the additional context provided by contextualized language models. Furthermore, we propose a joint approach that incorporates BERT's classification vector into existing neural models and show that it outperforms state-of-the-art ad-hoc ranking baselines. We also address practical challenges in using these models for ranking, including the maximum input length imposed by BERT and runtime performance impacts of contextualized language models.
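
To make the two ideas in the abstract concrete, here is a minimal sketch (not the authors' released code) of how contextualized term vectors and BERT's [CLS] classification vector might be extracted for a query-document pair, with the document split into chunks so each input respects BERT's maximum length. It assumes the HuggingFace transformers library and bert-base-uncased; the function name encode_chunks and the chunking scheme are illustrative choices, not the paper's exact configuration.

```python
# Minimal sketch (assumed setup, not the authors' released code):
# (1) feed BERT's contextualized token vectors to a ranking model, and
# (2) expose BERT's [CLS] classification vector for a joint scoring approach.
import torch
from transformers import BertTokenizer, BertModel

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
bert = BertModel.from_pretrained("bert-base-uncased")
bert.eval()

MAX_LEN = 512  # BERT's maximum input length, the practical limit the abstract mentions

def encode_chunks(query: str, doc: str):
    """Split a long document into chunks so each [CLS] query [SEP] chunk [SEP]
    sequence fits within BERT's input limit, then encode each chunk."""
    q_ids = tokenizer.encode(query, add_special_tokens=False)
    d_ids = tokenizer.encode(doc, add_special_tokens=False)
    budget = MAX_LEN - len(q_ids) - 3  # reserve room for [CLS] and two [SEP]s
    chunks = [d_ids[i:i + budget] for i in range(0, len(d_ids), budget)]
    cls_vecs, token_vecs = [], []
    for chunk in chunks:
        ids = tokenizer.build_inputs_with_special_tokens(q_ids, chunk)
        with torch.no_grad():
            out = bert(torch.tensor([ids]))
        hidden = out.last_hidden_state[0]   # (seq_len, 768)
        cls_vecs.append(hidden[0])          # [CLS] vector for this chunk
        token_vecs.append(hidden[1:])       # contextualized vectors for query/doc terms
    # Mean-pool the per-chunk [CLS] vectors into one document-level vector.
    return torch.stack(cls_vecs).mean(dim=0), token_vecs
```

Under this sketch, the per-chunk token vectors would serve as the term representations fed to an existing neural ranking architecture, and the mean-pooled [CLS] vector could be concatenated into that model's final scoring layer, one plausible wiring of the joint approach the abstract describes.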

Related research

TPRM: A Topic-based Personalized Ranking Model for Web Search (08/13/2021)
Ranking models have achieved promising results, but it remains challengi...

ABNIRML: Analyzing the Behavior of Neural IR Models (11/02/2020)
Numerous studies have demonstrated the effectiveness of pretrained conte...

Table Search Using a Deep Contextualized Language Model (05/19/2020)
Pretrained contextualized language models such as BERT have achieved imp...

Hybrid Ranking Network for Text-to-SQL (08/11/2020)
In this paper, we study how to leverage pre-trained language models in T...

Finding Inverse Document Frequency Information in BERT (02/24/2022)
For many decades, BM25 and its variants have been the dominant document ...

Simple Applications of BERT for Ad Hoc Document Retrieval (03/26/2019)
Following recent successes in applying BERT to question answering, we ex...
