Investigating the Successes and Failures of BERT for Passage Re-Ranking

05/05/2019
by Harshith Padigela, et al.

The Bidirectional Encoder Representations from Transformers (BERT) model has recently advanced the state of the art in passage re-ranking. In this paper, we analyze the results produced by a fine-tuned BERT model to better understand the reasons behind such substantial improvements. To this end, we focus on the MS MARCO passage re-ranking dataset and examine potential reasons for the successes and failures of BERT in retrieval. Specifically, we empirically test a set of hypotheses and provide additional analysis to explain BERT's strong performance.
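
For context, the re-ranker studied here follows the cross-encoder recipe popularized by Nogueira and Cho (2019): the query and each candidate passage are packed into a single BERT input, and the fine-tuned classification head's relevance probability is used to sort the candidates. The sketch below illustrates that scoring loop using the Hugging Face transformers library; the bert-base-uncased checkpoint, the rerank helper, and the example query and passages are illustrative placeholders, and a checkpoint actually fine-tuned on MS MARCO relevance labels would be needed to produce meaningful scores.

```python
# Minimal sketch of BERT-based passage re-ranking (cross-encoder style).
# Model name, helper, and example inputs are placeholders, not the authors' code.
import torch
from transformers import BertTokenizer, BertForSequenceClassification

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertForSequenceClassification.from_pretrained("bert-base-uncased", num_labels=2)
model.eval()

def rerank(query, passages):
    """Score each (query, passage) pair with BERT and sort by relevance."""
    scores = []
    for passage in passages:
        # Encode the pair as [CLS] query [SEP] passage [SEP], truncating long inputs.
        inputs = tokenizer(query, passage, truncation=True,
                           max_length=512, return_tensors="pt")
        with torch.no_grad():
            logits = model(**inputs).logits
        # The probability of the "relevant" class serves as the ranking score.
        scores.append(torch.softmax(logits, dim=-1)[0, 1].item())
    return sorted(zip(passages, scores), key=lambda x: x[1], reverse=True)

query = "what advanced passage re-ranking recently"
candidates = [
    "BERT jointly attends over query and passage tokens in one sequence.",
    "The weather in Boston is mild in spring.",
]
for passage, score in rerank(query, candidates):
    print(f"{score:.3f}  {passage}")
```

Because every query-passage pair requires a full forward pass, this setup is typically applied only to a shortlist of candidates produced by a cheaper first-stage retriever such as BM25.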

Related research

09/17/2020 · DSC IIT-ISM at SemEval-2020 Task 6: Boosting BERT with Dependencies for Definition Extraction
We explore the performance of Bidirectional Encoder Representations from...

04/05/2022 · How Different are Pre-trained Transformers for Text Ranking?
In recent years, large pre-trained transformers have led to substantial ...

01/31/2022 · Learning affective meanings that derives the social behavior using Bidirectional Encoder Representations from Transformers
Predicting the outcome of a process requires modeling the system dynamic...

10/12/2022 · RankT5: Fine-Tuning T5 for Text Ranking with Ranking Losses
Recently, substantial progress has been made in text ranking based on pr...

11/11/2019 · Understanding BERT performance in propaganda analysis
In this paper, we describe our system used in the shared task for fine-g...

10/19/2020 · BERTnesia: Investigating the capture and forgetting of knowledge in BERT
Probing complex language models has recently revealed several insights i...

05/02/2023 · Cancer Hallmark Classification Using Bidirectional Encoder Representations From Transformers
This paper presents a novel approach to accurately classify the hallmark...
