Rethink Training of BERT Rerankers in Multi-Stage Retrieval Pipeline

01/21/2021
by Luyu Gao, et al.

Pre-trained deep language models (LMs) have advanced the state of the art in text retrieval. Rerankers fine-tuned from deep LMs estimate candidate relevance based on rich contextualized matching signals. Meanwhile, deep LMs can also be leveraged to improve the search index, building retrievers with better recall. One would expect a straightforward combination of the two in a pipeline to yield additive performance gains. In this paper, we discover otherwise: the popular reranker cannot fully exploit the improved retrieval results. We therefore propose Localized Contrastive Estimation (LCE) for training rerankers and demonstrate that it significantly improves deep two-stage models.
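The abstract does not spell out the loss, but contrastive estimation "localized" to the pipeline amounts to contrasting each positive passage against negatives drawn from the target retriever's own top candidates, rather than random or BM25 negatives. Below is a minimal sketch in PyTorch, assuming a cross-encoder reranker that emits one relevance logit per (query, passage) pair; the function and variable names are illustrative, not taken from the authors' code.

```python
import torch
import torch.nn.functional as F

def lce_loss(scores: torch.Tensor) -> torch.Tensor:
    """Contrastive loss over a localized group of candidates.

    scores: (batch, group_size) reranker logits, where column 0 scores the
    relevant passage and the remaining columns score negatives sampled from
    the target retriever's top results (the "localized" part).
    """
    # The positive sits at index 0 of every group, so the target class is 0.
    targets = torch.zeros(scores.size(0), dtype=torch.long, device=scores.device)
    # Softmax cross-entropy over the group: -log softmax(scores)[:, 0]
    return F.cross_entropy(scores, targets)

# Example: a batch of 2 queries, each with 1 positive + 7 localized negatives.
scores = torch.randn(2, 8)
loss = lce_loss(scores)
```

Under these assumptions, the design choice that distinguishes LCE from generic contrastive training is where the negatives come from: sampling them from the same retriever that feeds the reranker at inference time aligns the training distribution with the candidates the reranker will actually see.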

Related research

05/21/2022
HLATR: Enhance Multi-stage Text Retrieval with Hybrid List Aware Transformer Reranking
Deep pre-trained language models (e.g., BERT) are effective at large-scal...

08/15/2022
Continuous Active Learning Using Pretrained Transformers
Pre-trained and fine-tuned transformer models like BERT and T5 have impr...

09/18/2023
Image-Text Pre-Training for Logo Recognition
Open-set logo recognition is commonly solved by first detecting possible...

04/24/2021
Learning Passage Impacts for Inverted Indexes
Neural information retrieval systems typically use a cascading pipeline,...

09/08/2019
Transfer Learning Robustness in Multi-Class Categorization by Fine-Tuning Pre-Trained Contextualized Language Models
This study compares the effectiveness and robustness of multi-class cate...

08/16/2022
CorpusBrain: Pre-train a Generative Retrieval Model for Knowledge-Intensive Language Tasks
Knowledge-intensive language tasks (KILT) usually require a large body o...

01/23/2023
Injecting the BM25 Score as Text Improves BERT-Based Re-rankers
In this paper we propose a novel approach for combining first-stage lexi...
