Continuous Active Learning Using Pretrained Transformers

08/15/2022
by Nima Sadri, et al.

Pre-trained and fine-tuned transformer models like BERT and T5 have improved the state of the art in ad-hoc retrieval and question-answering, but not as yet in high-recall information retrieval, where the objective is to retrieve substantially all relevant documents. We investigate whether the use of transformer-based models for reranking and/or featurization can improve the Baseline Model Implementation of the TREC Total Recall Track, which represents the current state of the art for high-recall information retrieval. We also introduce CALBERT, a model that can be used to continuously fine-tune a BERT-based model based on relevance feedback.
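The abstract does not spell out how CALBERT's continuous fine-tuning is implemented, but the general continuous active learning (CAL) recipe it builds on can be sketched: score the unjudged pool with the current model, obtain relevance judgments for the top-scoring batch, fine-tune on that feedback, and repeat. Below is a minimal illustrative sketch in Python using a HuggingFace BERT cross-encoder; the checkpoint name, batch size, and helper names (score, finetune, cal_loop, judge) are assumptions for illustration, not details taken from the paper.

# Minimal illustrative sketch of a continuous active learning (CAL) loop in
# which a BERT-based cross-encoder is incrementally fine-tuned on relevance
# feedback and re-used to score the remaining document pool.
# All identifiers below (MODEL, BATCH, score, finetune, cal_loop, judge) are
# hypothetical; they are not taken from the paper or from CALBERT itself.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

MODEL = "bert-base-uncased"   # assumed base checkpoint
BATCH = 10                    # documents judged per CAL iteration

tokenizer = AutoTokenizer.from_pretrained(MODEL)
model = AutoModelForSequenceClassification.from_pretrained(MODEL, num_labels=2)
optimizer = torch.optim.AdamW(model.parameters(), lr=2e-5)

def encode(query, docs):
    # Encode (query, document) pairs for the cross-encoder.
    return tokenizer([query] * len(docs), docs, truncation=True,
                     padding=True, return_tensors="pt")

def score(query, docs):
    # Relevance probability P(relevant) for each document under the current model.
    model.eval()
    with torch.no_grad():
        return model(**encode(query, docs)).logits.softmax(-1)[:, 1]

def finetune(query, docs, labels):
    # One gradient step on the newly judged documents (relevance feedback).
    model.train()
    loss = model(**encode(query, docs), labels=torch.tensor(labels)).loss
    loss.backward()
    optimizer.step()
    optimizer.zero_grad()

def cal_loop(query, pool, judge, budget):
    # Score the pool, judge the top-scoring batch, fine-tune, and repeat
    # until the assessment budget is exhausted or the pool is empty.
    judged, remaining = [], list(pool)
    while remaining and len(judged) < budget:
        order = score(query, remaining).argsort(descending=True).tolist()
        batch = [remaining[i] for i in order[:BATCH]]
        labels = [judge(query, d) for d in batch]   # 1 = relevant, 0 = not
        finetune(query, batch, labels)
        judged.extend(zip(batch, labels))
        remaining = [d for d in remaining if d not in batch]
    return judged

In the paper's framing, the same loop could instead use the transformer only for featurization, feeding its representations to the Baseline Model Implementation's classifier rather than using it as the reranker itself.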


Related research

03/23/2018  Evaluating Sentence-Level Relevance Feedback for High-Recall Information Retrieval
This study uses a novel simulation framework to evaluate whether the tim...

05/03/2022  Predicting Issue Types with seBERT
Pre-trained transformer models are the current state-of-the-art for natu...

01/21/2021  Rethink Training of BERT Rerankers in Multi-Stage Retrieval Pipeline
Pre-trained deep language models (LM) have advanced the state-of-the-art...

04/05/2022  How Different are Pre-trained Transformers for Text Ranking?
In recent years, large pre-trained transformers have led to substantial ...

02/14/2022  DS4DH at TREC Health Misinformation 2021: Multi-Dimensional Ranking Models with Transfer Learning and Rank Fusion
This paper describes the work of the Data Science for Digital Health (DS...

02/14/2023  Enhancing Model Performance in Multilingual Information Retrieval with Comprehensive Data Engineering Techniques
In this paper, we present our solution to the Multilingual Information R...

12/17/2020  A White Box Analysis of ColBERT
Transformer-based models are nowadays state-of-the-art in ad-hoc Informa...
