Contrastive Fine-tuning Improves Robustness for Neural Rankers

05/27/2021
by Xiaofei Ma, et al.

The performance of state-of-the-art neural rankers can deteriorate substantially when exposed to noisy inputs or applied to a new domain. In this paper, we present a novel method for fine-tuning neural rankers that significantly improves their robustness to out-of-domain data and query perturbations. Specifically, a contrastive loss that compares data points in the representation space is combined with the standard ranking loss during fine-tuning. We use relevance labels to define similar/dissimilar pairs, which allows the model to learn the underlying matching semantics across different query-document pairs and leads to improved robustness. In experiments on four passage ranking datasets, the proposed contrastive fine-tuning method improves robustness to query reformulations, noise perturbations, and zero-shot transfer for both BERT- and BART-based rankers. Additionally, our experiments show that contrastive fine-tuning outperforms data augmentation for robustifying neural rankers.
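
As a rough illustration of how a contrastive term can be combined with a standard ranking loss, below is a minimal PyTorch sketch. It assumes a pointwise cross-entropy ranking loss and an in-batch supervised contrastive loss where query-document pairs sharing a relevance label are treated as positives; the function names, the weighting factor alpha, and the temperature value are illustrative assumptions, not details taken from the paper.

import torch
import torch.nn.functional as F

def supervised_contrastive_loss(embeddings, labels, temperature=0.1):
    # Normalize representations and compute pairwise similarities.
    z = F.normalize(embeddings, dim=1)                      # (batch, dim)
    sim = z @ z.T / temperature                             # (batch, batch)
    self_mask = torch.eye(len(z), dtype=torch.bool, device=z.device)
    sim = sim.masked_fill(self_mask, float("-inf"))         # exclude self-pairs
    # Positives: in-batch examples that share the same relevance label.
    pos_mask = (labels.unsqueeze(0) == labels.unsqueeze(1)) & ~self_mask
    log_prob = sim - torch.logsumexp(sim, dim=1, keepdim=True)
    # Average log-probability of positives per anchor; skip anchors
    # that have no positive in the batch.
    pos_counts = pos_mask.sum(dim=1).clamp(min=1)
    per_anchor = -log_prob.masked_fill(~pos_mask, 0.0).sum(dim=1) / pos_counts
    return per_anchor[pos_mask.any(dim=1)].mean()

def combined_loss(scores, embeddings, labels, alpha=0.5):
    # Standard pointwise ranking loss plus the weighted contrastive term;
    # alpha is a hypothetical trade-off hyperparameter for illustration.
    ranking = F.binary_cross_entropy_with_logits(scores, labels.float())
    contrastive = supervised_contrastive_loss(embeddings, labels)
    return ranking + alpha * contrastive

In a training loop, scores would be the ranker's relevance logits for each query-document pair and embeddings the corresponding pooled representations (e.g., the [CLS] vector for a BERT-based ranker); alpha trades off the two objectives. The pointwise ranking loss and the pooling choice are assumptions for this sketch.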


Related research

07/07/2022 · Supervised Contrastive Learning Approach for Contextual Ranking
Contextual ranking models have delivered impressive performance improvem...

08/06/2023 · Improving Domain-Specific Retrieval by NLI Fine-Tuning
The aim of this article is to investigate the fine-tuning potential of n...

08/20/2021 · Contrastive Representations for Label Noise Require Fine-Tuning
In this paper we show that the combination of a Contrastive representati...

09/21/2023 · Audio Contrastive based Fine-tuning
Audio classification plays a crucial role in speech and sound processing...

10/28/2021 · Semi-Siamese Bi-encoder Neural Ranking Model Using Lightweight Fine-Tuning
A BERT-based Neural Ranking Model (NRM) can be either a cross-encoder or...

11/08/2022 · ConsPrompt: Easily Exploiting Contrastive Samples for Few-shot Prompt Learning
Prompt learning has recently become an effective linguistic tool to moti...

04/19/2021 · A Framework using Contrastive Learning for Classification with Noisy Labels
We propose a framework using contrastive learning as a pre-training task...
