Parameter-Efficient Sparse Retrievers and Rerankers using Adapters

03/23/2023
by Vaishali Pal, et al.

Parameter-efficient transfer learning with adapters has been studied in Natural Language Processing (NLP) as an alternative to full fine-tuning. Adapters are memory-efficient and scale well with downstream tasks: small bottleneck layers added between transformer layers are trained while the large pretrained language model (PLM) is kept frozen. Despite showing promising results in NLP, these methods remain under-explored in Information Retrieval (IR). While previous studies have only experimented with dense retrievers or with cross-lingual retrieval scenarios, in this paper we aim to complete the picture on the use of adapters in IR. First, we study adapters for SPLADE, a sparse retriever, for which adapters not only retain the efficiency and effectiveness otherwise achieved by fine-tuning, but are memory-efficient and orders of magnitude lighter to train. We observe that Adapters-SPLADE optimizes just 2% of the training parameters yet outperforms its fully fine-tuned counterpart and existing parameter-efficient dense IR models on IR benchmark datasets. Second, we address domain adaptation of neural retrieval with adapters on the cross-domain BEIR datasets and TripClick. Finally, we also consider knowledge sharing between rerankers and first-stage rankers. Overall, our study completes the examination of adapters for neural IR.
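For context, the bottleneck adapter described above is typically a small two-layer residual MLP inserted into each transformer layer and trained while the backbone stays frozen. The following is a minimal PyTorch sketch, not the paper's implementation: it assumes a Houlsby-style adapter, a BERT backbone, and SPLADE-max term weighting (log-saturated ReLU over MLM logits, max-pooled over the sequence); the wiring of the adapters into each layer's forward pass is omitted for brevity, and all names here are illustrative.

import torch
import torch.nn as nn
from transformers import AutoModelForMaskedLM

class BottleneckAdapter(nn.Module):
    # Houlsby-style adapter: down-project, nonlinearity, up-project, residual.
    def __init__(self, hidden_size: int, bottleneck: int = 64):
        super().__init__()
        self.down = nn.Linear(hidden_size, bottleneck)
        self.up = nn.Linear(bottleneck, hidden_size)
        self.act = nn.ReLU()

    def forward(self, x):
        # Residual connection keeps the frozen model's representation intact.
        return x + self.up(self.act(self.down(x)))

def splade_rep(mlm_logits, attention_mask):
    # SPLADE-max: one weight per vocabulary term, log-saturated and
    # max-pooled over the valid (unmasked) sequence positions.
    weights = torch.log1p(torch.relu(mlm_logits))
    weights = weights * attention_mask.unsqueeze(-1)
    return weights.max(dim=1).values  # (batch, vocab_size), mostly sparse

model = AutoModelForMaskedLM.from_pretrained("bert-base-uncased")
for p in model.parameters():
    p.requires_grad = False  # the PLM stays frozen; only adapters are trained

# One adapter per transformer layer (illustrative; the actual insertion
# point inside each layer's forward pass is omitted here).
adapters = nn.ModuleList(
    BottleneckAdapter(model.config.hidden_size) for _ in model.bert.encoder.layer
)

trainable = sum(p.numel() for p in adapters.parameters())
total = trainable + sum(p.numel() for p in model.parameters())
print(f"trainable fraction: {trainable / total:.2%}")  # roughly 1-2%, matching the figure above

With this setup, only the adapter weights receive gradients during training, which is what makes the approach orders of magnitude lighter to train than full fine-tuning.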


Related research

08/21/2022 - Scattered or Connected? An Optimized Parameter-efficient Tuning Approach for Information Retrieval
Pre-training and fine-tuning have achieved significant advances in the i...

06/06/2021 - On the Effectiveness of Adapter-based Tuning for Pretrained Language Model Adaptation
Adapter-based tuning has recently arisen as an alternative to fine-tunin...

05/25/2022 - Know Where You're Going: Meta-Learning for Parameter-Efficient Fine-tuning
A recent family of techniques, dubbed as lightweight fine-tuning methods...

08/15/2022 - Conv-Adapter: Exploring Parameter Efficient Transfer Learning for ConvNets
While parameter efficient tuning (PET) methods have shown great potentia...

12/16/2021 - Towards Unsupervised Dense Information Retrieval with Contrastive Learning
Information retrieval is an important component in natural language proc...

09/25/2022 - An Empirical Study on Cross-X Transfer for Legal Judgment Prediction
Cross-lingual transfer learning has proven useful in a variety of Natura...

08/29/2023 - Improving Neural Ranking Models with Traditional IR Methods
Neural ranking methods based on large transformer models have recently g...
