Transfer Learning Approaches for Building Cross-Language Dense Retrieval Models

01/20/2022
by Suraj Nair et al.

The advent of transformer-based models such as BERT has led to the rise of neural ranking models. These models have improved the effectiveness of retrieval systems well beyond that of lexical term matching models such as BM25. While monolingual retrieval tasks have benefited from large-scale training collections such as MS MARCO and advances in neural architectures, cross-language retrieval tasks have fallen behind these advancements. This paper introduces ColBERT-X, a generalization of the ColBERT multi-representation dense retrieval model that uses the XLM-RoBERTa (XLM-R) encoder to support cross-language information retrieval (CLIR). ColBERT-X can be trained in two ways. In zero-shot training, the system is trained on the English MS MARCO collection, relying on the XLM-R encoder for cross-language mappings. In translate-train, the system is trained on the MS MARCO English queries coupled with machine translations of the associated MS MARCO passages. Results on ad hoc document ranking tasks in several languages demonstrate substantial and statistically significant improvements of these trained dense retrieval models over traditional lexical CLIR baselines.
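
The scoring mechanism ColBERT-X inherits from ColBERT is late interaction: queries and passages are encoded into per-token vectors, and relevance is the sum, over query tokens, of each token's maximum similarity to any passage token ("MaxSim"). The sketch below illustrates that scoring with an off-the-shelf XLM-R encoder. The checkpoint name, the 128-dimensional projection, and the randomly initialized projection layer are illustrative assumptions, not the paper's exact configuration; in the paper this machinery is trained on MS MARCO in either the zero-shot or translate-train regime.

```python
# Minimal sketch of ColBERT-style late interaction ("MaxSim") over an
# XLM-R encoder, the mechanism ColBERT-X generalizes to CLIR.
# Assumptions: the "xlm-roberta-base" checkpoint and an untrained
# 128-dim projection layer (learned during training in the paper).
import torch
from transformers import AutoModel, AutoTokenizer

MODEL_NAME = "xlm-roberta-base"
tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
encoder = AutoModel.from_pretrained(MODEL_NAME)
project = torch.nn.Linear(encoder.config.hidden_size, 128)

def embed(texts):
    """Encode texts into L2-normalized per-token vectors plus a padding mask."""
    batch = tokenizer(texts, padding=True, truncation=True, return_tensors="pt")
    with torch.no_grad():
        hidden = encoder(**batch).last_hidden_state       # [B, T, H]
        vecs = torch.nn.functional.normalize(project(hidden), dim=-1)
    return vecs, batch["attention_mask"]

def maxsim_score(query: str, passage: str) -> float:
    """Sum over query tokens of the max similarity to any passage token."""
    q, q_mask = embed([query])
    d, d_mask = embed([passage])
    sim = q @ d.transpose(1, 2)                           # [1, Tq, Td]
    sim = sim.masked_fill(d_mask[:, None, :] == 0, -1e4)  # ignore passage padding
    best = sim.max(dim=-1).values                         # best match per query token
    return (best * q_mask).sum().item()

# Cross-language use: an English query scored against a non-English passage.
print(maxsim_score("transfer learning", "el aprendizaje por transferencia"))
```

Because XLM-R is pre-trained on roughly one hundred languages, the same scoring function applies unchanged when query and passage are in different languages, which is precisely what the zero-shot setting relies on.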

Related research

12/20/2022 · Parameter-efficient Zero-shot Transfer for Cross-Language Dense Retrieval with Adapters
A popular approach to creating a zero-shot cross-language retrieval mode...

01/02/2022 · Establishing Strong Baselines for TripClick Health Retrieval
We present strong Transformer-based re-ranking and dense retrieval basel...

01/08/2023 · InPars-Light: Cost-Effective Unsupervised Training of Efficient Rankers
We carried out a reproducibility study of the InPars recipe for unsupervised...

06/29/2022 · How Train-Test Leakage Affects Zero-shot Retrieval
Neural retrieval models are often trained on (subsets of) the millions o...

06/05/2023 · Benchmarking Middle-Trained Language Models for Neural Search
Middle training methods aim to bridge the gap between the Masked Languag...

04/29/2023 · Synthetic Cross-language Information Retrieval Training Data
A key stumbling block for neural cross-language information retrieval (C...

04/05/2022 · How Different are Pre-trained Transformers for Text Ranking?
In recent years, large pre-trained transformers have led to substantial ...
