A Static Pruning Study on Sparse Neural Retrievers

04/25/2023
by Carlos Lassance et al.

Sparse neural retrievers, such as DeepImpact, uniCOIL and SPLADE, have been introduced recently as an efficient and effective way to perform retrieval with inverted indexes. They aim to learn term importance and, in some cases, document expansions, to provide a more effective document ranking compared to traditional bag-of-words retrieval models such as BM25. However, these sparse neural retrievers have been shown to increase the computational costs and latency of query processing compared to their classical counterparts. To mitigate this, we apply a well-known family of techniques for boosting the efficiency of query processing over inverted indexes: static pruning. We experiment with three static pruning strategies, namely document-centric, term-centric and agnostic pruning, and we show, on diverse datasets, that these techniques remain effective with sparse neural retrievers. In particular, static pruning achieves 2× speedup with negligible effectiveness loss (≤ 2% drop) and, depending on the use case, even 4× speedup with minimal impact on effectiveness (≤ 8% drop). Moreover, we show that neural rerankers are robust to candidates from statically pruned indexes.
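The three pruning strategies named in the abstract can be illustrated on a toy impact-scored inverted index. The sketch below is a hypothetical simplification, not the paper's implementation: term-centric pruning keeps the top-scoring fraction of each posting list, document-centric pruning keeps each document's highest-impact terms, and agnostic pruning drops all postings below a global score threshold. All function names and thresholds are illustrative.

```python
# index: term -> list of (doc_id, impact_score) postings.
# Illustrative sketch of static pruning; not the authors' code.

def term_centric_prune(index, keep_frac=0.5):
    """Keep only the top-scoring fraction of postings in each term's list."""
    pruned = {}
    for term, postings in index.items():
        k = max(1, int(len(postings) * keep_frac))
        pruned[term] = sorted(postings, key=lambda p: p[1], reverse=True)[:k]
    return pruned

def document_centric_prune(index, top_terms=2):
    """Keep only each document's highest-impact terms, then rebuild the index."""
    docs = {}
    for term, postings in index.items():
        for doc_id, score in postings:
            docs.setdefault(doc_id, []).append((term, score))
    pruned = {}
    for doc_id, terms in docs.items():
        for term, score in sorted(terms, key=lambda t: t[1], reverse=True)[:top_terms]:
            pruned.setdefault(term, []).append((doc_id, score))
    return pruned

def agnostic_prune(index, threshold):
    """Drop every posting whose impact score falls below a global threshold."""
    return {t: [p for p in ps if p[1] >= threshold] for t, ps in index.items()}

# Tiny example index with learned impact scores.
index = {
    "neural": [(1, 3.2), (2, 0.4), (3, 1.8)],
    "sparse": [(1, 0.9), (3, 2.5)],
}
```

Because pruning happens once at index time (hence "static"), query processing runs unchanged over the smaller index, which is where the reported speedups come from.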


research
08/23/2021

Query Embedding Pruning for Dense Retrieval

Recent advances in dense retrieval techniques have offered the promise o...
research
04/29/2020

Expansion via Prediction of Importance with Contextualization

The identification of relevance with little textual context is a primary...
research
04/24/2022

Faster Learned Sparse Retrieval with Guided Traversal

Neural information retrieval architectures based on transformers such as...
research
03/23/2023

A Unified Framework for Learned Sparse Retrieval

Learned sparse retrieval (LSR) is a family of first-stage retrieval meth...
research
05/02/2023

Optimizing Guided Traversal for Fast Learned Sparse Retrieval

Recent studies show that BM25-driven dynamic index skipping can greatly ...
research
08/01/2023

On the Effects of Regional Spelling Conventions in Retrieval Models

One advantage of neural ranking models is that they are meant to general...
