Densifying Sparse Representations for Passage Retrieval by Representational Slicing

12/09/2021
by Sheng-Chieh Lin, et al.

Learned sparse and dense representations capture different successful approaches to text retrieval, and the fusion of their results has proven to be more effective and robust. Prior work combines dense and sparse retrievers by fusing their model scores. As an alternative, this paper presents a simple approach to densifying sparse representations for text retrieval that does not involve any training. Our densified sparse representations (DSRs) are interpretable and can be easily combined with dense representations for end-to-end retrieval. We demonstrate that our approach can jointly learn sparse and dense representations within a single model and then combine them for dense retrieval. Experimental results suggest that combining our DSRs and dense representations yields a balanced tradeoff between effectiveness and efficiency.
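The abstract describes densification only at a high level. As a rough illustration of what "representational slicing" could look like in practice, the sketch below partitions a vocabulary-sized sparse vector into contiguous slices and max-pools each slice, keeping both the pooled weight and the surviving term's position so that every dense dimension still points back to a concrete vocabulary entry; two densified vectors are then scored with a gate that only counts slices where the query and the document keep the same term. The function names, the slice count, and the gated scoring rule are illustrative assumptions made for this sketch, not the paper's exact formulation.

```python
import numpy as np

def densify_by_slicing(sparse_vec: np.ndarray, num_slices: int):
    """Densify a vocabulary-sized sparse vector by slicing it into
    contiguous blocks and max-pooling each block.

    Returns (values, indices): for each slice, the largest term weight
    and the within-slice position of that term, which keeps the
    representation interpretable.
    """
    dim = sparse_vec.shape[0]
    slice_size = int(np.ceil(dim / num_slices))
    # Pad so the vector splits evenly into `num_slices` blocks.
    padded = np.zeros(num_slices * slice_size, dtype=sparse_vec.dtype)
    padded[:dim] = sparse_vec
    blocks = padded.reshape(num_slices, slice_size)
    values = blocks.max(axis=1)       # pooled term weight per slice
    indices = blocks.argmax(axis=1)   # which term survived in each slice
    return values, indices

def dsr_score(q_vals, q_idx, d_vals, d_idx):
    """Approximate the sparse inner product: a slice contributes only
    when query and document keep the same term within that slice."""
    match = (q_idx == d_idx)
    return float(np.sum(q_vals * d_vals * match))

# Toy usage with a hypothetical 12-term "vocabulary" and 4 slices.
q = np.array([0, 2.0, 0, 0, 0, 1.5, 0, 0, 0, 0, 0.5, 0])
d = np.array([0, 1.0, 0, 0, 0, 0, 0.8, 0, 0, 0, 1.2, 0])
qv, qi = densify_by_slicing(q, num_slices=4)
dv, di = densify_by_slicing(d, num_slices=4)
print(dsr_score(qv, qi, dv, di))  # 2.6 on this toy example
```

On this toy example the gated score happens to equal the exact sparse inner product; in general, with non-negative term weights, it underestimates the exact product and approaches it as the number of slices grows.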


Related research

06/20/2022 - A Dense Representation Framework for Lexical and Semantic Matching
Lexical and semantic matching capture different successful approaches to...

12/28/2020 - The Curse of Dense Low-Dimensional Information Retrieval for Large Index Sizes
Information Retrieval using dense low-dimensional representations recent...

11/07/2019 - Transformation of Dense and Sparse Text Representations
Sparsity is regarded as a desirable property of representations, especia...

04/12/2020 - Minimizing FLOPs to Learn Efficient Sparse Representations
Deep representation learning has become one of the most widely adopted a...

04/12/2021 - A Replication Study of Dense Passage Retriever
Text retrieval using learned dense representations has recently emerged ...

03/27/2019 - How Can We Be So Dense? The Benefits of Using Highly Sparse Representations
Most artificial networks today rely on dense representations, whereas bi...

12/17/2021 - Sparsifying Sparse Representations for Passage Retrieval by Top-k Masking
Sparse lexical representation learning has demonstrated much progress in...
