Transformation of Dense and Sparse Text Representations

11/07/2019
by Wenpeng Hu et al.

Sparsity is regarded as a desirable property of representations, especially for interpretability. However, its use has been limited by the gap between sparse and dense representations: most NLP progress in recent years has been built on dense representations, so the desirable properties of sparsity cannot be leveraged. Inspired by the Fourier transformation, in this paper we propose a novel Semantic Transformation method that bridges the dense and sparse spaces, enabling NLP research to shift from the dense space to the sparse space or to use both spaces jointly. The key idea of the proposed approach is a Forward Transformation that maps dense representations to sparse ones. Useful operations can then be performed in the sparse space, and the sparse representations can be used directly for downstream tasks such as text classification and natural language inference. A Backward Transformation can subsequently map the processed sparse representations back to dense ones. Experiments on classification tasks and a natural language inference task show that the proposed Semantic Transformation is effective.
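The abstract does not spell out the transformation functions themselves. The sketch below is a minimal illustration of the dense-to-sparse-to-dense round trip it describes, assuming the Forward Transformation is a linear projection into a higher-dimensional space followed by top-k sparsification, and the Backward Transformation is a linear map back to the dense space. All names, dimensions, and the top-k choice are illustrative assumptions, not the authors' implementation; in the paper's setting the parameters would be learned jointly with the downstream task.

```python
import numpy as np

rng = np.random.default_rng(0)

DENSE_DIM = 300    # hypothetical dense embedding size
SPARSE_DIM = 2000  # hypothetical (larger) sparse code size

# Hypothetical parameters; in practice these would be trained
# end-to-end with the downstream task rather than sampled randomly.
W_fwd = rng.normal(scale=0.05, size=(DENSE_DIM, SPARSE_DIM))
W_bwd = rng.normal(scale=0.05, size=(SPARSE_DIM, DENSE_DIM))

def forward_transform(dense, k=50):
    """Dense -> sparse: project up, then keep only the k
    largest-magnitude activations (one common sparsification
    choice; the paper's exact mechanism may differ)."""
    z = dense @ W_fwd
    sparse = z.copy()
    # Zero out everything except the k largest-magnitude entries.
    sparse[np.argsort(np.abs(z))[:-k]] = 0.0
    return sparse

def backward_transform(sparse):
    """Sparse -> dense: a linear map back to the dense space."""
    return sparse @ W_bwd

dense = rng.normal(size=DENSE_DIM)     # e.g., a sentence embedding
sparse = forward_transform(dense)      # sparse code; usable directly
print(f"nonzero entries: {np.count_nonzero(sparse)} / {SPARSE_DIM}")
recon = backward_transform(sparse)     # back to the dense space
```

The design intuition behind this kind of scheme is that a higher-dimensional code with only a few active units makes individual dimensions easier to inspect, which matches the interpretability motivation for sparsity stated in the abstract.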


