On the Robustness of Projection Neural Networks For Efficient Text Representation: An Empirical Study

08/14/2019
by   Chinnadhurai Sankar, et al.

Recently, there has been strong interest in developing natural language applications that run on personal devices such as mobile phones, watches, and IoT devices, with the goals of preserving user privacy and keeping memory footprints low. Advances in Locality-Sensitive Hashing (LSH)-based projection networks have demonstrated state-of-the-art performance by computing text representations on the fly instead of relying on embedding lookup tables. However, previous work has not investigated "What makes projection neural networks effective at capturing compact representations for text classification?" and "Are these projection models resistant to perturbations and misspellings in input text?" In this paper, we analyze and answer these questions through perturbation analyses and experiments on multiple dialog act prediction tasks. Our results show that the projections are resistant to perturbations and misspellings compared to widely used recurrent architectures that rely on word embeddings. On the ATIS intent prediction task, when evaluated with perturbed input data, we observe that the performance of recurrent models that use word embeddings drops significantly by more than 30% relative to projection networks, showing that LSH-based projection representations are robust and consistently lead to high-quality performance.
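
For intuition, here is a minimal sketch of the kind of on-the-fly, embedding-free representation described above: character n-grams are hashed into a sparse feature vector and mapped to a binary code by fixed random projections (sign-of-projection LSH). The feature extractor, constants, and names (char_ngrams, hashed_features, lsh_projection, NUM_BITS) are illustrative assumptions, not the exact model studied in the paper.

```python
# Minimal sketch of an LSH-style projection text representation (assumption:
# hashing-trick character n-gram features + random signed hyperplanes).
import hashlib

import numpy as np

NUM_FEATURES = 2 ** 14   # size of the hashed n-gram feature space (assumption)
NUM_BITS = 256           # dimensionality of the binary projection (assumption)

# Fixed random hyperplanes shared across all inputs; seeding keeps the
# projection deterministic, so no per-word lookup table has to be stored.
rng = np.random.default_rng(0)
PLANES = rng.standard_normal((NUM_BITS, NUM_FEATURES))


def char_ngrams(text, n=3):
    """Yield character n-grams of the (padded, lowercased) input string."""
    text = f" {text.lower()} "
    for i in range(max(len(text) - n + 1, 1)):
        yield text[i:i + n]


def hashed_features(text):
    """Map character n-grams to a sparse count vector via the hashing trick."""
    feats = np.zeros(NUM_FEATURES)
    for gram in char_ngrams(text):
        idx = int(hashlib.md5(gram.encode()).hexdigest(), 16) % NUM_FEATURES
        feats[idx] += 1.0
    return feats


def lsh_projection(text):
    """Compute an on-the-fly binary representation: sign of random projections."""
    return (PLANES @ hashed_features(text) > 0).astype(np.int8)


if __name__ == "__main__":
    clean = lsh_projection("book a flight from boston to denver")
    noisy = lsh_projection("book a flght from bostn to denvr")  # misspelled
    # Small character perturbations change only a fraction of the hashed
    # n-grams, so most projection bits agree between the two inputs.
    print("bit agreement:", (clean == noisy).mean())
```

Because misspellings alter only a few character n-grams, the clean and noisy inputs share most of their hashed features and therefore most of their projection bits, which is the intuition behind the robustness results reported in the abstract.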


Related research

04/13/2020 - ProFormer: Towards On-Device LSH Projection Based Transformers
At the heart of text based neural models lay word representations, which...

06/04/2019 - Transferable Neural Projection Representations
Neural word representations are at the core of many state-of-the-art nat...

03/10/2020 - Text classification with word embedding regularization and soft similarity measure
Since the seminal work of Mikolov et al., word embeddings have become th...

08/02/2017 - ProjectionNet: Learning Efficient On-Device Deep Networks Using Neural Projections
Deep neural networks have become ubiquitous for applications related to ...

02/19/2022 - Data-Driven Mitigation of Adversarial Text Perturbation
Social networks have become an indispensable part of our lives, with bil...

06/21/2016 - An empirical study on large scale text classification with skip-gram embeddings
We investigate the integration of word embeddings as classification feat...

07/27/2019 - Modeling Winner-Take-All Competition in Sparse Binary Projections
Inspired by the advances in biological science, the study of sparse bina...
