Compressing Word Embeddings via Deep Compositional Code Learning

11/03/2017
by Raphael Shu, et al.

Natural language processing (NLP) models often require a massive number of parameters for word embeddings, resulting in a large storage or memory footprint. Deploying neural NLP models to mobile devices requires compressing the word embeddings without any significant sacrifices in performance. For this purpose, we propose to construct the embeddings with few basis vectors. For each word, the composition of basis vectors is determined by a hash code. To maximize the compression rate, we adopt the multi-codebook quantization approach instead of a binary coding scheme. Each code is composed of multiple discrete numbers, such as (3, 2, 1, 8), where the value of each component is limited to a fixed range. We propose to directly learn the discrete codes in an end-to-end neural network by applying the Gumbel-softmax trick. Experiments show the compression rate achieves 98% in a sentiment analysis task and 94% ~ 99% in machine translation tasks without performance loss. In both tasks, the proposed method can improve the model performance by slightly lowering the compression rate. Compared to other approaches such as character-level segmentation, the proposed method is language-independent and does not require modifications to the network architecture.
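The abstract describes embeddings composed from a small shared set of basis vectors, with each word's selection determined by learned discrete codes and the Gumbel-softmax trick making that selection differentiable. Below is a minimal PyTorch sketch of this idea, not the authors' exact architecture: the `CodeLearner` module, its layer sizes, and the reconstruction loss are illustrative assumptions.

```python
# Minimal sketch (assumptions noted): learn M discrete codes per word, each
# indexing one of K basis vectors in its own codebook; the embedding is the
# sum of the selected basis vectors. Gumbel-softmax relaxes the discrete
# choice so everything trains end-to-end against the original embeddings.
import torch
import torch.nn as nn
import torch.nn.functional as F

class CodeLearner(nn.Module):
    def __init__(self, emb_dim, M=32, K=16, hidden=256):
        super().__init__()
        # Encoder: original embedding -> M*K code logits (sizes are illustrative)
        self.encode = nn.Sequential(
            nn.Linear(emb_dim, hidden), nn.Tanh(),
            nn.Linear(hidden, M * K),
        )
        # M codebooks, each holding K basis vectors of dimension emb_dim
        self.codebooks = nn.Parameter(torch.randn(M, K, emb_dim) * 0.01)
        self.M, self.K = M, K

    def forward(self, emb, tau=1.0):
        logits = self.encode(emb).view(-1, self.M, self.K)
        # Relaxed one-hot selection from each codebook (straight-through estimator)
        onehot = F.gumbel_softmax(logits, tau=tau, hard=True, dim=-1)
        # Compose the embedding as the sum of the selected basis vectors
        recon = torch.einsum('bmk,mkd->bd', onehot, self.codebooks)
        return recon, onehot.argmax(dim=-1)  # reconstruction and discrete codes

# Usage sketch: train to reconstruct a pretrained embedding matrix, e.g.
#   recon, codes = model(batch_of_embeddings)
#   loss = F.mse_loss(recon, batch_of_embeddings)
# then store only the per-word codes plus the M x K shared codebooks.
```

The compression comes from replacing each dense vector with its codes: with, e.g., M = 32 codebooks of K = 16 entries, a word needs 32 × log2(16) = 128 bits, versus 16,384 bits for a 512-dimensional float32 vector, plus the small shared codebooks.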

Related research

05/05/2021
Evaluation Of Word Embeddings From Large-Scale French Web Content
Distributed word representations are popularly used in many tasks in nat...

06/15/2021
Direction is what you need: Improving Word Embedding Compression in Large Language Models
The adoption of Transformer-based models in natural language processing ...

04/25/2020
All Word Embeddings from One Embedding
In neural network-based models for natural language processing (NLP), th...

01/14/2020
Balancing the composition of word embeddings across heterogenous data sets
Word embeddings capture semantic relationships based on contextual infor...

06/13/2017
Getting deep recommenders fit: Bloom embeddings for sparse binary input/output networks
Recommendation algorithms that incorporate techniques from deep learning...

03/24/2018
Near-lossless Binarization of Word Embeddings
Is it possible to learn binary word embeddings of arbitrary size from th...

09/03/2019
On the Downstream Performance of Compressed Word Embeddings
Compressing word embeddings is important for deploying NLP models in mem...
