Word2Bits - Quantized Word Vectors

03/15/2018
by   Maximilian Lam, et al.

Word vectors require significant amounts of memory and storage, posing issues for resource-limited devices like mobile phones and GPUs. We show that high-quality quantized word vectors using 1-2 bits per parameter can be learned by introducing a quantization function into Word2Vec. We furthermore show that training with the quantization function acts as a regularizer. We train word vectors on English Wikipedia (2017) and evaluate them on standard word similarity and analogy tasks and on question answering (SQuAD). Our quantized word vectors not only take 8-16x less space than full-precision (32-bit) word vectors but also outperform them on word similarity tasks and question answering.
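To make the idea concrete, here is a minimal sketch of a sign-based 1-bit quantization function of the kind the abstract describes. The function name, the `scale` constant, and the use of NumPy are illustrative assumptions, not the paper's exact implementation; the general scheme is that each parameter is mapped to one of two values during the forward pass while full-precision values accumulate gradient updates.

```python
import numpy as np

def quantize_1bit(x, scale=1.0 / 3):
    # Map each parameter to one of two values based on its sign.
    # `scale` is a hypothetical constant; any small positive value
    # plays the same role of bounding the quantized magnitude.
    return np.where(x >= 0, scale, -scale)

# Illustrative use: quantize a toy word vector. At training time
# the quantized values would feed the forward pass while the
# underlying full-precision vector receives the gradient updates.
v = np.array([0.42, -0.17, 0.08, -0.90])
print(quantize_1bit(v))  # each entry becomes +scale or -scale
```

Storing only the sign bit (plus one shared scale per table) is what yields the large memory savings relative to 32-bit floats; a 2-bit variant would map each parameter to one of four values instead.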

