word2ket: Space-efficient Word Embeddings inspired by Quantum Entanglement

11/12/2019
by Aliakbar Panahi, et al.

Deep learning natural language processing models often use vector word embeddings, such as word2vec or GloVe, to represent words. A discrete sequence of words is much easier to integrate with downstream neural layers when it is represented as a sequence of continuous vectors. Semantic relationships between words, learned from a text corpus, can also be encoded in the relative configurations of the embedding vectors. However, storing and accessing embedding vectors for all words in a dictionary requires a large amount of space, and may strain systems with limited GPU memory. Here, we use approaches inspired by quantum computing to propose two related methods, word2ket and word2ketXS, for storing the word embedding matrix during training and inference in a highly space-efficient way. Our approach achieves a hundred-fold or greater reduction in the space required to store the embeddings, with almost no relative drop in accuracy on practical natural language processing tasks.
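The abstract does not spell out the construction, but the quantum-entanglement analogy points to tensor-product (Kronecker) factorizations of embedding vectors. The sketch below illustrates that general idea only: a long embedding vector is represented as a sum of Kronecker products of small factor vectors, so far fewer parameters are stored than the full dimension. The variable names, dimensions, and rank are illustrative assumptions, not the paper's actual implementation or API.

```python
import numpy as np

# Illustrative sketch (assumed names/dimensions, not the paper's code):
# approximate a d1*d2-dimensional embedding vector as a sum of r Kronecker
# products of two smaller vectors, storing r*(d1 + d2) values instead of d1*d2.

d1, d2, r = 32, 32, 4                  # full embedding dimension is d1 * d2 = 1024
rng = np.random.default_rng(0)

# Small factor vectors; in a real model these would be trainable parameters.
a = rng.standard_normal((r, d1))
b = rng.standard_normal((r, d2))

def reconstruct(a, b):
    """Rebuild the full embedding vector as sum_k a[k] (x) b[k]."""
    return sum(np.kron(a[k], b[k]) for k in range(a.shape[0]))

v = reconstruct(a, b)
print(v.shape)                                        # (1024,)
print(r * (d1 + d2), "stored values vs", d1 * d2)     # 256 stored values vs 1024
```

Higher-order variants would take Kronecker products of more than two factors per term, shrinking storage further at the cost of extra reconstruction work at lookup time.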

Related research

12/18/2017  word representation or word embedding in Persian text
Text processing is one of the sub-branches of natural language processin...

05/29/2018  Quantum-inspired Complex Word Embedding
A challenging task for word embeddings is to capture the emergent meanin...

12/04/2019  Natural Alpha Embeddings
Learning an embedding for a large collection of items is a popular appro...

06/17/2020  On the Learnability of Concepts: With Applications to Comparing Word Embedding Algorithms
Word Embeddings are used widely in multiple Natural Language Processing ...

04/25/2020  All Word Embeddings from One Embedding
In neural network-based models for natural language processing (NLP), th...

09/08/2019  Distributed Word2Vec using Graph Analytics Frameworks
Word embeddings capture semantic and syntactic similarities of words, re...

06/23/2019  Smaller Text Classifiers with Discriminative Cluster Embeddings
Word embedding parameters often dominate overall model sizes in neural m...
