
On Extending NLP Techniques from the Categorical to the Latent Space: KL Divergence, Zipf's Law, and Similarity Search

by Adam Hare, et al.

Despite the recent successes of deep learning in natural language processing (NLP), there remains widespread usage of and demand for techniques that do not rely on machine learning. These techniques offer interpretability and low cost compared to frequently opaque and expensive machine learning models. Although they may not be as performant in all cases, they are often sufficient for common and relatively simple problems. In this paper, we aim to modernize these older methods while retaining their advantages by extending approaches from categorical or bag-of-words representations to word embedding representations in the latent space. First, we show that entropy and Kullback-Leibler divergence can be efficiently estimated using word embeddings, and we use this estimation to compare text across several categories. Next, we recast Zipf's law, the heavy-tailed distribution frequently observed in the categorical space, in the latent space. Finally, we improve on the Jaccard similarity measure for sentence suggestion by introducing a new method of identifying similar sentences based on the set cover problem. We compare the performance of this algorithm against several baselines, including Word Mover's Distance and the Levenshtein distance.
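The abstract does not specify how KL divergence is estimated over embeddings. A minimal sketch of one standard nonparametric approach, the k-nearest-neighbor divergence estimator applied to two sets of word vectors (the function name and the choice of estimator are assumptions, not the paper's stated method):

```python
import numpy as np
from scipy.spatial import cKDTree

def knn_kl_divergence(X, Y, k=1):
    """k-NN estimate of KL(P || Q) from samples X ~ P and Y ~ Q.

    X, Y: arrays of shape (n, d) and (m, d), e.g. word embeddings drawn
    from two text collections. This is a sketch of a standard estimator,
    not necessarily the one used in the paper.
    """
    n, d = X.shape
    m, _ = Y.shape
    # Distance from each x_i to its k-th nearest neighbor in X,
    # excluding the point itself (hence k + 1 neighbors queried).
    rho = cKDTree(X).query(X, k=k + 1)[0][:, -1]
    # Distance from each x_i to its k-th nearest neighbor in Y.
    nu = cKDTree(Y).query(X, k=k)[0]
    if k > 1:
        nu = nu[:, -1]
    # Standard k-NN divergence formula.
    return d * np.mean(np.log(nu / rho)) + np.log(m / (n - 1))
```

For identical distributions the estimate should be near zero, and it should grow as the two sample clouds separate; in the embedding setting, X and Y would be the vectors of words appearing in two corpora being compared.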

