
The emergent algebraic structure of RNNs and embeddings in NLP

03/07/2018
by Sean A. Cantrell, et al.

We examine the algebraic and geometric properties of a uni-directional GRU and word embeddings trained end-to-end on a text classification task. A hyperparameter search over word embedding dimension, GRU hidden dimension, and a linear combination of the GRU outputs is performed. We conclude that words naturally embed themselves in a Lie group and that RNNs form a nonlinear representation of the group. Appealing to these results, we propose a novel class of recurrent-like neural networks and a word embedding scheme.
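
The model described above is straightforward to sketch. Below is a minimal PyTorch sketch, not the authors' implementation: the class name GRUTextClassifier, the vocabulary size, the dimension grid, and the sequence length are all illustrative assumptions. It wires an embedding layer into a uni-directional GRU and pools the per-step outputs with a trainable linear combination before a classification head, mirroring the three quantities the hyperparameter search ranges over (embedding dimension, hidden dimension, and the output combination).

    import torch
    import torch.nn as nn

    class GRUTextClassifier(nn.Module):
        """Embedding -> uni-directional GRU -> learned linear combination
        of the per-step GRU outputs -> linear classification head.
        A sketch of the setup described in the abstract, not the
        authors' exact architecture."""

        def __init__(self, vocab_size, embed_dim, hidden_dim, num_classes, seq_len):
            super().__init__()
            self.embedding = nn.Embedding(vocab_size, embed_dim)
            self.gru = nn.GRU(embed_dim, hidden_dim, batch_first=True)
            # Trainable weights that linearly combine the GRU outputs over time.
            self.combination = nn.Parameter(torch.randn(seq_len) / seq_len ** 0.5)
            self.classifier = nn.Linear(hidden_dim, num_classes)

        def forward(self, token_ids):
            # token_ids: (batch, seq_len) integer word indices
            embedded = self.embedding(token_ids)   # (batch, seq_len, embed_dim)
            outputs, _ = self.gru(embedded)        # (batch, seq_len, hidden_dim)
            # Weighted sum of the hidden states across time steps.
            pooled = torch.einsum("bth,t->bh", outputs, self.combination)
            return self.classifier(pooled)         # (batch, num_classes)

    # Toy grid over embedding and hidden dimensions; values are assumptions.
    for embed_dim in (64, 128):
        for hidden_dim in (128, 256):
            model = GRUTextClassifier(vocab_size=10_000, embed_dim=embed_dim,
                                      hidden_dim=hidden_dim, num_classes=2, seq_len=32)
            logits = model(torch.randint(0, 10_000, (4, 32)))
            print(embed_dim, hidden_dim, logits.shape)  # torch.Size([4, 2])

A real run would replace the random token ids with a tokenized corpus and train end-to-end with a cross-entropy loss; the grid loop only demonstrates where the searched hyperparameters enter the model.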


Related research

Deconstructing word embedding algorithms (11/12/2020)
Word embeddings are reliable feature representations of words used to ob...

Abelian Neural Networks (02/24/2021)
We study the problem of modeling a binary operation that satisfies some ...

Transparent, Efficient, and Robust Word Embedding Access with WOMBAT (07/02/2018)
We present WOMBAT, a Python tool which supports NLP practitioners in acc...

Text classification with word embedding regularization and soft similarity measure (03/10/2020)
Since the seminal work of Mikolov et al., word embeddings have become th...

Identity-sensitive Word Embedding through Heterogeneous Networks (11/29/2016)
Most existing word embedding approaches do not distinguish the same word...

DeepTingle (05/09/2017)
DeepTingle is a text prediction and classification system trained on the...

Neural Word Search in Historical Manuscript Collections (12/06/2018)
We address the problem of segmenting and retrieving word images in colle...