
Abelian Neural Networks
We study the problem of modeling a binary operation that satisfies certain algebraic requirements. We first construct a neural network architecture for Abelian group operations and derive a universal approximation property. We then extend it to Abelian semigroup operations using the characterization of associative symmetric polynomials. Both models exploit the analytic invertibility of invertible neural networks. In each case, by repeatedly applying the binary operation, we can represent a function over multiset inputs thanks to the algebraic structure. This multiset architecture naturally generalizes across input sizes, a property not achieved by existing methods. Further, we show that modeling the Abelian group operation itself is useful in a word analogy task: we train our models over fixed word embeddings and demonstrate improved performance over the original word2vec and another naive learning method.
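The core construction can be sketched as follows: an invertible map phi (a learned invertible neural network in the paper; here a fixed analytic stand-in) turns vector addition into a commutative, associative operation f(x, y) = phi^{-1}(phi(x) + phi(y)), and folding it over a multiset gives an order-invariant function that generalizes across input sizes. A minimal sketch, assuming an elementwise cube map in place of the learned network (all names below are illustrative, not from the paper):

```python
import math

def phi(v):
    # Stand-in for an invertible neural network: an elementwise,
    # sign-preserving cube. Any analytically invertible map works here.
    return [math.copysign(abs(x) ** 3, x) for x in v]

def phi_inv(v):
    # Analytic inverse of phi (elementwise cube root).
    return [math.copysign(abs(x) ** (1.0 / 3.0), x) for x in v]

def op(x, y):
    # Abelian group operation f(x, y) = phi^{-1}(phi(x) + phi(y)).
    # Commutativity and associativity are inherited from vector addition.
    return phi_inv([a + b for a, b in zip(phi(x), phi(y))])

def multiset_op(xs):
    # Multiset function obtained by summing in phi-space; equivalent to
    # folding `op`, and independent of input order and multiset size.
    sums = [sum(col) for col in zip(*(phi(x) for x in xs))]
    return phi_inv(sums)
```

Because the group axioms are built into the architecture rather than learned, permutation invariance of `multiset_op` holds exactly, not merely approximately.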