Abelian Neural Networks

by Kenshin Abe, et al.

We study the problem of modeling a binary operation that satisfies certain algebraic requirements. We first construct a neural network architecture for Abelian group operations and derive a universal approximation property. We then extend it to Abelian semigroup operations using the characterization of associative symmetric polynomials. Both models take advantage of the analytic invertibility of invertible neural networks. In each case, by repeatedly applying the binary operation, we can represent a function on multiset inputs thanks to the algebraic structure. Naturally, our multiset architecture has a size-generalization ability that existing methods lack. Further, we show that modeling the Abelian group operation itself is useful in a word analogy task. We train our models over fixed word embeddings and demonstrate improved performance over the original word2vec and another naive learning method.
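The abstract's core idea, modeling an Abelian group operation with an invertible network and folding it over a multiset, can be sketched as follows. This is a minimal illustration, not the paper's implementation: it assumes the standard conjugation construction x * y = φ⁻¹(φ(x) + φ(y)), and uses a fixed elementwise sinh in place of a learned invertible neural network.

```python
import numpy as np

def phi(x):
    # Stand-in for a learned invertible network: elementwise sinh is
    # analytically invertible, mirroring the invertibility the models exploit.
    return np.sinh(x)

def phi_inv(z):
    return np.arcsinh(z)

def op(x, y):
    # Conjugated vector addition: commutative and associative by construction,
    # because ordinary addition is, so (R^d, op) behaves as an Abelian group.
    return phi_inv(phi(x) + phi(y))

def multiset_encode(xs):
    # Repeatedly applying the binary operation collapses to a sum in
    # phi-space, giving a permutation-invariant encoding that works for
    # any multiset size (size generalization).
    return phi_inv(sum(phi(x) for x in xs))
```

Because the operation reduces to addition in φ-space, commutativity, associativity, and invariance of the multiset encoding to input order all hold exactly (up to floating-point error), for multisets of any size.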

