Abelian Neural Networks

by Kenshin Abe, et al.

We study the problem of modeling a binary operation that satisfies certain algebraic requirements. We first construct a neural network architecture for Abelian group operations and derive a universal approximation property. We then extend it to Abelian semigroup operations using the characterization of associative symmetric polynomials. Both models take advantage of the analytic invertibility of invertible neural networks. In each case, by repeating the binary operation, we can represent a function over multiset inputs thanks to the algebraic structure. Naturally, our multiset architecture has a size-generalization ability that existing methods have not achieved. Further, we show that modeling the Abelian group operation itself is useful in a word analogy task. We train our models over fixed word embeddings and demonstrate improved performance over the original word2vec and a naive learning baseline.
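The abstract's core idea can be illustrated with a toy construction (this is a hedged sketch, not the paper's exact architecture): if φ is an invertible map, then f(x, y) = φ⁻¹(φ(x) + φ(y)) is automatically commutative and associative, because those properties are inherited from addition in the latent space. Here we use a hand-picked strictly increasing elementwise function as a stand-in for an invertible neural network, inverting it numerically with Newton's method:

```python
import numpy as np

def phi(x):
    # Toy invertible map: x + x^3 is strictly increasing elementwise,
    # hence invertible. A real model would use an invertible neural
    # network; this stand-in only illustrates the algebra.
    return x + x ** 3

def phi_inv(y, iters=50):
    # Invert phi elementwise via Newton's method (phi is strictly monotone,
    # so the iteration converges for any target y).
    x = np.copy(y)
    for _ in range(iters):
        x -= (x + x ** 3 - y) / (1.0 + 3.0 * x ** 2)
    return x

def op(x, y):
    # Binary operation f(x, y) = phi^{-1}(phi(x) + phi(y)).
    # Commutativity and associativity follow from "+" in latent space.
    return phi_inv(phi(x) + phi(y))

x, y, z = np.array([0.3]), np.array([-1.2]), np.array([0.7])
assert np.allclose(op(x, y), op(y, x))                # commutative
assert np.allclose(op(op(x, y), z), op(x, op(y, z)))  # associative
```

Because the operation is associative and commutative by construction, folding it over a multiset gives the same result in any order, which is what enables the size-generalizing multiset architecture described in the abstract.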



