AlgebraNets

06/12/2020
by Jordan Hoffmann, et al.

Neural networks have historically been built layerwise from functions f: R^n → R^m, i.e. with activations and weights/parameters represented by real numbers, R. Our work considers a richer set of objects for activations and weights, and undertakes a comprehensive study of alternative algebras as number representations by studying their performance on two challenging problems: large-scale image classification using the ImageNet dataset and language modeling using the enwik8 and WikiText-103 datasets. We denote this broader class of models as AlgebraNets. Our findings indicate that the conclusions of prior work, which explored neural networks constructed from C (complex numbers) and H (quaternions) on smaller datasets, do not always transfer to these challenging settings. However, our results demonstrate that there are alternative algebras which deliver better parameter and computational efficiency compared with R. We consider C, H, M_2(R) (the set of 2×2 real-valued matrices), M_2(C), M_3(R) and M_4(R). Additionally, we note that multiplication in these algebras has higher compute density than real multiplication, a useful property in situations with inherently limited parameter reuse, such as auto-regressive inference and sparse neural networks. We therefore investigate how to induce sparsity within AlgebraNets. We hope that our strong results on large-scale, practical benchmarks will spur further exploration of these unconventional architectures, which challenge the default choice of real numbers for neural network weights and activations.
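To make the compute-density remark concrete, here is a minimal sketch (not code from the paper) of the Hamilton product used when weights and activations are quaternions (H): eight stored real values take part in sixteen real multiplications, so each loaded parameter is reused more than in an ordinary real-valued multiply. The function name `hamilton_product` and the example values are illustrative only.

```python
# Minimal sketch: quaternion (Hamilton) product, illustrating compute density.
# Two quaternions hold 8 real parameters but their product performs 16 real
# multiplies, i.e. roughly 2 multiplies per value loaded, versus 1 for reals.

def hamilton_product(p, q):
    """Multiply quaternions p = a + bi + cj + dk and q = e + fi + gj + hk."""
    a, b, c, d = p
    e, f, g, h = q
    return (
        a*e - b*f - c*g - d*h,  # real part
        a*f + b*e + c*h - d*g,  # i component
        a*g - b*h + c*e + d*f,  # j component
        a*h + b*g - c*f + d*e,  # k component
    )

# Example: (1 + 2i + 3j + 4k)(5 + 6i + 7j + 8k) = -60 + 12i + 30j + 24k
print(hamilton_product((1, 2, 3, 4), (5, 6, 7, 8)))
```

The same accounting applies to the matrix algebras listed above: an M_2(R)-valued weight stores 4 reals but its product with an M_2(R)-valued activation costs 8 real multiplies, which is attractive when parameter reuse, rather than raw arithmetic, is the bottleneck.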


