Recurrent Neural Networks With Limited Numerical Precision

08/24/2016
by Joachim Ott, et al.

Recurrent Neural Networks (RNNs) produce state-of-the-art performance on many machine learning tasks, but their demands on memory and computational power are often high. There is therefore great interest in optimizing the computations performed with these models, especially when considering the development of specialized low-power hardware for deep networks. One way of reducing the computational needs is to limit the numerical precision of the network weights and biases. This has led to several proposed rounding methods, which have so far been applied only to Convolutional Neural Networks and Fully-Connected Networks. This paper addresses the question of how best to reduce weight precision during training in the case of RNNs. We present results from the use of different stochastic and deterministic reduced-precision training methods applied to three major RNN types, which are then tested on several datasets. The results show that the weight binarization methods do not work with RNNs. However, the stochastic and deterministic ternarization and pow2-ternarization methods yield low-precision RNNs that achieve similar, and on certain datasets even higher, accuracy, thus providing a path towards training more efficient implementations of RNNs in specialized hardware.
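The general idea behind such rounding methods can be illustrated with a small sketch. The snippet below shows deterministic and stochastic ternarization of a weight matrix to values in {-1, 0, +1} using NumPy; the threshold value, function names, and overall setup are illustrative assumptions and do not reproduce the paper's exact training procedure.

```python
import numpy as np

def ternarize_deterministic(W, threshold=0.5):
    """Deterministically map weights to {-1, 0, +1}.

    Weights with |w| >= threshold become sign(w); the rest become 0.
    The threshold here is an illustrative choice, not the paper's setting.
    """
    W_t = np.zeros_like(W)
    W_t[W >= threshold] = 1.0
    W_t[W <= -threshold] = -1.0
    return W_t

def ternarize_stochastic(W, rng=None):
    """Stochastically map weights clipped to [-1, 1] onto {-1, 0, +1}.

    |w| is used as the probability of rounding away from zero, so the
    ternary weight is an unbiased estimate of the clipped weight.
    """
    rng = np.random.default_rng() if rng is None else rng
    W_c = np.clip(W, -1.0, 1.0)
    p = np.abs(W_c)                      # probability of keeping the sign
    keep = rng.random(W.shape) < p
    return np.sign(W_c) * keep

# Example: quantize a copy of the weights, as would be done before a
# forward pass, while full-precision weights are retained elsewhere
# for the gradient update.
rng = np.random.default_rng(0)
W = rng.uniform(-1.0, 1.0, size=(4, 4))
print(ternarize_deterministic(W))
print(ternarize_stochastic(W, rng))
```

In low-precision training schemes of this kind, the quantized weights are typically used in the forward and backward passes while updates are accumulated in full precision; that division of labor is a common convention in the literature rather than a detail taken from this paper.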


