Training Recurrent Neural Networks against Noisy Computations during Inference

07/17/2018
by   Minghai Qin, et al.

We explore the robustness of recurrent neural networks when the computations within the network are noisy. One motivation for studying this problem is to reduce the high power cost of conventional neural network computation through the use of analog neuromorphic circuits. Traditional GPU/CPU-centered deep learning architectures exhibit bottlenecks in power-restricted applications, such as speech recognition in embedded systems. Specialized neuromorphic circuits, in which analog signals passed through memory-cell arrays are sensed to accomplish matrix-vector multiplications, promise large power savings and speed gains, but they bring with them limited precision of computation and unavoidable analog noise. In this paper we propose a method, called Deep Noise Injection training, to train RNNs to obtain a set of weights/biases that is much more robust against noisy computation during inference. We explore several RNN architectures, such as the vanilla RNN and long short-term memory (LSTM) networks, and show that after Deep Noise Injection training converges, the trained weights/biases perform more consistently over a wide range of noise powers entering the network during inference. Surprisingly, we find that Deep Noise Injection training improves the overall performance of some networks even under numerically accurate inference.
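The core idea, training with noise injected into the network's internal computations so that the learned weights tolerate analog-hardware noise at inference time, can be sketched as follows. This is a minimal NumPy illustration, assuming additive zero-mean Gaussian noise on each matrix-vector product; the dimensions, noise model, and function names are illustrative and not the authors' exact setup.

```python
import numpy as np

rng = np.random.default_rng(0)

def noisy_rnn_step(x, h, Wx, Wh, b, noise_std=0.1, rng=rng):
    """One vanilla-RNN step whose pre-activation is perturbed by
    zero-mean Gaussian noise, emulating imprecise analog
    matrix-vector multiplication."""
    z = Wx @ x + Wh @ h + b
    if noise_std > 0:
        z = z + rng.normal(0.0, noise_std, size=z.shape)
    return np.tanh(z)

# Toy dimensions (hypothetical): 4 inputs, 3 hidden units.
d_in, d_h = 4, 3
Wx = 0.1 * rng.standard_normal((d_h, d_in))
Wh = 0.1 * rng.standard_normal((d_h, d_h))
b = np.zeros(d_h)

# Unroll over a short random sequence with noise enabled, as one
# would during noise-injection training; setting noise_std=0
# recovers ordinary (numerically accurate) inference.
h = np.zeros(d_h)
for t in range(5):
    x_t = rng.standard_normal(d_in)
    h = noisy_rnn_step(x_t, h, Wx, Wh, b, noise_std=0.1)

print(h.shape)  # (3,)
```

During training, each forward pass would draw fresh noise samples so that gradient descent sees many noisy realizations of the same computation; at inference, the analog hardware itself supplies the noise.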

Related research

- Noisy Computations during Inference: Harmful or Helpful? (11/26/2018)
- Adaptive Noise Injection: A Structure-Expanding Regularization for RNN (07/25/2019)
- Short note on the behavior of recurrent neural network for noisy dynamical system (04/05/2019)
- Walking Noise: Understanding Implications of Noisy Computations on Classification Tasks (12/20/2022)
- Calibrated BatchNorm: Improving Robustness Against Noisy Weights in Neural Networks (07/07/2020)
- Training neural networks with structured noise improves classification and generalization (02/26/2023)
- Recurrent Neural Networks With Limited Numerical Precision (08/24/2016)
