Walking Noise: Understanding Implications of Noisy Computations on Classification Tasks

12/20/2022
by Hendrik Borras, et al.

Machine learning methods such as neural networks are extremely successful and popular across a variety of applications, but they come at substantial computational cost and high energy demand. Hardware capabilities, in contrast, are limited, and there is evidence that technology scaling is stuttering; new approaches are therefore required to meet the performance demands of increasingly complex model architectures. As an unsafe optimization, noisy computations are more energy efficient and, given a fixed power budget, also more time efficient. However, any kind of unsafe optimization requires countermeasures to ensure functionally correct results. This work considers noisy computations in an abstract form and aims to understand the implications of such noise for the accuracy of neural-network-based classifiers as an exemplary workload. We propose a methodology called "Walking Noise" that assesses the robustness of individual layers of deep architectures by means of a so-called "midpoint noise level" metric. We then investigate the implications of additive and multiplicative noise for different classification tasks and model architectures, with and without batch normalization. While noisy training significantly increases robustness for both noise types, we observe a clear trend toward larger weights, and hence a higher signal-to-noise ratio, under additive noise injection. For the multiplicative case, we find that some networks, given suitably simple tasks, automatically learn an internal binary representation and thus become extremely robust. Overall, this work proposes a method to measure layer-specific robustness and shares first insights into how networks learn to compensate for injected noise, contributing to the understanding of robustness against noisy computations.
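
To make the methodology concrete, the following PyTorch sketch injects additive or multiplicative Gaussian noise into a single layer of a trained classifier and sweeps the noise level to locate a "midpoint noise level". This is not the authors' implementation: the hook-based injection, the noise model, and the assumption that the midpoint is the noise level at which accuracy falls halfway between clean accuracy and random guessing are all illustrative choices; `model`, `layer_name`, and `loader` are placeholders for the reader's own setup.

```python
# Minimal "Walking Noise" sketch (illustrative assumptions, not the paper's code).
# Assumes a trained PyTorch classifier and a test DataLoader on the same device.
import torch
import torch.nn as nn


class NoiseInjection(nn.Module):
    """Inject Gaussian noise into one layer's activations.

    mode="additive":        y = x + eps,        eps ~ N(0, level^2)
    mode="multiplicative":  y = x * (1 + eps),  eps ~ N(0, level^2)
    """

    def __init__(self, level: float, mode: str = "additive"):
        super().__init__()
        self.level, self.mode = level, mode

    def forward(self, x):
        eps = torch.randn_like(x) * self.level
        return x + eps if self.mode == "additive" else x * (1 + eps)


@torch.no_grad()
def accuracy(model, loader):
    model.eval()
    correct = total = 0
    for inputs, targets in loader:
        preds = model(inputs).argmax(dim=1)
        correct += (preds == targets).sum().item()
        total += targets.numel()
    return correct / total


def midpoint_noise_level(model, layer_name, loader, levels,
                         num_classes, mode="additive"):
    """Sweep noise levels at one layer and return the level where accuracy
    crosses halfway between clean accuracy and chance (1/num_classes).
    `levels` is assumed sorted ascending; the crossing is found by linear
    interpolation between adjacent sweep points."""
    layer = dict(model.named_modules())[layer_name]
    noise = NoiseInjection(0.0, mode)
    # A forward hook that returns a tensor replaces the layer's output.
    handle = layer.register_forward_hook(lambda m, inp, out: noise(out))

    try:
        clean_acc = accuracy(model, loader)  # level 0.0: no perturbation
        target = (clean_acc + 1.0 / num_classes) / 2.0
        prev_level, prev_acc = 0.0, clean_acc
        for level in levels:
            noise.level = level
            acc = accuracy(model, loader)
            if acc <= target:
                # Interpolate the crossing point between the two sweep points.
                frac = (prev_acc - target) / (prev_acc - acc + 1e-12)
                return prev_level + frac * (level - prev_level)
            prev_level, prev_acc = level, acc
    finally:
        handle.remove()
    return float("inf")  # accuracy never degraded to the midpoint in range
```

Walking the injection point through the network, one layer at a time, then yields a per-layer robustness profile in the spirit of the abstract: a larger midpoint noise level for a given layer indicates that the network tolerates more computational noise at that point.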
