The general aspects of noise in analogue hardware deep neural networks

03/12/2021
by Nadezhda Semenova et al.

Deep neural networks unlocked a vast range of new applications by solving tasks, many of which were previously deemed reserved to higher human intelligence. One of the developments enabling this success was a boost in computing power provided by special-purpose hardware, such as graphic or tensor processing units. However, these do not leverage fundamental features of neural networks like parallelism and analogue state variables. Instead, they emulate neural networks relying on computing power, which results in unsustainable energy consumption and comparatively low speed. Fully parallel and analogue hardware promises to overcome these challenges, yet the impact of analogue neuron noise and its propagation, i.e. accumulation, threatens to render such approaches inept. Here, we analyse for the first time the propagation of noise in parallel deep neural networks comprising noisy nonlinear neurons. We develop an analytical treatment for both symmetric networks, to highlight the underlying mechanisms, and networks trained with backpropagation. We find that noise accumulation is generally bounded, and adding network layers does not worsen the signal-to-noise ratio beyond this limit. Most importantly, noise accumulation can be suppressed entirely when neuron activation functions have a slope smaller than unity. We therefore develop a framework for noise in deep neural networks implemented in analogue systems, and identify criteria that allow engineers to design noise-resilient novel neural network hardware.
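
To make the bounded-accumulation claim above concrete, the following is a minimal numerical sketch, not the authors' derivation: it assumes a linearised noise model in which each layer scales the previously accumulated noise variance by the squared activation slope g^2 and injects fresh, uncorrelated noise of variance sigma^2. The function name accumulated_noise_variance and the parameters g and sigma are illustrative choices, not quantities defined in the paper.

```python
# Minimal sketch (assumed linearised noise model, not the paper's full analysis):
# noise of variance sigma^2 is injected at every layer, and noise already present
# is scaled by the squared activation slope g^2 when it passes through the next
# layer, giving the recursion v_{k+1} = g^2 * v_k + sigma^2.

def accumulated_noise_variance(g, sigma=1e-2, n_layers=50):
    """Accumulated noise variance after each of n_layers noisy layers."""
    v, history = 0.0, []
    for _ in range(n_layers):
        v = g**2 * v + sigma**2   # damp/amplify old noise, add fresh noise
        history.append(v)
    return history

for g in (0.8, 1.0, 1.2):
    v = accumulated_noise_variance(g)
    print(f"slope g={g}: variance after 10 layers {v[9]:.2e}, after 50 layers {v[-1]:.2e}")
```

For a slope g < 1 the recursion converges to sigma^2 / (1 - g^2) (about 2.8e-4 for g = 0.8 and sigma = 0.01), so adding further layers no longer degrades the signal-to-noise ratio; for g = 1 the variance grows linearly with depth, and for g > 1 it grows geometrically. This mirrors the qualitative behaviour described in the abstract.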



Related research

Fundamental aspects of noise in analog-hardware neural networks (07/21/2019)
We study and analyze the fundamental aspects of noise propagation in rec...

Noise Analysis of Photonic Modulator Neurons (07/17/2019)
Neuromorphic photonics relies on efficiently emulating analog neural net...

Noise mitigation strategies in physical feedforward neural networks (04/20/2022)
Physical neural networks are promising candidates for next generation ar...

Noise impact on recurrent neural network with linear activation function (03/23/2023)
In recent years, more and more researchers in the field of neural networ...

Quantum-noise-limited optical neural networks operating at a few quanta per activation (07/28/2023)
Analog physical neural networks, which hold promise for improved energy ...

Calibrated BatchNorm: Improving Robustness Against Noisy Weights in Neural Networks (07/07/2020)
Analog computing hardware has gradually received more attention by the r...

Boolean learning under noise-perturbations in hardware neural networks (03/27/2020)
A high efficiency hardware integration of neural networks benefits from ...
