Safe Use of Neural Networks

06/13/2023
by George Redinbo, et al.

Neural networks in modern communication systems can be susceptible to internal numerical errors that can drastically affect decision results. Such structures are composed of many sections, each of which generally contains weighting operations and activation function evaluations. Safe use comes from methods employing number-based codes that can detect arithmetic errors in the network's processing steps. Each set of operations generates parity values dictated by a code in two ways: one set of parities is obtained from a section's outputs, while a second, comparable set is developed directly from the original inputs. The parity values protecting the activation functions rely on a Taylor series approximation to those functions. We focus on long numerically based convolutional codes because of the large size of the data sets involved. The codes are based on Discrete Fourier Transform kernels, and many design options are available. Mathematical program simulations show that our error-detecting techniques are effective and efficient.
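The two-way parity idea above can be illustrated with a deliberately simplified sketch. The paper's codes are long DFT-based convolutional codes; the example below instead uses a plain all-ones checksum row and a tanh activation (both assumptions for illustration, not the paper's construction). For a weighted section y = Wx, the parity e^T y computed from the outputs must match (e^T W)x computed from the inputs; for the activation, the output parity is predicted from the pre-activations via a first-order Taylor expansion, so the comparison needs a tolerance that absorbs the approximation error.

```python
import numpy as np

rng = np.random.default_rng(0)
n_in, n_out = 8, 4
W = rng.standard_normal((n_out, n_in))
x = rng.standard_normal(n_in)

# --- Weighted-sum section ----------------------------------------------
# Precompute the encoded row e^T W (e = all-ones), so the parity of the
# output can also be obtained directly from the original input x.
check_row = W.sum(axis=0)

y = W @ x                          # section output
parity_from_output = y.sum()       # parity computed from the outputs
parity_from_input = check_row @ x  # same parity computed from the inputs

# Agreement (up to roundoff) indicates no arithmetic error occurred.
assert np.isclose(parity_from_output, parity_from_input)

# Inject a fault into one output and re-check: the parities now disagree.
y_faulty = y.copy()
y_faulty[2] += 1.0
assert not np.isclose(y_faulty.sum(), parity_from_input)

# --- Activation section ------------------------------------------------
# Predict the activation parity from the pre-activations using a
# first-order Taylor expansion of tanh about 0: tanh(t) ~ t.
z = 0.01 * rng.standard_normal(16)  # small pre-activations
parity_act_out = np.tanh(z).sum()   # parity from activation outputs
parity_act_in = z.sum()             # Taylor prediction from inputs

tol = 1e-3  # detection threshold must absorb the approximation error
assert abs(parity_act_out - parity_act_in) < tol

a_faulty = np.tanh(z)
a_faulty[0] += 0.1                  # injected arithmetic error
assert abs(a_faulty.sum() - parity_act_in) >= tol
```

The key design point the sketch preserves is that the input-side parity is computed by an independent arithmetic path (the precomputed check row, or the Taylor prediction), so a fault in the section's own computation cannot corrupt both parities identically.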


