Neural networks with dynamical coefficients and adjustable connections on the basis of integrated backpropagation

05/19/2018
by M. N. Nazarov, et al.

We consider artificial neurons that update their weight coefficients by an internal rule based on backpropagation, rather than relying on backpropagation as an external training procedure. To achieve this, we include the backpropagation error estimate as a separate entity in all the neuron models and exchange it along the synaptic connections. In addition, we add a special type of neuron with reference inputs, which serves as the base source of error estimates for the whole network. Finally, we introduce a training control signal for all neurons, which enables the correction of weights and the exchange of error estimates. For recurrent neural networks we also demonstrate how to incorporate backpropagation through time into this formalism with the help of a stack memory for the reference inputs and external data inputs of neurons. As a useful consequence, our approach makes it possible to introduce neural networks with adjustable synaptic connections, tied to the incorporated backpropagation. Also, for widely used neural networks, such as long short-term memory, radial basis function networks, multilayer perceptrons and convolutional neural networks, we demonstrate an alternative description within the framework of the new formalism.
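The core idea, a neuron that stores its own backpropagation error estimate and applies the weight correction internally when a training control signal is on, can be sketched as follows. This is a minimal illustrative model, not the paper's actual formalism: the class name, the sigmoid activation, and the learning-rate parameter are all assumptions made for the example.

```python
import numpy as np

class SelfTrainingNeuron:
    """Illustrative neuron that keeps an internal backpropagation error
    estimate and updates its own weights, instead of relying on an
    external training procedure. Hypothetical sketch, not the paper's model."""

    def __init__(self, n_inputs, lr=0.5, seed=0):
        rng = np.random.default_rng(seed)
        self.w = rng.normal(scale=0.5, size=n_inputs)  # synaptic weights
        self.b = 0.0                                   # bias
        self.lr = lr                                   # learning rate
        self.x = np.zeros(n_inputs)  # last inputs, kept for the update rule
        self.y = 0.0                 # last output
        self.delta = 0.0             # internal error estimate

    @staticmethod
    def _sigmoid(z):
        return 1.0 / (1.0 + np.exp(-z))

    def forward(self, x):
        self.x = np.asarray(x, dtype=float)
        self.y = self._sigmoid(self.w @ self.x + self.b)
        return self.y

    def receive_error(self, upstream_error, train_enable):
        """Accept an error estimate arriving along an outgoing synaptic
        connection (ultimately sourced from a reference-input neuron),
        store it internally, and correct the weights only when the
        training control signal is enabled."""
        self.delta = upstream_error * self.y * (1.0 - self.y)  # sigmoid'
        if train_enable:
            self.w -= self.lr * self.delta * self.x
            self.b -= self.lr * self.delta
        # error estimates passed further back along each input synapse
        return self.delta * self.w
```

A short usage example: repeatedly presenting one input pattern with a reference target and feeding the neuron the resulting error estimate drives its output toward the target, with no external optimizer touching the weights.

```python
neuron = SelfTrainingNeuron(n_inputs=2)
x, target = [1.0, 0.5], 1.0
y_before = neuron.forward(x)
for _ in range(200):
    y = neuron.forward(x)
    neuron.receive_error(y - target, train_enable=True)
y_after = neuron.forward(x)
# the output moves closer to the reference target
```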

Related research

- 11/22/2015 — An Approximate Backpropagation Learning Rule for Memristor Based Neural Networks Using Synaptic Plasticity. "We describe an approximation to backpropagation algorithm for training d..."
- 02/08/2021 — Derivation of the Backpropagation Algorithm Based on Derivative Amplification Coefficients. "The backpropagation algorithm for neural networks is widely felt hard to..."
- 11/17/2022 — Learning to Control Rapidly Changing Synaptic Connections: An Alternative Type of Memory in Sequence Processing Artificial Neural Networks. "Short-term memory in standard, general-purpose, sequence-processing recu..."
- 09/25/2017 — Robust Associative Memories Naturally Occuring From Recurrent Hebbian Networks Under Noise. "The brain is a noisy system subject to energy constraints. These facts a..."
- 08/17/2017 — General Backpropagation Algorithm for Training Second-order Neural Networks. "The artificial neural network is a popular framework in machine learning..."
- 10/04/2020 — New Insights on Learning Rules for Hopfield Networks: Memory and Objective Function Minimisation. "Hopfield neural networks are a possible basis for modelling associative ..."
- 06/12/2018 — A Connectome Based Hexagonal Lattice Convolutional Network Model of the Drosophila Visual System. "What can we learn from a connectome? We constructed a simplified model o..."
