An Informal Introduction to Multiplet Neural Networks

06/02/2020
by Nathan E. Frick, et al.

In the artificial neuron, I replace the dot product with the weighted Lehmer mean, which can emulate different cases of a generalized mean. The single neuron instance is replaced by a multiplet of neurons that share the same averaging weights, and a group of outputs feeds forward in lieu of a single scalar. The generalization parameter is typically set to a different value for each neuron in the multiplet. I further extend the concept to a multiplet based on the Gini mean. Derivatives with respect to the weight parameters and with respect to the two generalization parameters are given. Some properties of the network are investigated, showing the capacity to solve the classical exclusive-or problem organically in two layers and to perform some multiplication and division. The network can instantiate truncated power series and variants, which can be used to approximate different functions, provided that the parameters are constrained. Moreover, a mean-case slope score is derived that can support a novel learning-rate scheme based on the homogeneity of the selected elements. The multiplet neuron equation also provides a way to segment regularization timeframes and approaches.
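To make the core idea concrete, here is a minimal NumPy sketch of a multiplet neuron as the abstract describes it: a weighted Lehmer mean replacing the dot product, a multiplet of neurons sharing the same averaging weights but each using its own generalization parameter, and a Gini-mean variant with two generalization parameters. The function names, the positive-input assumption, and the example values are illustrative assumptions; the paper may normalize the weights or handle edge cases differently.

```python
import numpy as np

def weighted_lehmer_mean(x, w, p):
    """Weighted Lehmer mean L_p(x; w) = sum(w * x**p) / sum(w * x**(p-1)).

    Assumes strictly positive inputs x and non-negative weights w (illustrative
    assumption). p = 1 recovers the weighted arithmetic mean, p = 0 the
    weighted harmonic mean, p = 2 the contraharmonic mean.
    """
    x = np.asarray(x, dtype=float)
    w = np.asarray(w, dtype=float)
    return np.sum(w * x**p) / np.sum(w * x**(p - 1))

def weighted_gini_mean(x, w, p, q):
    """Weighted Gini mean G_{p,q}(x; w) = (sum(w x^p) / sum(w x^q))**(1/(p-q)),
    defined here only for p != q; it reduces to the Lehmer mean when q = p - 1."""
    assert p != q, "this sketch does not handle the p == q limit"
    x = np.asarray(x, dtype=float)
    w = np.asarray(w, dtype=float)
    return (np.sum(w * x**p) / np.sum(w * x**q)) ** (1.0 / (p - q))

def multiplet_forward(x, w, ps):
    """A multiplet of neurons sharing the averaging weights w, each neuron using
    its own generalization parameter p; returns one output per parameter,
    i.e. a group of outputs in lieu of a single scalar."""
    return np.array([weighted_lehmer_mean(x, w, p) for p in ps])

# Example: a three-neuron multiplet over a positive input vector
x = np.array([0.5, 2.0, 1.5])
w = np.array([0.2, 0.5, 0.3])
outputs = multiplet_forward(x, w, ps=[0.0, 1.0, 2.0])  # harmonic, arithmetic, contraharmonic
print(outputs)
```

Because every neuron in the multiplet reuses the same weights, training updates a single weight vector per multiplet while still producing several differently "generalized" views of the same input.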


