Multilayer Perceptron Algebra

01/18/2017
by Zhao Peng, et al.

Artificial Neural Networks (ANNs) have been phenomenally successful on a variety of pattern recognition tasks. However, the design of neural networks still relies heavily on the experience and intuition of individual developers. In this article, the author introduces a mathematical structure called MLP algebra on the set of all Multilayer Perceptron (MLP) neural networks, which can serve as a guiding principle for building MLPs that accommodate particular data sets, and for building complex MLPs from simpler ones.
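
The paper defines its algebraic operations formally; as a rough, informal illustration of the idea of building complex MLPs from simpler ones, the sketch below implements two operations one might include in such an algebra, serial composition and a parallel (direct-sum-style) combination, on a toy NumPy network. The MLP class, the compose and parallel helpers, and the chosen layer sizes are illustrative assumptions, not the paper's definitions.

# A toy NumPy sketch (assumption, not the paper's construction): a bare-bones
# MLP plus two operations one might define on MLPs -- serial composition and
# a parallel (direct-sum-style) combination.
import numpy as np


class MLP:
    """A plain fully connected network with ReLU hidden layers and a linear output."""

    def __init__(self, layer_sizes, seed=0):
        rng = np.random.default_rng(seed)
        # One weight matrix and bias vector per consecutive pair of layer sizes.
        self.weights = [rng.standard_normal((m, n)) * 0.1
                        for n, m in zip(layer_sizes[:-1], layer_sizes[1:])]
        self.biases = [np.zeros(m) for m in layer_sizes[1:]]

    def __call__(self, x):
        for W, b in zip(self.weights[:-1], self.biases[:-1]):
            x = np.maximum(W @ x + b, 0.0)              # hidden layers: ReLU
        return self.weights[-1] @ x + self.biases[-1]   # output layer: linear


def compose(f, g):
    """Serial combination: feed g's output into f, i.e. x -> f(g(x))."""
    return lambda x: f(g(x))


def parallel(f, g):
    """Parallel combination: apply f and g to the same input, concatenate outputs."""
    return lambda x: np.concatenate([f(x), g(x)])


if __name__ == "__main__":
    g = MLP([5, 6, 4])                  # maps R^5 -> R^4
    f = MLP([4, 8, 3])                  # maps R^4 -> R^3
    h = compose(f, g)                   # maps R^5 -> R^3
    p = parallel(MLP([5, 6, 2], 1), g)  # maps R^5 -> R^6

    x = np.ones(5)
    print(h(x).shape, p(x).shape)       # -> (3,) (6,)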

