Continuous Function Structured in Multilayer Perceptron for Global Optimization

03/07/2023
by   Heeyuen Koh, et al.

The gradient information of a multilayer perceptron with linear neurons is modified with the functional derivative for global-minimum-search benchmarking problems. From this approach, we show that the gradient landscape derived from a given continuous function via the functional derivative can take an MLP-like form with ax+b neurons. To this extent, the suggested algorithm improves the ability of the optimization process to handle all the parameters in the problem set simultaneously. The functionality of this method could be further improved through an intentionally designed convex function with the Kullback-Leibler divergence applied to the cost value.

Related research

02/03/2020 · CMOS-Free Multilayer Perceptron Enabled by Four-Terminal MTJ Device
Neuromorphic computing promises revolutionary improvements over conventi...

04/13/2018 · Heterogeneous Multilayer Generalized Operational Perceptron
The traditional Multilayer Perceptron (MLP) using McCulloch-Pitts neuron...

06/11/2020 · Embed Me If You Can: A Geometric Perceptron
Solving geometric tasks using machine learning is a challenging problem....

10/06/2019 · Auto-Rotating Perceptrons
This paper proposes an improved design of the perceptron unit to mitigat...

10/28/2022 · Hierarchical Automatic Power Plane Generation with Genetic Optimization and Multilayer Perceptron
We present an automatic multilayer power plane generation method to acce...

08/01/2023 · Divergence of the ADAM algorithm with fixed-stepsize: a (very) simple example
A very simple unidimensional function with Lipschitz continuous gradient...
