Fractional-order Backpropagation Neural Networks: Modified Fractional-order Steepest Descent Method for Family of Backpropagation Neural Networks

06/23/2019
by   Yi-Fei PU, et al.

This paper offers a novel mathematical approach, the modified Fractional-order Steepest Descent Method (FSDM), for training BackPropagation Neural Networks (BPNNs), which differs from the majority of previous approaches. Fractional calculus, a promising mathematical method, has the potential to assume a prominent role in the applications of neural networks and cybernetics because of its inherent strengths, such as long-term memory, nonlocality, and weak singularity. Therefore, to improve the optimization performance of classic first-order BPNNs, this paper studies whether it is possible to modify the FSDM and generalize classic first-order BPNNs to modified-FSDM-based Fractional-order Backpropagation Neural Networks (FBPNNs). Motivated by this, the paper proposes an application of fractional calculus to implement a modified-FSDM-based FBPNN whose reverse incremental search proceeds in the negative directions of the approximate fractional-order partial derivatives of the square error. First, the theoretical concept of a modified-FSDM-based FBPNN is described mathematically. Then, the mathematical proof of fractional-order global optimal convergence, an assumption on the network structure, and the fractional-order multi-scale global optimization of a modified-FSDM-based FBPNN are analysed in detail. Finally, comparative experiments against a classic first-order BPNN are performed: an example function approximation, fractional-order multi-scale global optimization, and two comparative performances with real data. The major advantage of the modified-FSDM-based FBPNN over a classic first-order BPNN is the more efficient searching capability of its fractional-order multi-scale global optimization in determining the global optimal solution.
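To make the core idea concrete, the sketch below performs a descent step in the negative direction of an *approximate* fractional-order derivative of the square error, using a common Caputo-style power-law approximation. This is a minimal illustration of the general fractional-gradient idea, not the authors' exact modified FSDM; the function names, the step rule, and all parameter values are assumptions for demonstration.

```python
import math

def fractional_gradient_descent(df, w0, alpha=0.9, lr=0.1, steps=200, eps=1e-8):
    """Toy 1-D fractional-order gradient descent.

    Approximates the fractional derivative of order alpha with the
    widely used Caputo-style estimate
        D^alpha f(w) ~= f'(w) * |w - w_prev|^(1 - alpha) / Gamma(2 - alpha),
    where w_prev is the previous iterate (the "memory" point).
    Illustrative only -- NOT the paper's modified FSDM.
    """
    w_prev, w = w0 + 1.0, w0  # arbitrary initial memory point
    for _ in range(steps):
        frac_grad = (df(w)
                     * (abs(w - w_prev) + eps) ** (1.0 - alpha)
                     / math.gamma(2.0 - alpha))
        # reverse incremental search: step in the negative direction
        # of the approximate fractional-order derivative
        w_prev, w = w, w - lr * frac_grad
    return w

# Example: minimise the square error f(w) = (w - 3)^2, so f'(w) = 2(w - 3)
w_star = fractional_gradient_descent(lambda w: 2.0 * (w - 3.0), w0=0.0)
```

With `alpha = 1` the update reduces to the classic first-order steepest descent step, so `alpha` interpolates between integer-order and fractional-order search behaviour.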


