A Polynomial Neural Network with Controllable Precision and Human-Readable Topology for Prediction and System Identification

04/08/2020
by   Gang Liu, et al.

Despite the success of artificial neural networks (ANNs), there is still concern among many over their "black box" nature. Why do they work? Could we design a "transparent" network? This paper presents a controllable and readable polynomial neural network (CR-PNN) for approximation, prediction, and system identification. CR-PNN is simple enough to be described by one "small" formula, so that we can control the approximation precision and explain the internal structure of the network. CR-PNN is, in essence, the Taylor expansion in the form of a network: the number of layers represents the precision, and the derivatives in the Taylor expansion are imitated exactly by the error back-propagation algorithm. First, we demonstrated on ten synthetic data sets with noise that CR-PNN delivers excellent analysis performance on "black box" systems; the results were also compared against the synthetic data to substantiate its search toward the global optimum. Second, ten real-world applications verified that CR-PNN brings better generalization capability than typical ANNs, whose approximation depends on nonlinear activation functions. Finally, 200,000 repeated experiments with 4,898 samples demonstrated that CR-PNN is five times more efficient than a typical ANN for one epoch and ten times more efficient for one forward propagation. In short, compared with traditional neural networks, the novelties and advantages of CR-PNN include readability of the internal structure, guarantees of a globally optimal solution, lower computational complexity, and likely better robustness in real-world approximation. (We are strong believers in open source and provide CR-PNN code for others. GitHub: https://github.com/liugang1234567/CR-PNN#cr-pnn)
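The core idea, a polynomial network whose depth (expansion order) directly controls approximation precision, can be illustrated with a minimal sketch. The names `poly_features` and `fit_polynomial` below are illustrative, not from the CR-PNN code; for simplicity the polynomial coefficients are fitted by least squares rather than by the back-propagation scheme the paper describes, but the controllable-precision behavior is the same: raising the order (the analogue of adding layers) tightens the fit to a "black box" target function.

```python
import numpy as np

def poly_features(x, order):
    # Monomial basis 1, x, x^2, ..., x^order (the network's
    # higher-order terms, one "layer" per added order).
    return np.vstack([x**k for k in range(order + 1)]).T

def fit_polynomial(x, y, order):
    # Least-squares fit of the polynomial coefficients; CR-PNN
    # instead learns the analogous weights by back-propagation,
    # imitating the derivatives of the Taylor expansion.
    X = poly_features(x, order)
    w, *_ = np.linalg.lstsq(X, y, rcond=None)
    return w

# A "black box" system to identify: here simply exp(x).
x = np.linspace(-1.0, 1.0, 200)
y = np.exp(x)

for order in (1, 3, 5):
    w = fit_polynomial(x, y, order)
    err = np.max(np.abs(poly_features(x, order) @ w - y))
    print(f"order {order}: max error {err:.2e}")
```

Because the model is linear in its coefficients, the fit is a convex problem with a single global optimum, which mirrors the paper's claim of a guaranteed globally optimal solution, and the learned coefficients are directly readable as the terms of the expansion.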

