The Deep Arbitrary Polynomial Chaos Neural Network or how Deep Artificial Neural Networks could benefit from Data-Driven Homogeneous Chaos Theory

06/26/2023
by Sergey Oladyshkin et al.

Artificial Intelligence and Machine Learning have been widely used in various fields of mathematical computing, physical modeling, computational science, communication science, and stochastic analysis. Approaches based on Deep Artificial Neural Networks (DANNs) are very popular nowadays. Depending on the learning task, the exact form of a DANN is determined by its multi-layer architecture, activation functions, and the so-called loss function. However, for the majority of deep learning approaches based on DANNs, the kernel structure of neural signal processing remains the same: the node response is encoded as a linear superposition of neural activity, while the non-linearity is triggered by the activation functions. In the current paper, we propose to analyze the neural signal processing in DANNs from the point of view of homogeneous chaos theory as known from polynomial chaos expansion (PCE). From the PCE perspective, the (linear) response at each node of a DANN can be seen as a first-degree multi-variate polynomial of single neurons from the previous layer, i.e., a weighted linear sum of monomials. From this point of view, the conventional DANN structure relies implicitly (but erroneously) on a Gaussian distribution of neural signals. Additionally, this view reveals that, by design, DANNs do not necessarily fulfill any orthogonality or orthonormality condition for the majority of data-driven applications. Therefore, the prevailing handling of neural signals in DANNs can lead to a redundant representation, as any neural signal may contain partial information from other neural signals. To tackle that challenge, we propose to employ the data-driven generalization of PCE theory, known as arbitrary polynomial chaos (aPC), to construct corresponding multi-variate orthonormal representations at each node of a DANN, thereby obtaining Deep arbitrary polynomial chaos neural networks.
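To make the core idea concrete, the sketch below builds a data-driven orthonormal polynomial basis from raw sample moments of a (hypothetical) neural signal, in the spirit of aPC. This is a minimal illustration, not the authors' implementation: the function name apc_orthonormal_basis, the test signal, and the moment-based Cholesky construction are assumptions made here, restricted to a univariate signal.

# Minimal sketch (assumed, not the paper's code): construct an orthonormal
# polynomial basis with respect to the empirical distribution of a signal,
# using only its raw sample moments, as arbitrary polynomial chaos (aPC) does.
import numpy as np

def apc_orthonormal_basis(samples, degree):
    """Return a coefficient matrix C where row k holds the monomial
    coefficients of P_k(x) = sum_i C[k, i] * x**i, and the P_k are
    orthonormal w.r.t. the empirical distribution of `samples`."""
    # Raw moments mu_m = E[x^m] for m = 0 .. 2*degree, estimated from data
    moments = np.array([np.mean(samples ** m) for m in range(2 * degree + 1)])
    # Hankel (Gram) matrix of the monomial basis under the data measure
    H = np.array([[moments[i + j] for j in range(degree + 1)]
                  for i in range(degree + 1)])
    # Cholesky factorization H = L L^T; with C = L^{-1} we get C H C^T = I,
    # i.e. the polynomials with coefficient rows C are orthonormal
    L = np.linalg.cholesky(H)
    return np.linalg.inv(L)

# Usage on an arbitrary, non-Gaussian signal (placeholder data)
rng = np.random.default_rng(0)
signal = rng.gamma(shape=2.0, size=10_000)
C = apc_orthonormal_basis(signal, degree=3)
# Check: the empirical Gram matrix of the constructed basis is close to identity
V = np.vander(signal, 4, increasing=True)   # monomials 1, x, x^2, x^3
print(np.round((C @ V.T @ V @ C.T) / signal.size, 3))

In a DANN setting, such a basis would replace the implicit monomial (first-degree) representation at a node with polynomials that are orthonormal with respect to the actual, data-driven distribution of the incoming neural signal, rather than an assumed Gaussian one.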
