Central and Non-central Limit Theorems arising from the Scattering Transform and its Neural Activation Generalization

11/21/2020
by   Gi-Ren Liu, et al.

Motivated by the analysis of complicated and non-stationary time series, we study a generalization of the scattering transform (ST) that admits a broad class of neural activation functions, which we call the neural activation ST (NAST). In brief, NAST is a transform composed of a sequence of "neural processing units", each of which applies a high-pass filter to the input from the previous layer and then composes the result with a nonlinear function before passing it to the next neuron. Here, the nonlinear function models how a neuron is excited by the input signal. In addition to establishing properties such as non-expansiveness, horizontal translation invariance, and insensitivity to local deformation, we explore the statistical properties of the second-order NAST of a Gaussian process with various dependence and (non-)stationarity structures, together with its interaction with the chosen high-pass filters and activation functions, and we provide central limit theorem (CLT) and non-CLT results. Numerical simulations are also provided. The results explain how NAST processes complicated and non-stationary time series, and pave the way toward statistical inference based on NAST in the non-null case.
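The cascade described above (high-pass filter, then pointwise activation, repeated, then a time average at the second order) can be sketched as follows. This is a minimal illustrative sketch, not the paper's construction: the Haar-like filter, the dyadic `scale` parameter, and the default modulus activation are assumptions chosen for simplicity, and the function names (`high_pass`, `nast_layer`, `second_order_nast`) are hypothetical.

```python
import numpy as np

def high_pass(x, scale):
    """Crude Haar-like high-pass filter at a dyadic scale (illustrative choice,
    not the filters analyzed in the paper)."""
    n = 2 ** scale
    h = np.concatenate([np.ones(n), -np.ones(n)]) / (2 * n)
    return np.convolve(x, h, mode="same")

def nast_layer(x, scale, activation=np.abs):
    """One 'neural processing unit': high-pass filtering followed by a
    pointwise nonlinear activation."""
    return activation(high_pass(x, scale))

def second_order_nast(x, j1, j2, activation=np.abs):
    """Second-order NAST statistic: two cascaded units, then a time average,
    whose fluctuations are what the (non-)CLT results describe."""
    u1 = nast_layer(x, j1, activation)
    u2 = nast_layer(u1, j2, activation)
    return u2.mean()

# Example on a white Gaussian input, the simplest (non-)stationarity structure.
rng = np.random.default_rng(0)
x = rng.standard_normal(4096)
s2 = second_order_nast(x, j1=2, j2=4)
```

With the modulus activation, `s2` is a nonnegative average over many weakly dependent terms; the paper's CLT-type results concern the Gaussian (or non-Gaussian) limit of such averages as the sample length grows.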


Related research:

- Efficient Neural Network Implementation with Quadratic Neuron (11/21/2020)
  Previous works proved that the combination of the linear neuron network ...
- Inference of synchrosqueezing transform -- toward a unified statistical analysis of nonlinear-type time-frequency analysis (04/21/2019)
  We provide a statistical analysis of a tool in nonlinear-type time-frequ...
- Effect of shapes of activation functions on predictability in the echo state network (05/22/2019)
  We investigate prediction accuracy for time series of Echo state network...
- Gaussian Process Neurons Learn Stochastic Activation Functions (11/29/2017)
  We propose stochastic, non-parametric activation functions that are full...
- Lehmer Transform and its Theoretical Properties (05/13/2018)
  We propose a new class of transforms that we call Lehmer Transform whic...
- Modified Galton-Watson processes with immigration under an alternative offspring mechanism (06/01/2022)
  We propose a novel class of count time series models alternative to the ...
- On the Stability and Generalization of Learning with Kernel Activation Functions (03/28/2019)
  In this brief we investigate the generalization properties of a recently...
