Theoretical analysis of deep neural networks for temporally dependent observations

10/20/2022
by Mingliang Ma, et al.

Deep neural networks are powerful tools for modeling observations over time with non-linear patterns. Despite their widespread use in such settings, most theoretical developments for deep neural networks assume independent observations, and theoretical results for temporally dependent observations are scarce. To bridge this gap, we study the theoretical properties of deep neural networks for modeling non-linear time series data. Specifically, non-asymptotic bounds on the prediction error of (sparse) feed-forward neural networks with ReLU activation are established under mixing-type assumptions. These assumptions are mild enough to cover a wide range of time series models, including auto-regressive models. Compared to the independent case, the established convergence rates carry additional logarithmic factors that compensate for the extra complexity introduced by dependence among the data points. The theoretical results are supported by numerical simulations in various settings as well as an application to a macroeconomic data set.
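To make the setting concrete, the following sketch simulates a non-linear auto-regressive series (one of the mixing-type processes the abstract mentions) and fits a one-hidden-layer feed-forward ReLU network for one-step-ahead prediction. The data-generating equation, network size, and training loop here are illustrative assumptions, not the estimator or rates analysed in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate a non-linear AR(1) process (illustrative choice):
#   x_t = 0.8 * sin(x_{t-1}) + eps_t,  eps_t ~ N(0, 0.1^2)
T = 500
x = np.zeros(T)
for t in range(1, T):
    x[t] = 0.8 * np.sin(x[t - 1]) + 0.1 * rng.standard_normal()

# One-step-ahead regression: predict x_t from x_{t-1}.
X, y = x[:-1, None], x[1:]

# One-hidden-layer feed-forward network with ReLU activation,
# trained by full-batch gradient descent on squared prediction error.
H, lr = 16, 0.05
W1 = 0.5 * rng.standard_normal((1, H))
b1 = np.zeros(H)
W2 = 0.5 * rng.standard_normal(H)
b2 = 0.0

def forward(X):
    h = np.maximum(X @ W1 + b1, 0.0)  # ReLU hidden layer
    return h, h @ W2 + b2

losses = []
for _ in range(300):
    h, pred = forward(X)
    err = pred - y
    losses.append(float(np.mean(err ** 2)))
    # Back-propagation of the mean squared error.
    g_pred = 2 * err / len(y)
    gW2 = h.T @ g_pred
    gb2 = g_pred.sum()
    g_h = np.outer(g_pred, W2) * (h > 0)
    gW1 = X.T @ g_h
    gb1 = g_h.sum(axis=0)
    W2 -= lr * gW2; b2 -= lr * gb2
    W1 -= lr * gW1; b1 -= lr * gb1
```

Because consecutive observations are dependent, the training pairs (x_{t-1}, x_t) are not i.i.d., which is exactly the regime where the paper's mixing-type assumptions replace the usual independence assumption.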


Related research

- 06/06/2022, Tucker tensor factor models for high-dimensional higher-order tensor observations: "Higher-order tensor data are prevailing in a wide range of fields includ..."
- 02/01/2023, Deep learning for ψ-weakly dependent processes: "In this paper, we perform deep neural networks for learning ψ-weakly dep..."
- 02/15/2023, Excess risk bound for deep learning under weak dependence: "This paper considers deep neural networks for learning weakly dependent ..."
- 06/28/2022, Optimal Estimation of Generic Dynamics by Path-Dependent Neural Jump ODEs: "This paper studies the problem of forecasting general stochastic process..."
- 06/26/2020, Covariance-engaged Classification of Sets via Linear Programming: "Set classification aims to classify a set of observations as a whole, as..."
- 03/16/2018, Deep Component Analysis via Alternating Direction Neural Networks: "Despite a lack of theoretical understanding, deep neural networks have a..."
- 07/01/2023, Sparsity-aware generalization theory for deep neural networks: "Deep artificial neural networks achieve surprising generalization abilit..."
