On the rate of convergence of a deep recurrent neural network estimate in a regression problem with dependent data

10/31/2020
by Michael Kohler et al.

A regression problem with dependent data is considered. Regularity assumptions on the dependency of the data are introduced, and it is shown that, under suitable structural assumptions on the regression function, a deep recurrent neural network estimate is able to circumvent the curse of dimensionality.
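To make the setting concrete, the sketch below fits a recurrent estimate to dependent (time-series) data. It is not the paper's estimator: as a stand-in it uses an echo-state-style recurrent network (fixed random recurrent weights, only a ridge-regularized linear readout is fitted), and the AR(1) covariate process, the target function sin(3x), and all dimensions are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Dependent data: an AR(1) covariate sequence, so successive observations
# are correlated rather than i.i.d.
T = 500
x = np.zeros(T)
for t in range(1, T):
    x[t] = 0.8 * x[t - 1] + 0.1 * rng.standard_normal()
# Regression target m(x) = sin(3x) observed with additive noise.
y = np.sin(3 * x) + 0.05 * rng.standard_normal(T)

# Echo-state-style recurrent features: recurrent weights are drawn once
# and kept fixed (illustrative stand-in for a trained deep RNN).
d = 50
W_in = 0.5 * rng.standard_normal(d)
W = rng.standard_normal((d, d))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))  # spectral radius < 1

# Run the recurrence over the sequence and collect the hidden states.
H = np.zeros((T, d))
h = np.zeros(d)
for t in range(T):
    h = np.tanh(W @ h + W_in * x[t])
    H[t] = h

# Ridge-regularized least-squares readout on the recurrent states.
lam = 1e-3
beta = np.linalg.solve(H.T @ H + lam * np.eye(d), H.T @ y)
y_hat = H @ beta
mse = np.mean((y - y_hat) ** 2)
```

Because the hidden state at time t depends on the whole past of the covariate sequence, the recurrent features can exploit exactly the kind of temporal dependence that the paper's regularity assumptions formalize.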


