Deep Learning and MARS: A Connection

08/29/2019
by Michael Kohler, et al.

We consider least squares regression estimates based on deep neural networks. We show that these estimates satisfy an oracle inequality which implies that, up to a logarithmic factor, their error is at least as small as the optimal error bound one would expect for MARS if that procedure worked optimally. As a consequence, our neural networks achieve a dimensionality reduction whenever the regression function has locally low dimensionality. This assumption appears realistic in real-world applications, since selected high-dimensional data are often confined to locally low-dimensional distributions. In a simulation study we provide numerical experiments that support our theoretical results and compare our estimate with other conventional nonparametric regression estimates, in particular MARS. The use of our estimates is illustrated by a real data analysis.
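As a minimal illustration of the MARS side of the comparison (not the paper's procedure, and with knots fixed on a grid rather than selected adaptively as MARS actually does), the following sketch fits a least squares regression on a basis of truncated linear (hinge) functions max(0, x - t) and max(0, t - x), the building blocks MARS uses:

```python
# Sketch: least squares regression on a MARS-style hinge basis.
# MARS builds its fit from hinge functions max(0, x - t) and max(0, t - x);
# here the knots t are fixed on a grid instead of chosen adaptively.
import numpy as np

rng = np.random.default_rng(0)
n = 200
x = rng.uniform(-1.0, 1.0, size=n)
y = np.abs(x) + 0.05 * rng.normal(size=n)  # target with a kink at 0

knots = np.linspace(-0.8, 0.8, 9)

def design(x):
    """Design matrix: intercept, linear term, and hinge pairs per knot."""
    cols = [np.ones_like(x), x]
    for t in knots:
        cols.append(np.maximum(0.0, x - t))
        cols.append(np.maximum(0.0, t - x))
    return np.column_stack(cols)

# Least squares fit of the basis coefficients.
X = design(x)
coef, *_ = np.linalg.lstsq(X, y, rcond=None)

# Empirical L2 error of the fitted function against the true one on a grid.
xg = np.linspace(-1.0, 1.0, 401)
err = np.sqrt(np.mean((design(xg) @ coef - np.abs(xg)) ** 2))
print(round(err, 3))
```

Because the knot grid contains 0, the kink of |x| is exactly representable in this basis, so the least squares fit recovers the regression function up to the noise level; adaptive knot selection is what the oracle bound in the abstract idealizes.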


Related research:

- Deep Nonparametric Regression on Approximately Low-dimensional Manifolds (04/14/2021)
- Fast Partial Quantile Regression (10/15/2021)
- Robust Nonparametric Regression with Deep Neural Networks (07/21/2021)
- Analysis of the rate of convergence of neural network regression estimates which are easy to implement (12/09/2019)
- An Oracle Property of The Nadaraya-Watson Kernel Estimator for High Dimensional Nonparametric Regression (11/25/2017)
- The Smoking Gun: Statistical Theory Improves Neural Network Estimates (07/20/2021)
- Nearest Neighbour Based Estimates of Gradients: Sharp Nonasymptotic Bounds and Applications (06/26/2020)
