Deep learning is adaptive to intrinsic dimensionality of model smoothness in anisotropic Besov space

10/28/2019
by Taiji Suzuki, et al.

Deep learning has exhibited superior performance for various tasks, especially for high-dimensional datasets, such as images. To understand this property, we investigate the approximation and estimation ability of deep learning on anisotropic Besov spaces. The anisotropic Besov space is characterized by direction-dependent smoothness and includes several function classes that have been investigated thus far. We demonstrate that the approximation error and estimation error of deep learning only depend on the average value of the smoothness parameters in all directions. Consequently, the curse of dimensionality can be avoided if the smoothness of the target function is highly anisotropic. Unlike existing studies, our analysis does not require a low-dimensional structure of the input data. We also investigate the minimax optimality of deep learning and compare its performance with that of the kernel method (more generally, linear estimators). The results show that deep learning has better dependence on the input dimensionality if the target function possesses anisotropic smoothness, and it achieves an adaptive rate for functions with spatially inhomogeneous smoothness.
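
As a hedged illustration of the "average smoothness" claim above (the notation below is assumed for exposition and is not quoted from the paper), anisotropic smoothness parameters s_1, ..., s_d typically enter nonparametric rates through a harmonic-mean-type exponent:

\[
  \tilde{s} \;=\; \Bigl(\sum_{i=1}^{d} s_i^{-1}\Bigr)^{-1},
  \qquad
  \text{estimation error} \;\lesssim\; n^{-\frac{2\tilde{s}}{2\tilde{s}+1}}
  \quad \text{(up to polylogarithmic factors)}.
\]

In the isotropic case s_i = s for every i, this gives \tilde{s} = s/d and recovers the classical rate n^{-2s/(2s+d)}, which deteriorates with the ambient dimension d. When only a few directions have small s_i, those directions dominate the sum \sum_i s_i^{-1}, so \tilde{s} behaves as if the effective dimension were the number of non-smooth directions; this is the sense in which highly anisotropic smoothness avoids the curse of dimensionality.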


Related research

10/18/2018
Adaptivity of deep ReLU network for learning in Besov and mixed smooth Besov spaces: optimal rate and curse of dimensionality
Deep learning has shown high performances in various types of tasks from...

09/23/2020
Estimation error analysis of deep learning on the regression problem on the variable exponent Besov space
Deep learning has achieved notable success in various fields, including ...

02/27/2023
Nonparametric regression for repeated measurements with deep neural networks
Analysis of repeated measurements for a sample of subjects has been inte...

05/25/2023
How many samples are needed to leverage smoothness?
A core principle in statistical learning is that smoothness of target fu...

06/05/2020
Expressivity of expand-and-sparsify representations
A simple sparse coding mechanism appears in the sensory systems of sever...

07/01/2015
Bigeometric Organization of Deep Nets
In this paper, we build an organization of high-dimensional datasets tha...

05/22/2019
On the minimax optimality and superiority of deep neural network learning over sparse parameter spaces
Deep learning has been applied to various tasks in the field of machine ...
