Adaptive Approximation and Estimation of Deep Neural Network to Intrinsic Dimensionality

07/04/2019
by Ryumei Nakada, et al.

We theoretically prove that the generalization performance of deep neural networks (DNNs) is mainly determined by an intrinsic low-dimensional structure of data. Recently, DNNs have empirically provided outstanding performance in various machine learning applications. Motivated by this success, numerous studies actively investigate theoretical properties of DNNs (e.g., the generalization error) toward understanding their mechanism. In particular, how DNNs behave with high-dimensional data is one of the most important concerns. However, the problem has not been sufficiently investigated from the perspective of data characteristics, even though high-dimensional data are frequently observed to have an intrinsically low-dimensional structure in practice. In this paper, to clarify the connection between DNNs and such data, we derive bounds for the approximation and generalization errors of DNNs on intrinsically low-dimensional data. To this end, we introduce a general notion of intrinsic dimension and develop a novel proof technique to evaluate the errors. Consequently, we show that the convergence rates of the errors of DNNs do not depend on the nominal high dimensionality of data, but on the lower intrinsic dimension. We also show that this rate is optimal in the minimax sense. Furthermore, we find that DNNs with more layers can handle a broader class of intrinsically low-dimensional data. We conduct a numerical simulation to validate that (i) the intrinsic dimension of data affects the generalization error of DNNs, and (ii) DNNs outperform other non-parametric estimators that are also adaptive to the intrinsic dimension.
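A minimal sketch of the kind of simulation described in point (ii), not the authors' actual experiment: data with intrinsic dimension d are embedded in a nominally D-dimensional space via a random linear map (an illustrative choice of low-dimensional structure), and a small fully-connected network is compared with a k-NN regressor, a classical estimator that is also adaptive to intrinsic dimension. The function make_data, the scikit-learn estimators, and all hyperparameters below are assumptions made for illustration only.

```python
# Toy illustration: regression on inputs that lie on a d-dimensional subspace
# of R^D, comparing a small DNN with a k-NN regressor.
import numpy as np
from sklearn.neighbors import KNeighborsRegressor
from sklearn.neural_network import MLPRegressor
from sklearn.metrics import mean_squared_error

rng = np.random.default_rng(0)
d, D, n_train, n_test = 3, 100, 2000, 1000  # intrinsic dim, ambient dim, sample sizes

# Random linear embedding of a d-dimensional latent variable into R^D.
A = rng.normal(size=(d, D)) / np.sqrt(d)

def make_data(n):
    z = rng.uniform(-1.0, 1.0, size=(n, d))  # latent coordinates
    x = z @ A                                # embedded inputs in R^D
    y = np.sin(3 * z[:, 0]) + z[:, 1] ** 2 + 0.1 * rng.normal(size=n)  # smooth target + noise
    return x, y

x_tr, y_tr = make_data(n_train)
x_te, y_te = make_data(n_test)

knn = KNeighborsRegressor(n_neighbors=10).fit(x_tr, y_tr)
dnn = MLPRegressor(hidden_layer_sizes=(128, 128, 128), max_iter=2000,
                   random_state=0).fit(x_tr, y_tr)

print("k-NN test MSE:", mean_squared_error(y_te, knn.predict(x_te)))
print("DNN  test MSE:", mean_squared_error(y_te, dnn.predict(x_te)))
```

If the abstract's claim holds in this toy setting, increasing the intrinsic dimension d (with D fixed) should degrade both test errors, whereas inflating the ambient dimension D alone should have little effect, since the convergence rates are governed by the intrinsic rather than the nominal dimension.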

Related research

k-NN Regression Adapts to Local Intrinsic Dimension (10/19/2011)
Many nonparametric regressors were recently shown to converge at rates t...

The Intrinsic Dimension of Images and Its Impact on Learning (04/18/2021)
It is widely believed that natural image data exhibits low-dimensional s...

Deep Neural Networks Learn Non-Smooth Functions Effectively (02/13/2018)
We theoretically discuss why deep neural networks (DNNs) performs better...

Dimensionality compression and expansion in Deep Neural Networks (06/02/2019)
Datasets such as images, text, or movies are embedded in high-dimensiona...

Minimax Optimal Deep Neural Network Classifiers Under Smooth Decision Boundary (07/04/2022)
Deep learning has gained huge empirical successes in large-scale classif...

Deep Neural Networks for Nonparametric Interaction Models with Diverging Dimension (02/12/2023)
Deep neural networks have achieved tremendous success due to their repre...

Deep Nonparametric Estimation of Operators between Infinite Dimensional Spaces (01/01/2022)
Learning operators between infinitely dimensional spaces is an important...
