Optimal Convergence Rates of Deep Neural Networks in a Classification Setting

07/25/2022
by Joseph T. Meyer, et al.

We establish convergence rates that are optimal up to a log factor for a class of deep neural networks in a classification setting, under a condition sometimes referred to as the Tsybakov noise condition. We construct classifiers in a general setting in which the decision boundary of the Bayes rule can be approximated well by neural networks. Corresponding rates of convergence are proven with respect to the misclassification error. We then show that these rates are optimal in the minimax sense if the boundary satisfies a smoothness condition. Suboptimal convergence rates were already known for this setting; our main contribution lies in improving these rates and proving their optimality, which was an open problem. Furthermore, we show almost optimal rates under additional restrictions that circumvent the curse of dimensionality. Our analysis requires a condition that gives new insight into the noise restriction: in a sense, it demands the "correct noise exponent" for a given class of functions.
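
The "noise condition" and "noise exponent" mentioned above refer to the Tsybakov noise (or margin) condition. For reference, a standard textbook formulation is sketched below; the paper's exact variant may differ. Here eta(x) = P(Y = 1 | X = x) denotes the regression function, so the Bayes rule is f*(x) = 1{eta(x) >= 1/2}.

% Standard Tsybakov noise condition (reference formulation only; the
% paper's exact statement may differ): there exist a constant C > 0 and
% a noise exponent q >= 0 such that the marginal distribution of X puts
% little mass near the decision boundary {x : \eta(x) = 1/2}.
\[
  P_X\bigl( 0 < \lvert \eta(X) - \tfrac{1}{2} \rvert \le t \bigr)
  \;\le\; C\, t^{q}
  \qquad \text{for all } t > 0 .
\]
% A larger exponent q means less mass near the boundary, which allows
% faster achievable rates for the excess misclassification error
% P(f(X) \ne Y) - P(f^{*}(X) \ne Y).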

Related Research

04/04/2023
Optimal rates of approximation by shallow ReLU^k neural networks and applications to nonparametric regression
We study the approximation capacity of some variation spaces correspondi...

07/31/2023
Classification with Deep Neural Networks and Logistic Loss
Deep neural networks (DNNs) trained with the logistic loss (i.e., the cr...

11/07/2016
Optimal rates for the regularized learning algorithms under general source condition
We consider the learning algorithms under general source condition with ...

02/28/2021
Optimal Imperfect Classification for Gaussian Functional Data
Existing works on functional data classification focus on the constructi...

06/11/2019
Fast Rates for a kNN Classifier Robust to Unknown Asymmetric Label Noise
We consider classification in the presence of class-dependent asymmetric...

06/16/2022
On Error and Compression Rates for Prototype Rules
We study the close interplay between error and compression in the non-pa...

08/02/2021
Convergence rates of deep ReLU networks for multiclass classification
For classification problems, trained deep neural networks return probabi...
