Minimax Optimal Deep Neural Network Classifiers Under Smooth Decision Boundary

07/04/2022
by Tianyang Hu, et al.

Deep learning has achieved huge empirical success in large-scale classification problems. In contrast, there is a lack of statistical understanding of deep learning methods, particularly from the minimax optimality perspective. For instance, in the classical smooth decision boundary setting, existing deep neural network (DNN) approaches are rate-suboptimal, and it remains elusive how to construct minimax optimal DNN classifiers. Moreover, it is interesting to explore whether DNN classifiers can circumvent the curse of dimensionality in handling high-dimensional data. The contributions of this paper are twofold. First, based on a localized margin framework, we identify the source of suboptimality of existing DNN approaches. Motivated by this, we propose a new deep learning classifier using a divide-and-conquer technique: DNN classifiers are constructed on each local region and then aggregated into a global one. We further propose a localized version of the classical Tsybakov noise condition, under which the statistical optimality of our new classifier is established. Second, we show that DNN classifiers can adapt to low-dimensional data structures and circumvent the curse of dimensionality in the sense that the minimax rate depends only on the effective dimension, which can be much smaller than the actual data dimension. Numerical experiments on simulated data corroborate our theoretical results.
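To make the divide-and-conquer idea concrete, here is a minimal sketch, not the paper's exact construction: the input space is partitioned into local regions, a small neural network classifier is fit on each region, and the local classifiers are aggregated into a global one by routing each test point to the classifier of its region. The grid size K, the network width, and the simulated boundary are illustrative assumptions.

# A minimal sketch of divide-and-conquer classification (assumed setup, not the paper's construction).
import numpy as np
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(0)

# Simulated 2-D data with a smooth decision boundary.
X = rng.uniform(0.0, 1.0, size=(4000, 2))
y = (X[:, 1] > 0.5 + 0.3 * np.sin(2 * np.pi * X[:, 0])).astype(int)

K = 4  # number of local regions per axis (hypothetical choice)

def region_index(X, K):
    """Map each point to the index of the grid cell (local region) containing it."""
    cells = np.clip((X * K).astype(int), 0, K - 1)
    return cells[:, 0] * K + cells[:, 1]

# Fit one small DNN classifier on each non-empty local region.
train_regions = region_index(X, K)
local_models = {}
for r in np.unique(train_regions):
    mask = train_regions == r
    if len(np.unique(y[mask])) < 2:
        # Degenerate region containing a single class: store the constant label.
        local_models[r] = int(y[mask][0])
        continue
    clf = MLPClassifier(hidden_layer_sizes=(32, 32), max_iter=500, random_state=0)
    clf.fit(X[mask], y[mask])
    local_models[r] = clf

def predict(X_new):
    """Aggregate local classifiers into a global one by routing points to their region."""
    out = np.zeros(len(X_new), dtype=int)
    regions = region_index(X_new, K)
    for r in np.unique(regions):
        mask = regions == r
        model = local_models.get(r)
        if model is None:
            # Region unseen in training: fall back to the global majority class.
            out[mask] = int(np.bincount(y).argmax())
        elif isinstance(model, int):
            out[mask] = model
        else:
            out[mask] = model.predict(X_new[mask])
    return out

X_test = rng.uniform(0.0, 1.0, size=(1000, 2))
y_test = (X_test[:, 1] > 0.5 + 0.3 * np.sin(2 * np.pi * X_test[:, 0])).astype(int)
print("test accuracy:", (predict(X_test) == y_test).mean())

Routing by region keeps each local network small, since every classifier only needs to approximate the decision boundary within its own cell; the paper's analysis of when such local constructions attain the minimax rate relies on the localized margin framework and noise condition described above.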
