Convergence rates of deep ReLU networks for multiclass classification

08/02/2021
by Thijs Bos, et al.

For classification problems, trained deep neural networks return probabilities of class membership. In this work we study the convergence of these learned probabilities to the true conditional class probabilities. More specifically, we consider sparse deep ReLU network reconstructions that minimize the cross-entropy loss in the multiclass classification setup. Interesting phenomena occur when the class membership probabilities are close to zero; convergence rates are derived that depend on this near-zero behaviour via a margin-type condition.
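The cross-entropy loss the abstract refers to is standard; a minimal NumPy sketch (not the paper's estimator, just the loss it is minimized over) shows why near-zero true-class probabilities are the delicate regime: the loss involves the logarithm of the probability assigned to the correct class, which blows up as that probability approaches zero.

```python
import numpy as np

def cross_entropy(probs, labels):
    """Average multiclass cross-entropy loss.

    probs:  (n, K) array of predicted class probabilities (rows sum to 1).
    labels: (n,) array of true class indices in {0, ..., K-1}.
    """
    n = probs.shape[0]
    # Probability assigned to the true class of each sample.
    true_class_probs = probs[np.arange(n), labels]
    # -log(p) diverges as p -> 0: this is the near-zero regime that the
    # margin-type condition in the abstract controls.
    return -np.mean(np.log(true_class_probs))

# Illustration: three samples, three classes.
probs = np.array([[0.7, 0.2, 0.1],
                  [0.1, 0.8, 0.1],
                  [0.25, 0.25, 0.5]])
labels = np.array([0, 1, 2])
loss = cross_entropy(probs, labels)  # = -(ln 0.7 + ln 0.8 + ln 0.5) / 3
```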


Related research

12/10/2018 · Fast convergence rates of deep neural networks for classification
  We derive the fast convergence rates of a deep neural network (DNN) clas...

07/31/2023 · Classification with Deep Neural Networks and Logistic Loss
  Deep neural networks (DNNs) trained with the logistic loss (i.e., the cr...

05/22/2023 · Multiclass classification for multidimensional functional data through deep neural networks
  The intrinsically infinite-dimensional features of the functional observ...

08/15/2023 · Classification of Data Generated by Gaussian Mixture Models Using Deep ReLU Networks
  This paper studies the binary classification of unbounded data from ℝ^d ...

08/02/2019 · Deep ReLU network approximation of functions on a manifold
  Whereas recovery of the manifold from data is a well-studied topic, appr...

07/25/2022 · Optimal Convergence Rates of Deep Neural Networks in a Classification Setting
  We establish optimal convergence rates up to a log-factor for a class of...

06/11/2020 · Directional convergence and alignment in deep learning
  In this paper, we show that although the minimizers of cross-entropy and...
