On sample complexity of neural networks

10/24/2019
by Alexander Usvyatsov et al.

We consider functions computed by deep neural networks as definable objects in an o-minimal expansion of the real field, and derive an almost-linear (in the number of weights) bound on the sample complexity of such networks.
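To give a feel for what "almost linear in the number of weights" means, here is an illustrative sketch. The functional form below (a W log W dependence, divided by the accuracy parameter) is a common shape for such bounds and is an assumption for illustration only; the paper's actual constants and log factors may differ.

```python
import math

def sample_complexity(num_weights, epsilon, delta):
    """Illustrative almost-linear sample complexity bound.

    Assumed form (not taken from the paper):
        m = ceil((W * log W + log(1/delta)) / epsilon)
    where W is the number of weights, epsilon the accuracy,
    and delta the failure probability.
    """
    W = num_weights
    return math.ceil((W * math.log(W) + math.log(1.0 / delta)) / epsilon)

# Doubling the number of weights roughly doubles the bound,
# up to the logarithmic factor.
m1 = sample_complexity(10_000, epsilon=0.1, delta=0.05)
m2 = sample_complexity(20_000, epsilon=0.1, delta=0.05)
```

Contrast this with bounds that are quadratic or worse in W, where doubling the network size would multiply the sample requirement by four or more.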


Related research

05/25/2023
Initialization-Dependent Sample Complexity of Linear Predictors and Neural Networks
We provide several new results on the sample complexity of vector-valued...

05/28/2023
On the Role of Noise in the Sample Complexity of Learning Recurrent Neural Networks: Exponential Gaps for Long Sequences
We consider the class of noisy multi-layered sigmoid recurrent neural ne...

07/18/2022
On the Study of Sample Complexity for Polynomial Neural Networks
As a general type of machine learning approach, artificial neural networ...

02/12/2018
On the Sample Complexity of Learning from a Sequence of Experiments
We analyze the sample complexity of a new problem: learning from a seque...

08/14/2020
On the Sample Complexity of Super-Resolution Radar
We point out an issue with Lemma 8.6 of [1]. This lemma specifies the re...

11/25/2022
Automata Cascades: Expressivity and Sample Complexity
Every automaton can be decomposed into a cascade of basic automata. This...

03/20/2020
Sample Complexity Result for Multi-category Classifiers of Bounded Variation
We control the probability of the uniform deviation between empirical an...
